Practitioner's Guide to Using Research for Evidence-Informed Practice, Third Edition

Format: Paperback, 304 pages
Published: United States, 12 April 2022


Our Price: HK$552
Elsewhere: HK$792.97
Save: HK$240.97 (30%)
Ships from Australia. Estimated delivery date: 18th Apr - 28th Apr.
Free Shipping Worldwide

Buy Together
Buy together with Practical Implementation in Social Work Practice at a great price!
Buy Together Price: HK$896
Elsewhere Price: HK$973.75
You Save: HK$77.75 (8%)

Product Description

ALLEN RUBIN, PhD, holds the Kantambu Latting College Professorship for Leadership and Change at the University of Houston Graduate College of Social Work. He is the author of several bestselling titles in social work research. JENNIFER BELLAMY, PhD, is Associate Dean for Research and Faculty Development and Professor at the Graduate School of Social Work at the University of Denver. She teaches research and theory courses at the master's and doctoral levels.


Product Details
EAN: 9781119858560
ISBN: 1119858569
Dimensions: 26.5 x 17.8 x 1.3 centimeters (0.47 kg)

Table of Contents

Preface xi

Acknowledgements xv

About the Authors xvii

About the Companion Website xix

Part 1 Overview of Evidence-Informed Practice

1 Introduction to Evidence-Informed Practice (EIP) 2

1.1 Emergence of EIP 4

1.2 Defining EIP 4

1.3 Types of EIP Questions 5

1.4 EIP Regarding Policy and Social Justice 13

1.5 EIP and Black Lives Matter 13

1.6 Developing an EIP Process Outlook 14

1.7 EIP as a Client-Centered, Compassionate Means, Not an End unto Itself 16

1.8 EIP and Professional Ethics 17

Key Chapter Concepts 18

Review Exercises 19

Additional Readings 19

2 Steps in the EIP Process 21

2.1 Step 1: Question Formulation 22

2.2 Step 2: Evidence Search 22

2.3 Step 3: Critically Appraising Studies and Reviews 29

2.4 Step 4: Selecting and Implementing the Intervention 30

2.5 Step 5: Monitor Client Progress 33

2.6 Feasibility Constraints 33

2.7 But What about the Dodo Bird Verdict? 36

Key Chapter Concepts 38

Review Exercises 39

Additional Readings 39

3 Research Hierarchies: Which Types of Research Are Best for Which Questions? 40

3.1 More than One Type of Hierarchy for More than One Type of EIP Question 41

3.2 Qualitative and Quantitative Studies 42

3.3 Which Types of Research Designs Apply to Which Types of EIP Questions? 43

Key Chapter Concepts 52

Review Exercises 53

Additional Readings 53

Part 2 Critically Appraising Studies for EIP Questions about Intervention Effectiveness

4 Criteria for Inferring Effectiveness: How Do We Know What Works? 56

4.1 Internal Validity 57

4.2 Measurement Issues 62

4.3 Statistical Chance 65

4.4 External Validity 66

4.5 Synopses of Fictitious Research Studies 67

Key Chapter Concepts 71

Review Exercises 72

Exercise for Critically Appraising Published Articles 73

Additional Readings 73

5 Critically Appraising Experiments 74

5.1 Classic Pretest-Posttest Control Group Design 75

5.2 Posttest-Only Control Group Design 76

5.3 Solomon Four-Group Design 77

5.4 Alternative Treatment Designs 78

5.5 Dismantling Designs 79

5.6 Placebo Control Group Designs 80

5.7 Experimental Demand and Experimenter Expectancies 82

5.8 Obtrusive Versus Unobtrusive Observation 83

5.9 Compensatory Equalization and Compensatory Rivalry 83

5.10 Resentful Demoralization 84

5.11 Treatment Diffusion 84

5.12 Treatment Fidelity 85

5.13 Practitioner Equivalence 85

5.14 Differential Attrition 86

5.15 Synopses of Research Studies 88

Key Chapter Concepts 91

Review Exercises 92

Exercise for Critically Appraising Published Articles 92

Additional Readings 93

6 Critically Appraising Quasi-Experiments: Nonequivalent Comparison Groups Designs 94

6.1 Nonequivalent Comparison Groups Designs 95

6.2 Additional Logical Arrangements to Control for Potential Selectivity Biases 97

6.3 Statistical Controls for Potential Selectivity Biases 101

6.4 Creating Matched Comparison Groups Using Propensity Score Matching 105

6.5 Pilot Studies 108

6.6 Synopses of Research Studies 110

Key Chapter Concepts 113

Review Exercises 114

Exercise for Critically Appraising Published Articles 114

Additional Readings 114

7 Critically Appraising Quasi-Experiments: Time-Series Designs and Single-Case Designs 115

7.1 Simple Time-Series Designs 116

7.2 Multiple Time-Series Designs 118

7.3 Single-Case Designs 119

7.4 Synopses of Research Studies 125

Key Chapter Concepts 129

Review Exercises 130

Exercise for Critically Appraising Published Articles 131

Additional Reading 131

8 Critically Appraising Systematic Reviews and Meta-Analyses 132

8.1 Advantages of Systematic Reviews and Meta-Analyses 133

8.2 Risks in Relying Exclusively on Systematic Reviews and Meta-Analyses 135

8.3 Where to Start 135

8.4 What to Look for When Critically Appraising Systematic Reviews 135

8.5 What Distinguishes a Systematic Review from Other Types of Reviews? 142

8.6 What to Look for When Critically Appraising Meta-Analyses 143

8.7 Synopses of Research Studies 152

Key Chapter Concepts 155

Review Exercises 156

Exercise for Critically Appraising Published Articles 157

Additional Readings 157

Part 3 Critically Appraising Studies for Alternative EIP Questions

9 Critically Appraising Nonexperimental Quantitative Studies 160

9.1 Surveys 161

9.2 Cross-Sectional and Longitudinal Studies 169

9.3 Case-Control Studies 171

9.4 Synopses of Research Studies 172

Key Chapter Concepts 178

Review Exercises 179

Exercise for Critically Appraising Published Articles 179

Additional Readings 179

10 Critically Appraising Qualitative Studies 180

10.1 Qualitative Observation 182

10.2 Qualitative Interviewing 183

10.3 Other Qualitative Methodologies 186

10.4 Qualitative Sampling 186

10.5 Grounded Theory 187

10.6 Alternatives to Grounded Theory 188

10.7 Frameworks for Appraising Qualitative Studies 189

10.8 Mixed Model and Mixed Methods Studies 193

10.9 Synopses of Research Studies 193

Key Chapter Concepts 198

Review Exercises 200

Exercise for Critically Appraising Published Articles 201

Additional Readings 201

Part 4 Assessment and Monitoring in Evidence-Informed Practice

11 Critically Appraising, Selecting, and Constructing Assessment Instruments 204

11.1 Reliability 205

11.2 Validity 208

11.3 Feasibility 214

11.4 Sample Characteristics 214

11.5 Locating Assessment Instruments 215

11.6 Constructing Assessment Instruments 216

11.7 Synopses of Research Studies 218

Key Chapter Concepts 220

Review Exercises 221

Exercise for Critically Appraising Published Articles 222

Additional Readings 222

12 Monitoring Client Progress 223

12.1 A Practitioner-Friendly Single-Case Design 224

12.2 Using Within-Group Effect-Size Benchmarks 234

Key Chapter Concepts 235

Review Exercises 236

Additional Readings 236

Part 5 Additional Aspects of Evidence-Informed Practice

13 Appraising and Conducting Data Analyses in EIP 238

13.1 Introduction 238

13.2 Ruling Out Statistical Chance 239

13.3 What Else Do You Need to Know? 244

13.4 The .05 Cutoff Point Is Not Sacred! 245

13.5 What Else Do You Need to Know? 246

13.6 Calculating Within-Group Effect Sizes and Using Benchmarks 247

13.7 Conclusion 248

Key Chapter Concepts 248

Review Exercises 249

Additional Reading 249

14 Critically Appraising Social Justice Research Studies 250

14.1 Introduction 250

14.2 Evidence-Informed Social Action 251

14.3 What Type of Evidence? 252

14.4 Participatory Action Research (PAR) 253

14.5 Illustrations of Other Types of Social Justice Research 254

14.6 Conclusion 254

Key Chapter Concepts 258

Review Exercises 259

Additional Readings 260

Glossary 261

References 269

Index 273

About the Authors

ALLEN RUBIN, PhD, holds the Kantambu Latting College Professorship for Leadership and Change at the University of Houston Graduate College of Social Work. He is the author of several bestselling titles in social work research.

JENNIFER BELLAMY, PhD, is Associate Dean for Research and Faculty Development and Professor at the Graduate School of Social Work at the University of Denver. She teaches research and theory courses at the master’s and doctoral levels.

Item ships from and is sold by Fishpond Retail Limited.