
Bayes Rules!: An Introduction to Applied Bayesian Modeling [Paperback]

Alicia Johnson, Miles Ott, and Mine Dogucu
  • Format: Paperback / softback, 544 pages, height x width: 254x178 mm, weight: 1020 g, 18 tables (black and white); 131 line drawings (color); 101 line drawings (black and white); 3 halftones (color); 1 halftone (black and white); 134 illustrations (color); 102 illustrations (black and white)
  • Series: Chapman & Hall/CRC Texts in Statistical Science
  • Publication date: 04-Mar-2022
  • Publisher: Chapman & Hall/CRC
  • ISBN-10: 0367255391
  • ISBN-13: 9780367255398
An engaging, sophisticated, and fun introduction to the field of Bayesian statistics, Bayes Rules!: An Introduction to Applied Bayesian Modeling brings the power of modern Bayesian thinking, modeling, and computing to a broad audience. In particular, the book is an ideal resource for advanced undergraduate statistics students and practitioners with comparable experience. The book assumes that readers are familiar with the content covered in a typical undergraduate-level introductory statistics course. Readers will also, ideally, have some experience with undergraduate-level probability, calculus, and the R statistical software. Readers without this background will still be able to follow along, so long as they are eager to pick up these tools on the fly, as all R code is provided.

Bayes Rules! empowers readers to weave Bayesian approaches into their everyday practice. Discussions and applications are data driven. A natural progression from fundamental to multivariable, hierarchical models emphasizes a practical and generalizable model-building process. The evaluation of these Bayesian models reflects the fact that a data analysis does not exist in a vacuum.
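
To give a rough flavor of that progression, here is a hypothetical sketch with simulated data, not an excerpt from the book. The contents below show the book fitting its regression models with rstanarm, whose stan_glm and stan_glmer functions cover the simple and hierarchical cases; all variable names here (df, x, y, group) are placeholders.

# Hypothetical sketch: simulated data standing in for the book's case studies.
library(rstanarm)

set.seed(84735)
df <- data.frame(group = rep(LETTERS[1:10], each = 20), x = rnorm(200))
df$y <- 2 + 0.5 * df$x +
  rnorm(10, 0, 1)[match(df$group, LETTERS[1:10])] +  # group-specific bumps
  rnorm(200)

# Fundamental: simple Normal regression (cf. Chapter 9)
fit_simple <- stan_glm(y ~ x, data = df, family = gaussian, refresh = 0)

# Hierarchical: partial pooling via group-specific intercepts (cf. Unit 4)
fit_hier <- stan_glmer(y ~ x + (1 | group), data = df, family = gaussian, refresh = 0)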

Features

Utilizes data-driven examples and exercises.

Emphasizes the iterative model building and evaluation process.

Surveys an interconnected range of multivariable regression and classification models.

Presents fundamental Markov chain Monte Carlo simulation.

Integrates R code, including RStan modeling tools and the bayesrules package (see the sketch after this list).

Encourages readers to tap into their intuition and learn by doing.

Provides a friendly and inclusive introduction to technical Bayesian concepts.

Supports Bayesian applications with foundational Bayesian theory.
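
As a taste of that integration, here is a minimal sketch using two helpers documented in the bayesrules package, plot_beta_binomial() and summarize_beta_binomial(). The Beta(45, 55) prior and the 30-successes-in-50-trials data are illustrative choices, not the book's own examples.

library(bayesrules)

# Plot a Beta(45, 55) prior for a proportion, the scaled Binomial likelihood
# of observing 30 successes in 50 trials, and the resulting Beta posterior
plot_beta_binomial(alpha = 45, beta = 55, y = 30, n = 50)

# Tabulate prior vs posterior summary statistics
summarize_beta_binomial(alpha = 45, beta = 55, y = 30, n = 50)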

Reviews

Praise for Bayes Rules!: An Introduction to Applied Bayesian Modeling

"This is by far the best book I've seen on how to (and how to teach students to) do Bayesian modeling and understand the underlying mathematics and computation. The authors build intuition and scaffold ideas expertly, using interesting real case studies, insightful graphics, and clear explanations. The scope of this book is vast, from basic building blocks to hierarchical modeling, but the authors' thoughtful organization allows the reader to navigate this journey smoothly. And impressively, by the end of the book, one can run sophisticated Bayesian models and actually understand the whys, whats, and hows."

Paul Roback, St. Olaf College

"As an alternative to technical derivations, Bayes Rules! centres on intuition and simulation (yay!) via its bayesrules R package, itself relying on rstan. Learning from example (as R code is always provided), the book proceeds through conjugate priors, MCMC (Metropolis-Hastings) methods, regression models, and hierarchical regression models. Quite impressive given the limited prerequisites set by the authors. (I appreciated the representations of the prior-likelihood-posterior, especially in the sequential case.)"

Christian Robert, University of Warwick, UK, Xi'an's Og

"A thoughtful and entertaining book, and a great way to get started with Bayesian analysis." -Andrew Gelman, Columbia University

"I like the book. Its a comprehensive introduction to Bayesian modelling, including several applied bits of code to help the reader through the complex mathematical details I think [ it] is a solid addition to the literature."

-Gianluca Baio, University College, London

"Great work, wow! Thanks so much for doing this! The exercises are fresh and innovative, and I will be working several of them into my courses. I like the focus/level in appealing to introductory students. The examples are modern, and even many frequentist intro books ignore important topics (like the great p-value debate) that the authors address. The focus on simulation for understanding is excellent."

Amy Herring, Duke University

"I congratulate and commend the authors for writing a textbook that makes statistics come alive. I sincerely believe that a generation of students will cite this book as inspiration for their use of and love for Bayesian statistics. The narrative holds the readers attention and flows naturally almost conversationally. Put simply, this is perhaps the most engaging introductory statistics textbook I have ever read. The small but frequent "Quiz Yourself" sections were effective in reinforcing material in low stakes scenarios, and homework problems were clearly developed with great care, providing a mix of conceptual and applied methodological exercises to cement student understanding. I thoroughly enjoyed reviewing this book; it was a true delight to read and is a natural choice for an introductory undergraduate course in applied Bayesian statistics."

Yue Jiang, Duke University

"It is a lot of fun to read! Its clearly targeted at an undergraduate audience, and the writing style is geared toward less advanced undergraduates It has a lot of examples that would appeal to undergraduate students. It does cover quite a lot of methods and approaches. Its quite comprehensive for a book of its level."



David Hitchcock, University of South Carolina

"It is a nice, well-written text on Bayesian modeling with an emphasis on regression and multilevel modeling. It can be used for a one-semester or two-semester course in Bayesian statistics at the undergraduate or masters level. It introduces the methods in the context of interesting datasets and the computational methods are current using the popular MCMC Stan software. It appears to have a sufficient number of exercises and self-study quizzes that make it useful for self-study or the classroom."

Jim Albert, Bowling Green University

"Books on Bayesian modeling that Ive seen have been flawed in crucial ways that prevented the reader from truly becoming a knowledgeable practitioner of Bayesian methods the examples were only elementary and unrealistic, the presentation of software was too black box-ish, or the mathematical and computational details ramped up way too quickly. This is the first book Ive seen that hits the sweet spot starting with basic building blocks but progressing eventually to realistic case studies, development of intuition behind the methods, and appropriate scaffolding of ideas. Amazing work by the authors!"

Paul Roback, St. Olaf College

"I think the authors do a nice job with examples to illustrate concepts and help the readers (students) develop intuition about Bayesian inference. Chapters include questions that ask the reader to consider likely answers to inferential questions based on the information provided. The formal Bayesian inferences follow the questions, with enough explanation to help the reader determine what thought processes might have led to the correct Bayesian approach to answering the question. A goal is to teach and to reinforce the readers intuition along the way. Many of the examples relate to contemporary media and social matters. Such examples help."

-Gary Rosner, Johns Hopkins University

"The authors provide a compelling, integrated, accessible, and non-religious introduction to statistical modeling using a Bayesian approach. They outline a principled approach that features computational implementations and model assessment with ethical implications interwoven throughout. Students and instructors will find the conceptual and computational exercises to be fresh and engaging."

Nick Horton, Amherst College

"The [ book] provides a very nice introduction to Bayesian inference and modeling. The authors describe these topics using a wealth of different datasets and provide a number of exercises at the end of every chapter. I have particularly enjoyed how each chapter is structured: the authors present a problem or dataset and they develop the chapter by telling a story that involves the dataset introduced and by addressing the problem using Bayesian inference and modeling. There are also plenty of thought-provoking questions and exercises throughout the book. Furthermore, the authors also discuss different ethical issues on statistical modeling and inference. ...I really enjoyed reading it."

-Virgilio Gomez-Rubio, Universidad de Castilla-la Mancha

"Bayesian statistics needs no introduction, but good, readable, and truly introductory books covering Bayesian approaches are rare. This is one of them, and in all, it is excellent... as an introductory text is does a great job and is readable and approachable for anyone with an interest in Bayesian analysis."

Maia Lesosky, Liverpool School Of Tropical Medicine, UK, ISCB, November 2022

Contents

Foreword
Preface
About the Authors

I Bayesian Foundations
1 The Big (Bayesian) Picture
1.1 Thinking like a Bayesian
1.1.1 Quiz yourself
1.1.2 The meaning of probability
1.1.3 The Bayesian balancing act
1.1.4 Asking questions
1.2 A quick history lesson
1.3 A look ahead
1.3.1 Unit 1: Bayesian foundations
1.3.2 Unit 2: Posterior simulation & analysis
1.3.3 Unit 3: Bayesian regression & classification
1.3.4 Unit 4: Hierarchical Bayesian models
1.4 Chapter summary
1.5 Exercises
2 Bayes' Rule
2.1 Building a Bayesian model for events
2.1.1 Prior probability model
2.1.2 Conditional probability & likelihood
2.1.3 Normalizing constants
2.1.4 Posterior probability model via Bayes' Rule!
2.1.5 Posterior simulation
2.2 Example: Pop vs soda vs coke
2.3 Building a Bayesian model for random variables
2.3.1 Prior probability model
2.3.2 The Binomial data model
2.3.3 The Binomial likelihood function
2.3.4 Normalizing constant
2.3.5 Posterior probability model
2.3.6 Posterior shortcut
2.3.7 Posterior simulation
2.4 Chapter summary
2.5 Exercises
2.5.1 Building up to Bayes' Rule
2.5.2 Practice Bayes' Rule for events
2.5.3 Practice Bayes' Rule for random variables
2.5.4 Simulation exercises
3 The Beta-Binomial Bayesian Model
3.1 The Beta prior model
3.1.1 Beta foundations
3.1.2 Tuning the Beta prior
3.2 The Binomial data model & likelihood function
3.3 The Beta posterior model
3.4 The Beta-Binomial model
3.5 Simulating the Beta-Binomial
3.6 Example: Milgram's behavioral study of obedience
3.6.1 A Bayesian analysis
3.6.2 The role of ethics in statistics and data science
3.7 Chapter summary
3.8 Exercises
3.8.1 Practice: Beta prior models
3.8.2 Practice: Beta-Binomial models
4 Balance and Sequentiality in Bayesian Analyses
4.1 Different priors, different posteriors
4.2 Different data, different posteriors
4.3 Striking a balance between the prior & data
4.3.1 Connecting observations to concepts
4.3.2 Connecting concepts to theory
4.4 Sequential analysis: Evolving with data
4.5 Proving data order invariance
4.6 Don't be stubborn
4.7 A note on subjectivity
4.8 Chapter summary
4.9 Exercises
4.9.1 Review exercises
4.9.2 Practice: Different priors, different posteriors
4.9.3 Practice: Balancing the data & prior
4.9.4 Practice: Sequentiality
5 Conjugate Families
5.1 Revisiting choice of prior
5.2 Gamma-Poisson conjugate family
5.2.1 The Poisson data model
5.2.2 Potential priors
5.2.3 Gamma prior
5.2.4 Gamma-Poisson conjugacy
5.3 Normal-Normal conjugate family
5.3.1 The Normal data model
5.3.2 Normal prior
5.3.3 Normal-Normal conjugacy
5.3.4 Optional: Proving Normal-Normal conjugacy
5.4 Why no simulation in this chapter?
5.5 Critiques of conjugate family models
5.6 Chapter summary
5.7 Exercises
5.7.1 Practice: Gamma-Poisson
5.7.2 Practice: Normal-Normal
5.7.3 General practice exercises

II Posterior Simulation & Analysis
6 Approximating the Posterior
6.1 Grid approximation
6.1.1 A Beta-Binomial example
6.1.2 A Gamma-Poisson example
6.1.3 Limitations
6.2 Markov chains via rstan
6.2.1 A Beta-Binomial example
6.2.2 A Gamma-Poisson example
6.3 Markov chain diagnostics
6.3.1 Examining trace plots
6.3.2 Comparing parallel chains
6.3.3 Calculating effective sample size & autocorrelation
6.3.4 Calculating R-hat
6.4 Chapter summary
6.5 Exercises
6.5.1 Conceptual exercises
6.5.2 Practice: Grid approximation
6.5.3 Practice: MCMC
7 MCMC under the Hood
7.1 The big idea
7.2 The Metropolis-Hastings algorithm
7.3 Implementing the Metropolis-Hastings
7.4 Tuning the Metropolis-Hastings algorithm
7.5 A Beta-Binomial example
7.6 Why the algorithm works
7.7 Variations on the theme
7.8 Chapter summary
7.9 Exercises
7.9.1 Conceptual exercises
7.9.2 Practice: Normal-Normal simulation
7.9.3 Practice: Simulating more Bayesian models
8 Posterior Inference & Prediction
8.1 Posterior estimation
8.2 Posterior hypothesis testing
8.2.1 One-sided tests
8.2.2 Two-sided tests
8.3 Posterior prediction
8.4 Posterior analysis with MCMC
8.4.1 Posterior simulation
8.4.2 Posterior estimation & hypothesis testing
8.4.3 Posterior prediction
8.5 Bayesian benefits
8.6 Chapter summary
8.7 Exercises
8.7.1 Conceptual exercises
8.7.2 Practice exercises
8.7.3 Applied exercises

III Bayesian Regression & Classification
9 Simple Normal Regression
9.1 Building the regression model
9.1.1 Specifying the data model
9.1.2 Specifying the priors
9.1.3 Putting it all together
9.2 Tuning prior models for regression parameters
9.3 Posterior simulation
9.3.1 Simulation via rstanarm
9.3.2 Optional: Simulation via rstan
9.4 Interpreting the posterior
9.5 Posterior prediction
9.5.1 Building a posterior predictive model
9.5.2 Posterior prediction with rstanarm
9.6 Sequential regression modeling
9.7 Using default rstanarm priors
9.8 You're not done yet!
9.9 Chapter summary
9.10 Exercises
9.10.1 Conceptual exercises
9.10.2 Applied exercises
10 Evaluating Regression Models
10.1 Is the model fair?
10.2 How wrong is the model?
10.2.1 Checking the model assumptions
10.2.2 Dealing with wrong models
10.3 How accurate are the posterior predictive models?
10.3.1 Posterior predictive summaries
10.3.2 Cross-validation
10.3.3 Expected log-predictive density
10.3.4 Improving posterior predictive accuracy
10.4 How good is the MCMC simulation vs how good is the model?
10.5 Chapter summary
10.6 Exercises
10.6.1 Conceptual exercises
10.6.2 Applied exercises
10.6.3 Open-ended exercises
11 Extending the Normal Regression Model
11.1 Utilizing a categorical predictor
11.1.1 Building the model
11.1.2 Simulating the posterior
11.2 Utilizing two predictors
11.2.1 Building the model
11.2.2 Understanding the priors
11.2.3 Simulating the posterior
11.2.4 Posterior prediction
11.3 Optional: Utilizing interaction terms
11.3.1 Building the model
11.3.2 Simulating the posterior
11.3.3 Do you need an interaction term?
11.4 Dreaming bigger: Utilizing more than 2 predictors!
11.5 Model evaluation & comparison
11.5.1 Evaluating predictive accuracy using visualizations
11.5.2 Evaluating predictive accuracy using cross-validation
11.5.3 Evaluating predictive accuracy using ELPD
11.5.4 The bias-variance trade-off
11.6 Chapter summary
11.7 Exercises
11.7.1 Conceptual exercises
11.7.2 Applied exercises
11.7.3 Open-ended exercises
12 Poisson & Negative Binomial Regression
12.1 Building the Poisson regression model
12.1.1 Specifying the data model
12.1.2 Specifying the priors
12.2 Simulating the posterior
12.3 Interpreting the posterior
12.4 Posterior prediction
12.5 Model evaluation
12.6 Negative Binomial regression for overdispersed counts
12.7 Generalized linear models: Building on the theme
12.8 Chapter summary
12.9 Exercises
12.9.1 Conceptual exercises
12.9.2 Applied exercises
13 Logistic Regression
13.1 Pause: Odds & probability
13.2 Building the logistic regression model
13.2.1 Specifying the data model
13.2.2 Specifying the priors
13.3 Simulating the posterior
13.4 Prediction & classification
13.5 Model evaluation
13.6 Extending the model
13.7 Chapter summary
13.8 Exercises
13.8.1 Conceptual exercises
13.8.2 Applied exercises
13.8.3 Open-ended exercises
14 Naive Bayes Classification
14.1 Classifying one penguin
14.1.1 One categorical predictor
14.1.2 One quantitative predictor
14.1.3 Two predictors
14.2 Implementing & evaluating naive Bayes classification
14.3 Naive Bayes vs logistic regression
14.4 Chapter summary
14.5 Exercises
14.5.1 Conceptual exercises
14.5.2 Applied exercises
14.5.3 Open-ended exercises

IV Hierarchical Bayesian Models
15 Hierarchical Models are Exciting
15.1 Complete pooling
15.2 No pooling
15.3 Hierarchical data
15.4 Partial pooling with hierarchical models
15.5 Chapter summary
15.6 Exercises
15.6.1 Conceptual exercises
15.6.2 Applied exercises
16 (Normal) Hierarchical Models without Predictors
16.1 Complete pooled model
16.2 No pooled model
16.3 Building the hierarchical model
16.3.1 The hierarchy
16.3.2 Another way to think about it
16.3.3 Within- vs between-group variability
16.4 Posterior analysis
16.4.1 Posterior simulation
16.4.2 Posterior analysis of global parameters
16.4.3 Posterior analysis of group-specific parameters
16.5 Posterior prediction
16.6 Shrinkage & the bias-variance trade-off
16.7 Not everything is hierarchical
16.8 Chapter summary
16.9 Exercises
16.9.1 Conceptual exercises
16.9.2 Applied exercises
17 (Normal) Hierarchical Models with Predictors
17.1 First steps: Complete pooling
17.2 Hierarchical model with varying intercepts
17.2.1 Model building
17.2.2 Another way to think about it
17.2.3 Tuning the prior
17.2.4 Posterior simulation & analysis
17.3 Hierarchical model with varying intercepts & slopes
17.3.1 Model building
17.3.2 Optional: The decomposition of covariance model
17.3.3 Posterior simulation & analysis
17.4 Model evaluation & selection
17.5 Posterior prediction
17.6 Details: Longitudinal data
17.7 Example: Danceability
17.8 Chapter summary
17.9 Exercises
17.9.1 Conceptual exercises
17.9.2 Applied exercises
17.9.3 Open-ended exercises
18 Non-Normal Hierarchical Regression & Classification
18.1 Hierarchical logistic regression
18.1.1 Model building & simulation
18.1.2 Posterior analysis
18.1.3 Posterior classification
18.1.4 Model evaluation
18.2 Hierarchical Poisson & Negative Binomial regression
18.2.1 Model building & simulation
18.2.2 Posterior analysis
18.2.3 Model evaluation
18.3 Chapter summary
18.4 Exercises
18.4.1 Applied & conceptual exercises
18.4.2 Open-ended exercises
19 Adding More Layers
19.1 Group-level predictors
19.1.1 A model using only individual-level predictors
19.1.2 Incorporating group-level predictors
19.1.3 Posterior simulation & global analysis
19.1.4 Posterior group-level analysis
19.1.5 We're just scratching the surface!
19.2 Incorporating two (or more!) grouping variables
19.2.1 Data with two grouping variables
19.2.2 Building a model with two grouping variables
19.2.3 Simulating models with two grouping variables
19.2.4 Examining the group-specific parameters
19.2.5 We're just scratching the surface!
19.3 Exercises
19.3.1 Conceptual exercises
19.3.2 Applied exercises
19.4 Goodbye!
Bibliography
Index
About the Authors

Alicia Johnson is an Associate Professor of Statistics at Macalester College in Saint Paul, Minnesota. She enjoys exploring and connecting students to Bayesian analysis, computational statistics, and the power of data in contributing to this shared world of ours.

Miles Ott is a Senior Data Scientist at The Janssen Pharmaceutical Companies of Johnson & Johnson. Prior to his current position, he taught at Carleton College, Augsburg University, and Smith College. He is interested in biostatistics, LGBTQ+ health research, analysis of social network data, and statistics/data science education. He blogs at milesott.com and tweets about statistics, gardening, and his dogs on Twitter.

Mine Dogucu is an Assistant Professor of Teaching in the Department of Statistics at the University of California, Irvine. She spends the majority of her time thinking about what to teach, how to teach it, and what tools to use while teaching. She likes intersectional feminism, cats, and R-Ladies. She tweets about statistics and data science education on Twitter.