diff --git a/17-linear_mixed_effects_models1.Rmd b/17-linear_mixed_effects_models1.Rmd
index dc6b6c5..074a344 100644
--- a/17-linear_mixed_effects_models1.Rmd
+++ b/17-linear_mixed_effects_models1.Rmd
@@ -273,11 +273,11 @@ fit.augmented = lmer(formula = value ~ 1 + condition + (1 | participant),
                      data = df.original)
 
 # compare models
-# note: the lmer model has to be supplied first
+# note: the lmer model has to be entered as the first argument
 anova(fit.augmented, fit.compact)
 ```
 
-Yes, the likelihood of the data given the linear mixed effects model is significantly higher compared to its likelihood given the linear model.
+Yes, the linear mixed effects model explains the data better than the linear model.
 
 ## Additional resources
 
diff --git a/18-linear_mixed_effects_models2.Rmd b/18-linear_mixed_effects_models2.Rmd
index e107c13..c5336bc 100644
--- a/18-linear_mixed_effects_models2.Rmd
+++ b/18-linear_mixed_effects_models2.Rmd
@@ -289,8 +289,10 @@ df.complete_pooling = fit.complete_pooling %>%
 df.no_pooling = df.sleep %>%
   group_by(subject) %>%
   nest(data = c(days, reaction)) %>%
-  mutate(fit = map(data, ~ lm(reaction ~ days, data = .)),
-         augment = map(fit, augment)) %>%
+  mutate(fit = map(.x = data,
+                   .f = ~ lm(reaction ~ days, data = .x)),
+         augment = map(.x = fit,
+                       .f = ~ augment(.x))) %>%
   unnest(c(augment)) %>%
   ungroup() %>%
   clean_names() %>%
@@ -400,8 +402,10 @@ df.partial_pooling = fit.random_intercept_slope %>%
 df.plot = df.sleep %>%
   group_by(subject) %>%
   nest(data = c(days, reaction)) %>%
-  mutate(fit = map(data, ~ lm(reaction ~ days, data = .)),
-         tidy = map(fit, tidy)) %>%
+  mutate(fit = map(.x = data,
+                   .f = ~ lm(reaction ~ days, data = .x)),
+         tidy = map(.x = fit,
+                    .f = ~ tidy(.x))) %>%
   unnest(c(tidy)) %>%
   select(subject, term, estimate) %>%
   pivot_wider(names_from = term,
@@ -517,7 +521,7 @@ Let's fit a model to this data now and take a look at the summary output:
 ```{r}
 # fit model
 fit.mixed = lmer(formula = value ~ 1 + condition + (1 | participant),
-                data = df.mixed)
+                 data = df.mixed)
 
 summary(fit.mixed)
 ```
@@ -687,7 +691,8 @@ n_observations = 10
 slope = -10
 sd_error = 0.4
 sd_participant = 5
-intercept = rnorm(n_participants, sd = sd_participant) %>% sort()
+intercept = rnorm(n_participants, sd = sd_participant) %>%
+  sort()
 
 df.simpson = tibble(x = runif(n_participants * n_observations, min = 0, max = 1)) %>%
   arrange(x) %>%
@@ -753,7 +758,7 @@ Let's fit a linear mixed effects model with random intercepts:
 
 ```{r}
 fit.lmer = lmer(formula = y ~ 1 + x + (1 | participant),
-               data = df.simpson)
+                data = df.simpson)
 
 fit.lmer %>%
   summary()
@@ -762,7 +767,6 @@ As we can see, the fixed effect for `x` is now negative!
 
 ```{r}
-
 fit.lmer %>%
   augment() %>%
   clean_names() %>%
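For reference, and not part of the commit: a minimal, self-contained sketch of the two conventions the source edits adopt, spelling out the `.x`/`.f` arguments of `purrr::map()` and passing the `lmer()` fit as the first argument to `anova()` when comparing it against a plain `lm()`. The data set, variable names, and model formulas below are made up for illustration.

```r
library(lme4)       # lmer()
library(tidyverse)  # tibble(), map(), nest(), pipes
library(broom)      # tidy()

set.seed(1)

# toy data: 10 participants, 5 observations each, two conditions
df.example = tibble(participant = factor(rep(1:10, each = 5)),
                    condition = rep(c("A", "B"), length.out = 50),
                    value = rnorm(50) + rep(rnorm(10), each = 5))

# explicit .x / .f arguments, mirroring the edited map() calls
df.example %>%
  group_by(participant) %>%
  nest(data = c(condition, value)) %>%
  mutate(fit = map(.x = data,
                   .f = ~ lm(value ~ condition, data = .x)),
         tidy = map(.x = fit,
                    .f = ~ tidy(.x))) %>%
  unnest(c(tidy)) %>%
  ungroup()

# model comparison: the lmer() fit is entered as the first argument to anova()
fit.compact = lm(formula = value ~ 1 + condition,
                 data = df.example)
fit.augmented = lmer(formula = value ~ 1 + condition + (1 | participant),
                     data = df.example)
anova(fit.augmented, fit.compact)
```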

[Remaining hunks: regenerated bookdown HTML under docs/ (docs/404.html and docs/linear-mixed-effects-models-1.html), including the sidebar table of contents repeated on each page; the markup is not reproduced here. Visible TOC changes: chapter 6 retitled from "Probability and causality" to "Probability"; a new section 7.4 "Pinguin exercise", with the later chapter 7 sections renumbered 7.5-7.7; and a new section 26.3 "Additional resources", with "Session info" moving to 26.4.]
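If, as the file paths suggest, the docs/ pages are bookdown output, the HTML churn above comes from re-rendering the book rather than from hand edits. A hedged sketch of that step, assuming the project uses bookdown with docs/ as its configured output directory:

```r
# Assumption: the book is built with bookdown and _bookdown.yml sets
# output_dir: "docs"; re-rendering from the project root regenerates
# every page, which is why the sidebar TOC changes on each HTML file.
bookdown::render_book("index.Rmd")
```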