R
How do I run post-hoc tests on an lmer model?
Here is my data frame:
    Group <- c("G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1",
               "G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2",
               "G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3")
    Subject <- c("S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15",
                 "S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15",
                 "S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15")
    Value <- c(9.832217741,13.62390117,13.19671612,14.68552076,9.26683366,11.67886655,14.65083473,
               12.20969772,11.58494621,13.58474896,12.49053635,10.28208078,12.21945867,12.58276212,
               15.42648969,9.466436017,11.46582655,10.78725485,10.66159358,10.86701127,12.97863424,
               12.85276916,8.672953949,10.44587257,13.62135205,13.64038394,12.45778874,8.655142642,
               10.65925259,13.18336949,11.96595556,13.5552118,11.8337142,14.01763101,11.37502161,
               14.14801305,13.21640866,9.141392359,11.65848845,14.20350364,14.1829714,11.26202565,
               11.98431285,13.77216009,11.57303893)
    data <- data.frame(Group, Subject, Value)
Then I fit a linear mixed-effects model to compare the 3 groups on Value, with Subject as a random factor:
    library(lme4)
    library(lmerTest)
    model <- lmer(Value ~ Group + (1|Subject), data = data)
    summary(model)
The output is:
    Fixed effects:
                Estimate Std. Error       df t value Pr(>|t|)    
    (Intercept) 12.48771    0.42892 31.54000  29.114   <2e-16 ***
    GroupG2     -1.12666    0.46702 28.00000  -2.412   0.0226 *  
    GroupG3      0.03828    0.46702 28.00000   0.082   0.9353    
However, how do I compare GroupG2 and GroupG3? And what is the convention for this in academic articles?
You can use emmeans::emmeans(), lmerTest::difflsmeans(), or multcomp::glht(). I prefer emmeans (formerly lsmeans).

    library(emmeans)
    emmeans(model, list(pairwise ~ Group), adjust = "tukey")
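If you want to keep the group means and the pairwise contrasts as separate objects, a minimal two-step sketch looks like this (the object name emm is just an illustration, not part of the original answer):

    library(emmeans)
    # estimated marginal mean of Value for each Group
    emm <- emmeans(model, ~ Group)
    # all pairwise comparisons (G1-G2, G1-G3, G2-G3) with Tukey adjustment
    pairs(emm, adjust = "tukey")
    # confidence intervals for the same contrasts
    confint(pairs(emm, adjust = "tukey"))

The G2 vs G3 comparison you are after is then simply one row of that contrast table.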
The next option is difflsmeans. Note that difflsmeans does not correct for multiple comparisons and, by default, uses the Satterthwaite method for the degrees of freedom rather than the Kenward-Roger method that emmeans uses by default, so it is best to state explicitly which method you prefer.

    library(lmerTest)
    difflsmeans(model, test.effs = "Group", ddf = "Kenward-Roger")
Hack-R describes the multcomp::glht() approach in another answer to this question.
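For completeness, a minimal sketch of what that approach typically looks like (this is not Hack-R's exact code):

    library(multcomp)
    # Tukey-style all-pairwise comparisons of the Group factor,
    # with a single-step correction for multiple testing
    summary(glht(model, linfct = mcp(Group = "Tukey")))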
方法。此外,您可以通過加載
lmerTest
然後使用anova
.library(lmerTest) lmerTest::anova(model)
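Note that this is an omnibus F-test of the Group factor as a whole, not a pairwise comparison, so it will not tell you specifically whether G2 differs from G3. If I recall correctly, recent versions of lmerTest use Satterthwaite degrees of freedom by default here and also accept a ddf argument (a sketch; check ?anova.lmerModLmerTest):

    library(lmerTest)
    # request Kenward-Roger degrees of freedom instead of the Satterthwaite default
    anova(model, ddf = "Kenward-Roger")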
Just to be clear: you intended to measure Value 3 times for each subject, right? It looks like Group is within-subjects rather than between-subjects.