GLMs in R are estimated with Fisher scoring. Two approaches to polychotomous (multi-category) logit come to mind: proportional odds models and log-linear models or multinomial regression.
The proportional odds model is a special type of cumulative link model and is implemented in the MASS
package. It is not estimated with Fisher scoring, so the default glm.fit
workhorse cannot estimate such a model. Interestingly, though, cumulative link models ARE GLMs and were discussed in the same McCullagh and Nelder text. A similar issue arises with negative binomial GLMs: they are GLMs in the strict sense of having a link function and a probability model, but they require specialized estimation routines. So the R function glm
should not be viewed as an exhaustive estimator for every type of GLM.
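For illustration, a minimal sketch of fitting a proportional odds model with polr from MASS, using the housing data that ships with the package (any ordered-factor outcome would do); note there is no family argument, because this is not routed through glm.fit:

library(MASS)

## Proportional odds (cumulative link) model: ordered outcome Sat,
## fitted by polr's own optimizer rather than by glm.fit
fit.po <- polr(Sat ~ Infl + Type + Cont, weights = Freq, data = housing)
summary(fit.po)   # note: no 'family' argument, this is not a glm() call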
nnet
has an implementation of a log-linear model estimator. It concords with their more sophisticated neural network estimator based on soft-max entropy, which is an equivalent formulation (theory shows this). It turns out you can estimate log-linear models with glm
in base R if you're keen. The key lies in the connection between logistic and Poisson regression. By recognizing the interaction terms of a count model (differences in log relative rates) as first-order terms in a logistic model for an outcome (the log odds ratio), you can estimate the same parameters and the same SEs by "conditioning" on the margins of the $K \times 2$ contingency table for a multi-category outcome. A related question on SE gives background on this.
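To see that connection in the simplest case, here is a toy sketch with made-up counts (a 2 x 2 table, not part of the VA example below): the treatment log odds ratio from a logistic fit equals the interaction term of the saturated Poisson log-linear fit, with the same SE.

## Toy 2 x 2 table with made-up counts, purely for illustration
tab <- data.frame(
  y     = factor(c(0, 1, 0, 1)),
  treat = factor(c(1, 1, 2, 2)),
  Freq  = c(20, 10, 15, 25)
)

## Logistic regression on grouped counts: coefficient 'treat2' is the log odds ratio
fit.logit <- glm(y ~ treat, weights = Freq, family = binomial, data = tab)

## Saturated Poisson log-linear model: the y:treat interaction is the same log odds ratio
fit.pois <- glm(Freq ~ y * treat, family = poisson, data = tab)

## Same estimate, same SE
rbind(logistic = coef(summary(fit.logit))["treat2", 1:2],
      poisson  = coef(summary(fit.pois))["y1:treat2", 1:2])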
Take the following VA lung cancer data from the MASS package as an example:
> library(MASS); library(nnet)
> summary(multinom(cell ~ factor(treat), data=VA))
Compared with:
> VA.tab <- table(VA[, c('cell', 'treat')])
> summary(glm(Freq ~ cell * treat, data=VA.tab, family=poisson))

Call:
glm(formula = Freq ~ cell * treat, family = poisson, data = VA.tab)

Deviance Residuals: 
[1]  0  0  0  0  0  0  0  0

Coefficients:
               Estimate Std. Error z value Pr(>|z|)    
(Intercept)   2.708e+00  2.582e-01  10.488   <2e-16 ***
cell2         6.931e-01  3.162e-01   2.192   0.0284 *  
cell3        -5.108e-01  4.216e-01  -1.212   0.2257    
cell4        -1.571e-15  3.651e-01   0.000   1.0000    
treat2        2.877e-01  3.416e-01   0.842   0.3996    
cell2:treat2 -7.985e-01  4.534e-01  -1.761   0.0782 .  
cell3:treat2  4.055e-01  5.323e-01   0.762   0.4462    
cell4:treat2 -5.108e-01  5.164e-01  -0.989   0.3226    
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for poisson family taken to be 1)

    Null deviance: 1.5371e+01  on 7  degrees of freedom
Residual deviance: 4.4409e-15  on 0  degrees of freedom
AIC: 53.066

Number of Fisher Scoring iterations: 3
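To make the comparison in the next paragraph concrete, here is a hedged sketch that stores both fits and pulls out the matching pieces (the estimates should line up to within the multinom optimizer's tolerance):

library(MASS)    # VA data
library(nnet)    # multinom

fit.mn   <- multinom(cell ~ factor(treat), data = VA)
VA.tab   <- table(VA[, c('cell', 'treat')])
fit.pois <- glm(Freq ~ cell * treat, data = VA.tab, family = poisson)

## Multinomial logit: one row per non-reference cell type
coef(fit.mn)

## Poisson log-linear fit: the cell main effects match the multinom intercepts,
## and the cell:treat2 interactions match the multinom treatment coefficients
coef(fit.pois)[c("cell2", "cell3", "cell4")]
coef(fit.pois)[c("cell2:treat2", "cell3:treat2", "cell4:treat2")]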
Compare the interaction parameters and the main effects for treat in one model to the other. Compare also the intercepts. The AICs differ because the log-linear model is a probability model even for the margins of the table, which are conditioned upon by other parameters in the model, but in terms of prediction and inference the two approaches give identical results.
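And a quick check of the prediction claim, still assuming the fits stored in the sketch above: the multinomial model's predicted probabilities are just the observed conditional distribution of cell given treat, which the saturated Poisson model also reproduces exactly.

## Predicted P(cell | treat) from the multinomial logit (one row per treat level)
predict(fit.mn, newdata = data.frame(treat = factor(1:2)), type = "probs")

## Observed conditional distribution of cell given treat (columns sum to 1);
## the saturated Poisson fit reproduces the table exactly, so this is also
## its implied P(cell | treat), just laid out transposed
prop.table(VA.tab, margin = 2)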
So, in short: trick question! glm
handles multinomial logistic regression, it just takes a better understanding of what these models actually are.