Merge pull request #34 from ikosmidis/develop
brglm2 v0.8.2
ikosmidis authored Dec 5, 2021
2 parents 94ae2d2 + d4e909f commit 8b23ecf
Showing 24 changed files with 303 additions and 636 deletions.
1 change: 1 addition & 0 deletions .Rbuildignore
@@ -1,3 +1,4 @@
inst/scratch
inst/brglm0
inst/brpr
inst/demos
4 changes: 2 additions & 2 deletions DESCRIPTION
@@ -1,6 +1,6 @@
Package: brglm2
Title: Bias Reduction in Generalized Linear Models
Version: 0.8.0
Version: 0.8.2
Authors@R: c(person(given = "Ioannis", family = "Kosmidis", role = c("aut", "cre"), email = "[email protected]", comment = c(ORCID = "0000-0003-1556-0302")),
person(given = c("Euloge", "Clovis"), family = c("Kenne Pagui"), role = "aut", email = "[email protected]"),
person(given = "Kjell", family = "Konis", role = "ctb", email = "[email protected]"),
@@ -9,7 +9,7 @@ Description: Estimation and inference from generalized linear models based on va
URL: https://github.com/ikosmidis/brglm2
BugReports: https://github.com/ikosmidis/brglm2/issues
Depends: R (>= 3.3.0)
Imports: MASS, stats, Matrix, graphics, nnet, enrichwith, lpSolveAPI, numDeriv
Imports: MASS, stats, Matrix, graphics, nnet, enrichwith, numDeriv
License: GPL-3
Encoding: UTF-8
LazyData: true
1 change: 1 addition & 0 deletions NAMESPACE
@@ -17,6 +17,7 @@ S3method(predict,brmultinom)
S3method(print,brmultinom)
S3method(print,brnb)
S3method(print,summary.bracl)
S3method(print,summary.brglmFit)
S3method(print,summary.brmultinom)
S3method(print,summary.brnb)
S3method(residuals,bracl)
25 changes: 23 additions & 2 deletions NEWS.md
@@ -1,5 +1,26 @@
# brglm2 0.8.0
# brglm2 0.8.2

## Other improvements, updates and additions

* Housekeeping.
* Removed lpSolveAPI from imports.

# brglm2 0.8.1

## Bug fixes

* Fixed a bug when predicting from `bracl` objects with non-identifiable parameters.

## Other improvements, updates and additions

* Work on output consistently from `print()` methods for `summary.XYZ`
objects; estimator type is now printed and other fixes.

* Enriched warning when algorithm does not converge with more informative text.

* Documentation fixes and updates

# brglm2 0.8.0

## New functionality

@@ -46,7 +67,7 @@
decomposition of the model matrix), saving some computational effort.

## Other improvements, updates and additions
* Updated DOI links in documentation and some http -> https fixes.
* updated DOI links in documentation and some http -> https fixes.

# brglm2 0.7.0

4 changes: 4 additions & 0 deletions R/bracl.R
@@ -315,6 +315,8 @@ print.summary.bracl <- function(x, digits = x$digits, ...) {
cat("\nResidual Deviance:", format(x$deviance), "\n")
cat("Log-likelihood:", format(x$logLik), "\n")
cat("AIC:", format(x$AIC), "\n")
cat("\n\nType of estimator:", x$type, get_type_description(x$type))
cat("\n", "Number of Fisher Scoring iterations: ", x$iter, "\n", sep = "")
if (!is.null(correl <- x$correlation)) {
p <- dim(correl)[2L]
if (p > 1) {
@@ -388,6 +390,8 @@ predict.bracl <- function(object, newdata, type = c("class", "probs"), ...) {
rn <- attr(X, "rn_data")
keep <- attr(X, "rn_kept")
cc <- coef(object)
## Ignore unidentifiable parameters
cc[is.na(cc)] <- 0
nams <- names(cc)
if (object$parallel) {
int <- (object$ncat - 1):1
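The `predict.bracl()` change above zeroes out aliased coefficients before building the linear predictor. A minimal sketch of the situation it targets, assuming the `stemcell` data that ships with brglm2 and deliberately adding a redundant term so that one coefficient is non-identifiable (NA); names and the redundant term are illustrative, not from the diff:

library("brglm2")
data("stemcell", package = "brglm2")
## The duplicated term is perfectly collinear, so its coefficient is aliased (NA)
fit <- bracl(research ~ as.numeric(religion) + gender + I(2 * as.numeric(religion)),
             weights = frequency, data = stemcell)
## Before this fix the NA coefficient propagated into the linear predictor;
## it is now treated as zero and predictions are returned as usual
head(predict(fit, newdata = stemcell, type = "probs"))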
8 changes: 4 additions & 4 deletions R/brglmControl.R
@@ -49,7 +49,7 @@
#' of such. Default is \code{NULL}. See Details.
#' @param a power of the Jeffreys prior penalty. See Details.
#' @param ... further arguments passed to
#' \code{\link{brglmControl}}. Currently ignored in the outpup.
#' \code{\link{brglmControl}}. Currently ignored in the output.
#'
#' @details
#'
@@ -85,9 +85,9 @@
#' values for the iteration in \code{brglmFit}. The value of
#' \code{response_adjustment} governs how the data is
#' adjusted. Specifically, if \code{family} is \code{binomial}, then
#' the responses and totals are adjusted by and \code{2 *
#' response_adjustment}, respectively; if \code{family} is
#' \code{poisson}, then the responses are adjusted by and
#' the responses and totals are adjusted by \code{response_adjustment}
#' and \code{2 * response_adjustment}, respectively; if \code{family}
#' is \code{poisson}, then the responses are adjusted by and
#' \code{response_adjustment}. \code{response_adjustment = NULL}
#' (default) is equivalent to setting it to
#' "number of parameters"/"number of observations".
95 changes: 83 additions & 12 deletions R/brglmFit.R
@@ -69,14 +69,16 @@
#' iteration with the same stationary point.
#'
#' In the special case of generalized linear models for binomial,
#' Poisson and multinomial responses, the adjusted score equations
#' approach returns estimates with improved frequentist properties,
#' that are also always finite, even in cases where the maximum
#' likelihood estimates are infinite (e.g. complete and quasi-complete
#' separation in multinomial regression). See, Kosmidis and Firth
#' (2021) for a proof for binomial-response GLMs with Jeffreys-prior
#' penalties to the log-likelihood, which is equivalent to mean bias
#' reduction for logistic regression. See, also,
#' Poisson and multinomial responses, the adjusted score equation
#' approaches for \code{type = "AS_mixed"}, \code{type = "AS_mean"},
#' and \code{type = "AS_median"} (see below for what methods each
#' \code{type} corresponds) return estimates with improved frequentist
#' properties, that are also always finite, even in cases where the
#' maximum likelihood estimates are infinite (e.g. complete and
#' quasi-complete separation in multinomial regression). See, Kosmidis
#' and Firth (2021) for a proof for binomial-response GLMs with
#' Jeffreys-prior penalties to the log-likelihood, which is equivalent
#' to mean bias reduction for logistic regression. See, also,
#' \code{\link[detectseparation]{detect_separation}} and
#' \code{\link[detectseparation]{check_infinite_estimates}} for
#' pre-fit and post-fit methods for the detection of infinite
@@ -1031,7 +1033,7 @@ brglmFit <- function(x, y, weights = rep(1, nobs), start = NULL, etastart = NULL

## Convergence analysis
if ((failed | iter >= control$maxit) & !(is_correction)) {
warning("brglmFit: algorithm did not converge", call. = FALSE)
warning("brglmFit: algorithm did not converge. Try changing the optimization algorithm defaults, e.g. the defaults for one or more of `maxit`, `epsilon`, `slowit`, and `response_adjustment`; see `?brglm_control` for default values and available options", call. = FALSE)
converged <- FALSE
}
else {
@@ -1274,9 +1276,12 @@ summary.brglmFit <- function(object, dispersion = NULL,
dispersion <- object$dispersion
}
}
summary.glm(object, dispersion = dispersion,
correlation = correlation,
symbolic.cor = symbolic.cor, ...)
out <- summary.glm(object, dispersion = dispersion,
correlation = correlation,
symbolic.cor = symbolic.cor, ...)
out$type <- object$type
class(out) <- c("summary.brglmFit", class(out))
out
}
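For context, a sketch of how the estimator `type` values discussed in the documentation above are chosen in practice, assuming the `endometrial` data that ships with brglm2:

library("brglm2")
data("endometrial", package = "brglm2")
## Maximum likelihood: the NV estimate diverges (quasi-complete separation),
## so glm() reports a very large value with warnings
mod_ml <- glm(HG ~ NV + PI + EH, family = binomial("logit"), data = endometrial)
## Mean and median bias-reducing adjusted score equations give finite estimates
mod_mean <- update(mod_ml, method = "brglmFit", type = "AS_mean")
mod_median <- update(mod_ml, method = "brglmFit", type = "AS_median")
sapply(list(ML = mod_ml, AS_mean = mod_mean, AS_median = mod_median),
       function(m) coef(m)["NV"])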

#' Method for computing confidence intervals for one or more
@@ -1338,3 +1343,69 @@ DD <- function(expr,name, order = 1) {



## Almost all code is from stats:::print.summary.glm apart from minor modifications
#' @rdname summary.brglmFit
#' @method print summary.brglmFit
#' @export
print.summary.brglmFit <- function (x, digits = max(3L, getOption("digits") - 3L),
symbolic.cor = x$symbolic.cor,
signif.stars = getOption("show.signif.stars"), ...) {
cat("\nCall:\n", paste(deparse(x$call), sep = "\n", collapse = "\n"),
"\n\n", sep = "")
cat("Deviance Residuals: \n")
if (x$df.residual > 5) {
x$deviance.resid <- setNames(quantile(x$deviance.resid,
na.rm = TRUE), c("Min", "1Q", "Median", "3Q", "Max"))
}
xx <- zapsmall(x$deviance.resid, digits + 1L)
print.default(xx, digits = digits, na.print = "", print.gap = 2L)
if (length(x$aliased) == 0L) {
cat("\nNo Coefficients\n")
}
else {
df <- if ("df" %in% names(x))
x[["df"]]
else NULL
if (!is.null(df) && (nsingular <- df[3L] - df[1L]))
cat("\nCoefficients: (", nsingular, " not defined because of singularities)\n",
sep = "")
else cat("\nCoefficients:\n")
coefs <- x$coefficients
if (!is.null(aliased <- x$aliased) && any(aliased)) {
cn <- names(aliased)
coefs <- matrix(NA, length(aliased), 4L, dimnames = list(cn,
colnames(coefs)))
coefs[!aliased, ] <- x$coefficients
}
printCoefmat(coefs, digits = digits, signif.stars = signif.stars,
na.print = "NA", ...)
}
cat("\n(Dispersion parameter for ", x$family$family, " family taken to be ",
format(x$dispersion), ")\n\n", apply(cbind(paste(format(c("Null",
"Residual"), justify = "right"), "deviance:"), format(unlist(x[c("null.deviance",
"deviance")]), digits = max(5L, digits + 1L)), " on",
format(unlist(x[c("df.null", "df.residual")])), " degrees of freedom\n"),
1L, paste, collapse = " "), sep = "")
if (nzchar(mess <- naprint(x$na.action)))
cat(" (", mess, ")\n", sep = "")
cat("AIC: ", format(x$aic, digits = max(4L, digits + 1L)))
cat("\n\nType of estimator:", x$type, get_type_description(x$type))
cat("\n", "Number of Fisher Scoring iterations: ", x$iter, "\n", sep = "")
correl <- x$correlation
if (!is.null(correl)) {
p <- NCOL(correl)
if (p > 1) {
cat("\nCorrelation of Coefficients:\n")
if (is.logical(symbolic.cor) && symbolic.cor) {
print(symnum(correl, abbr.colnames = NULL))
}
else {
correl <- format(round(correl, 2L), nsmall = 2L,
digits = digits)
correl[!lower.tri(correl)] <- ""
print(correl[-1, -p, drop = FALSE], quote = FALSE)
}
}
}
invisible(x)
}
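With this print method registered, summary output for `brglmFit` objects now ends with the estimator type and the iteration count; an illustrative, abridged (not verbatim) sketch of the tail of the output for a mean bias-reduced fit:

## print(summary(mod_mean)) ends with lines like:
##
## Type of estimator: AS_mean (mean bias-reducing adjusted score equations)
## Number of Fisher Scoring iterations: <iterations used>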
15 changes: 9 additions & 6 deletions R/brmultinom.R
@@ -52,11 +52,12 @@
#' overlap, and showed that infinite maximum likelihood estimates
#' result when complete or quasi-complete separation occurs.
#'
#' The adjusted score approach to bias reduction that
#' \code{\link{brmultinom}} implements (\code{type = "AS_mean"}) is an
#' alternative to maximum likelihood that results in estimates with
#' smaller asymptotic bias that are also *always* finite, even in
#' cases of complete or quasi-complete separation.
#' The adjusted score approaches to bias reduction that
#' \code{\link{brmultinom}} implements for \code{type = "AS_mean"} and
#' \code{type = "AS_median"} are alternatives to maximum likelihood
#' that result in estimates with smaller asymptotic mean and median
#' bias, respectively, that are also *always* finite, even in cases of
#' complete or quasi-complete separation.
#'
#' \code{brmultinom} is a wrapper of \code{\link{brglmFit}} that fits
#' multinomial logit regression models through the 'Poisson trick' (see, for
@@ -393,7 +394,9 @@ print.summary.brmultinom <- function(x, digits = x$digits, ...)
}
cat("\nResidual Deviance:", format(x$deviance), "\n")
cat("Log-likelihood:", format(x$logLik), "\n")
cat("AIC:", format(x$AIC), "\n")
cat("AIC:", format(x$AIC))
cat("\n\nType of estimator:", x$type, get_type_description(x$type))
cat("\n", "Number of Fisher Scoring iterations: ", x$iter, "\n", sep = "")
if (!is.null(correl <- x$correlation)) {
p <- dim(correl)[2L]
if (p > 1) {
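A sketch of a bias-reduced multinomial logit fit of the kind described in the documentation above, assuming the `housing` data from MASS as used in the `brmultinom()` examples:

library("brglm2")
data("housing", package = "MASS")
## Median bias-reducing adjusted score equations
fit_med <- brmultinom(Sat ~ Infl + Type + Cont, weights = Freq,
                      data = housing, type = "AS_median")
summary(fit_med)  # the printed summary now also reports the estimator type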
13 changes: 3 additions & 10 deletions R/brnb.R
@@ -1001,16 +1001,9 @@ print.summary.brnb <- function(x, digits = max(3, getOption("digits") - 3), ...)
"deviance")]), digits = max(5, digits + 1)), " on",
format(unlist(x[c("df.null", "df.residual")])), " degrees of freedom\n"),
1, paste, collapse = " "), sep = "")
cat(paste("AIC:", round(x$aic,digits), "\n"))
cat("\nType of estimator:", x$type,
switch(x$type,
"ML" = "(maximum likelihood)",
"correction" = "(bias-corrected)",
"AS_mean" = "(mean bias-reduced)",
"AS_median" = "(median bias-reduced)",
"AS_mixed" = "(mixed bias-reduced)"
))
cat(paste("\nNumber of quasi-Fisher scoring iterations:", x$iter, "\n"))
cat(paste("AIC:", round(x$aic,digits)))
cat("\n\nType of estimator:", x$type, get_type_description(x$type))
cat("\n", "Number of quasi-Fisher scoring iterations:", x$iter, "\n", sep = "")
invisible(x)
}

15 changes: 15 additions & 0 deletions R/utils.R
@@ -22,3 +22,18 @@ unless_null <- function(x, if_null) {
}
}



get_type_description <- function(type, parenthesized = TRUE) {
pp <- function(txt) {
ifelse(parenthesized, paste0("(", txt, ")"), txt)
}
switch(type,
"ML" = pp("maximum likelihood"),
"correction" = pp("bias correction"),
"AS_mean" = pp("mean bias-reducing adjusted score equations"),
"AS_median" = pp("median bias-reducing adjusted score equations"),
"AS_mixed" = pp("mixed bias-reducing adjusted score equations"),
"MPL_Jeffreys" = pp("maximum penalized likelihood with Jeffreys'-prior penalty")
)
}
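For reference, the mapping the new helper produces, derived directly from the switch above; the helper is internal (not exported), so in an interactive check it would be reached via `:::` (output shown as comments):

brglm2:::get_type_description("AS_mean")
#> [1] "(mean bias-reducing adjusted score equations)"
brglm2:::get_type_description("ML", parenthesized = FALSE)
#> [1] "maximum likelihood"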
5 changes: 2 additions & 3 deletions README.Rmd
@@ -18,7 +18,7 @@ knitr::opts_chunk$set(
<!-- badges: start -->
[![CRAN_Status_Badge](http://www.r-pkg.org/badges/version/brglm2)](https://cran.r-project.org/package=brglm2)
[![R-CMD-check](https://github.com/ikosmidis/brglm2/workflows/R-CMD-check/badge.svg)](https://github.com/ikosmidis/brglm2/actions)
[![Codecov test coverage](https://codecov.io/gh/ikosmidis/brglm2/branch/master/graph/badge.svg)](https://codecov.io/gh/ikosmidis/brglm2?branch=master)
[![Codecov test coverage](https://codecov.io/gh/ikosmidis/brglm2/branch/master/graph/badge.svg)](https://app.codecov.io/gh/ikosmidis/brglm2?branch=master)
[![Licence](https://img.shields.io/badge/licence-GPL--3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0.en.html)
<!-- badges: end -->

@@ -158,8 +158,7 @@ implicit is as given in [Kosmidis

## References and resources

**brglm2** was presented by [Ioannis
Kosmidis](http://www.ikosmidis.com) at the useR! 2016 international
**brglm2** was presented by [Ioannis Kosmidis](https://www.ikosmidis.com) at the useR! 2016 international
conference at University of Stanford on 16 June 2016. The presentation
was titled "Reduced-bias inference in generalized linear models" and
can be watched online at this
26 changes: 11 additions & 15 deletions README.md
@@ -1,14 +1,13 @@
<!-- README.md is generated from README.Rmd. Please edit that file -->

brglm2 <img src="man/figures/hex_brglm2.svg" width="320" align="right">
=======================================================================
# brglm2 <img src="man/figures/hex_brglm2.svg" width="320" align="right">

<!-- badges: start -->

[![CRAN\_Status\_Badge](http://www.r-pkg.org/badges/version/brglm2)](https://cran.r-project.org/package=brglm2)
[![R-CMD-check](https://github.com/ikosmidis/brglm2/workflows/R-CMD-check/badge.svg)](https://github.com/ikosmidis/brglm2/actions)
[![Codecov test
coverage](https://codecov.io/gh/ikosmidis/brglm2/branch/master/graph/badge.svg)](https://codecov.io/gh/ikosmidis/brglm2?branch=master)
coverage](https://codecov.io/gh/ikosmidis/brglm2/branch/master/graph/badge.svg)](https://app.codecov.io/gh/ikosmidis/brglm2?branch=master)
[![Licence](https://img.shields.io/badge/licence-GPL--3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0.en.html)
<!-- badges: end -->

@@ -43,8 +42,7 @@ for logistic regression (and, for more general binomial-response models
where the likelihood is penalized by a power of the Jeffreys’ invariant
prior).

Installation
------------
## Installation

Install the current version from CRAN:

@@ -55,8 +53,7 @@ or the development version from github:
# install.packages("remotes")
remotes::install_github("ikosmidis/brglm2", ref = "develop")

Example
-------
## Example

Below we follow the example of [Heinze and Schemper
(2002)](https://doi.org/10.1002/sim.1047) and fit a probit regression
@@ -166,8 +163,7 @@ reduction and maximum penalized likelihood with Jeffreys’ prior penalty.
Also do not forget to take a look at the vignettes
(`vignette(package = "brglm2")`) for details and more case studies.

Solving adjusted score equations using quasi-Fisher scoring
-----------------------------------------------------------
## Solving adjusted score equations using quasi-Fisher scoring

The workhorse function in **brglm2** is
[`brglm_fit`](https://github.com/ikosmidis/brglm2/blob/master/R/brglmFit.R)
@@ -192,13 +188,13 @@ models.
The classification of bias reduction methods into explicit and implicit
is as given in [Kosmidis (2014)](https://doi.org/10.1002/wics.1296).

References and resources
------------------------
## References and resources

**brglm2** was presented by [Ioannis Kosmidis](http://www.ikosmidis.com)
at the useR! 2016 international conference at University of Stanford on
16 June 2016. The presentation was titled “Reduced-bias inference in
generalized linear models” and can be watched online at this
**brglm2** was presented by [Ioannis
Kosmidis](https://www.ikosmidis.com) at the useR! 2016 international
conference at University of Stanford on 16 June 2016. The presentation
was titled “Reduced-bias inference in generalized linear models” and can
be watched online at this
[link](https://channel9.msdn.com/Events/useR-international-R-User-conference/useR2016/brglm-Reduced-bias-inference-in-generalized-linear-models).

Motivation, details and discussion on the methods that **brglm2**
Expand Down
