Devex: Resolve merge conflicts dev #143

Merged: 88 commits, Nov 20, 2024

Changes from all commits
e5e4da4
Add primary key field to tables
jthompson-arcus Oct 16, 2024
1cffcdf
Temporary fix to preserve current functionality
jthompson-arcus Oct 16, 2024
ec0c753
Re-create `review_testdb.sqlite`
jthompson-arcus Oct 16, 2024
193f05d
Repair `mod_review_form` tests
jthompson-arcus Oct 16, 2024
58435dc
Repair `fct_SQLite` tests
jthompson-arcus Oct 16, 2024
8f59b04
Rebuild `testdb.sqlite`
jthompson-arcus Oct 16, 2024
ec868d8
Temporary fix for adding queries to work as expected
jthompson-arcus Oct 16, 2024
804f7de
Add preliminary logging table
jthompson-arcus Oct 16, 2024
48314ae
Add "all_review_data_log" to list of expected tables
jthompson-arcus Oct 16, 2024
0b2eb31
Update `db_save_review` to update instead of append
jthompson-arcus Oct 28, 2024
37a7c11
Repair tests from `db_save_review()` changes
jthompson-arcus Oct 28, 2024
91a8c49
Merge branch 'dev' into jt-113-logging_tables
jthompson-arcus Oct 29, 2024
f98f0d2
Only store old records in logging table
jthompson-arcus Oct 29, 2024
9e55193
Modify update process to only update/append appropriate records
jthompson-arcus Oct 29, 2024
f66f758
Actually pull the newer record when updated
jthompson-arcus Oct 29, 2024
2ad8cc7
Add `db_version` table to DB
jthompson-arcus Oct 30, 2024
75ad1d1
Add `edit_date_time` to logging table
jthompson-arcus Oct 30, 2024
4c8e2e5
Update all columns not considered in common
jthompson-arcus Oct 30, 2024
3f32584
Correct error in creating primary key tables
jthompson-arcus Oct 30, 2024
b81500d
Update `fct_SQLite.R` tests to reflect changes
jthompson-arcus Oct 30, 2024
d268d14
Update test DB
jthompson-arcus Oct 30, 2024
1cb9afa
Incorporate review logging tests into `test-fct_SQLite.R`
jthompson-arcus Oct 30, 2024
c6dace0
Add logging tests to `test-mod_review_forms.R`
jthompson-arcus Oct 30, 2024
7effc4c
Add index to `all_review_data` table
jthompson-arcus Oct 30, 2024
cd5d630
Utilize UPSERT for DB append/update procedure
jthompson-arcus Oct 31, 2024
564d0f9
Make UPSERT process more robust
jthompson-arcus Oct 31, 2024
af8f76c
Add function `update_db_version()` to migrate old DBs
jthompson-arcus Oct 31, 2024
f2ef3a6
Update version and NEWS
jthompson-arcus Oct 31, 2024
2c633d2
Fix warning message popping up in tests
jthompson-arcus Oct 31, 2024
f299d4a
Resolve merge conflicts with `dev`
jthompson-arcus Nov 4, 2024
63dc37d
Resolve merge conflicts with `dev`
jthompson-arcus Nov 5, 2024
0e6b4af
Update version number
jthompson-arcus Nov 5, 2024
c9b8cf8
Resolve merge conflicts with `dev`
jthompson-arcus Nov 5, 2024
abe5703
Revert changes to `mod_review_data_fct_helpers.R`
jthompson-arcus Nov 7, 2024
9c4dc72
Update internal documentation
jthompson-arcus Nov 7, 2024
62aa39d
Add index columns and db version to internal data
jthompson-arcus Nov 7, 2024
8e7f752
Create helper function to add new tables
jthompson-arcus Nov 7, 2024
7b7114d
Oopsies
jthompson-arcus Nov 7, 2024
6e65f04
Update UPSERT documentation
jthompson-arcus Nov 7, 2024
4af9905
Ignore `.Renviron` on git
jthompson-arcus Nov 8, 2024
7af507c
Initialize `db_get_table()`
jthompson-arcus Nov 8, 2024
2191a04
Use `db_get_table()` instead of `db_slice_rows()` to retrieve review …
jthompson-arcus Nov 8, 2024
0582b3b
Use `db_get_table()` instead of `db_slice_rows()` in tests
jthompson-arcus Nov 8, 2024
b370ff5
Remove unneeded slicing and filtering
jthompson-arcus Nov 8, 2024
f0837f2
Capture second snapshot
jthompson-arcus Nov 8, 2024
27c9220
Add comment for `idx_cols`
jthompson-arcus Nov 11, 2024
d1c81ea
Use unique names in function arguments from internal package objects
jthompson-arcus Nov 11, 2024
f271502
Merge pull request #115 from openpharma/jt-113-logging_tables
LDSamson Nov 11, 2024
cea4d43
Merge branch 'dev' into jt-113-simplify_review_process
jthompson-arcus Nov 11, 2024
3e1a650
Add trigger for update on id
jthompson-arcus Nov 11, 2024
851a5ba
Check DB version at run time
jthompson-arcus Nov 11, 2024
485f464
Update NEWS and version
jthompson-arcus Nov 11, 2024
dfdf945
Decode shinyproxy user name with base64 encoding, and return a warnin…
LDSamson Nov 1, 2024
dda0b05
Update news and version
LDSamson Nov 4, 2024
d56c4b1
Add a check if the package base64enc is installed within the `decode_…
LDSamson Nov 4, 2024
bab7a92
Update test database so a form with different review status per item …
LDSamson Nov 7, 2024
a37ff1d
Update mod_review_forms logic to ensure review was saved successfully
LDSamson Nov 7, 2024
7ac3278
Update mod_review_forms tests
LDSamson Nov 13, 2024
29eb7a0
Add all common vars to rows_update, so that review status is only upd…
LDSamson Nov 13, 2024
9940944
Update news and description
LDSamson Nov 15, 2024
bafae70
Resolve R CMD documentation note
LDSamson Nov 15, 2024
d21f519
Mark functions as internal for package website
LDSamson Nov 15, 2024
7efa574
Increase wait time to pass tests within Alpine Docker
LDSamson Nov 15, 2024
9d8555b
Update Alpine Docker test results
LDSamson Nov 15, 2024
0e266d7
Use `DBI` directly instead of `tbl()` and `collect()`
jthompson-arcus Nov 18, 2024
1405702
Fix typo in `user_db` version check
jthompson-arcus Nov 18, 2024
240c4e8
Only update records with new status.
jthompson-arcus Nov 18, 2024
8085287
Remove argument no longer used
jthompson-arcus Nov 18, 2024
a797883
Remove `common_vars` from tests
jthompson-arcus Nov 18, 2024
20f9528
Repair tests with `db_get_table()` returning `data.frame` not `tibble`
jthompson-arcus Nov 18, 2024
6fb21a6
Update `review_testdb.sqlite`
jthompson-arcus Nov 18, 2024
2a7b9e9
Update `mod_review_forms` test with new testing DB
jthompson-arcus Nov 18, 2024
c6ef19c
Update `all_review_data_id_update_trigger` to include `idx_cols`
jthompson-arcus Nov 18, 2024
1a85f40
Add `keys` argument to `db_add_log()`
jthompson-arcus Nov 18, 2024
c427f2d
Resolve some R CMD check issues
jthompson-arcus Nov 18, 2024
5d52f83
Add pkgdown gha workflow
aclark02-arcus Oct 1, 2024
bbf2b53
don't rely on renv.lock
aclark02-arcus Oct 1, 2024
e9ed1a4
update news
aclark02-arcus Oct 1, 2024
734f9ca
Update pkgdown workflow for use on dev and devex
LDSamson Nov 19, 2024
92677b0
Small NEWS.md wording update
jthompson-arcus Nov 19, 2024
573c633
Merge pull request #138 from openpharma/ls_create_v0.1.1
jthompson-arcus Nov 19, 2024
0e8b655
Resolve merge conflicts with `main`
jthompson-arcus Nov 19, 2024
a9488cc
small alignment fix
LDSamson Nov 20, 2024
906daf8
Merge pull request #142 from openpharma/jt-resolve_merge_conflicts_main
LDSamson Nov 20, 2024
b8e598e
Merge branch 'dev' into jt-113-simplify_review_process
LDSamson Nov 20, 2024
f57b549
Improve alignment according to https://style.tidyverse.org/syntax.htm…
LDSamson Nov 20, 2024
740a10a
Merge pull request #135 from openpharma/jt-113-simplify_review_process
LDSamson Nov 20, 2024
993035f
Resolve merge conflicts with `dev`
jthompson-arcus Nov 20, 2024
1 change: 1 addition & 0 deletions .gitignore
@@ -2,6 +2,7 @@
.Rhistory
.RData
.Ruserdata
.Renviron
*.html
*.tmp
~$*
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -1,6 +1,6 @@
Package: clinsight
Title: ClinSight
Version: 0.1.0.9008
Version: 0.1.0.9009
DevexVersion: 9000
Authors@R: c(
person("Leonard Daniël", "Samson", , "[email protected]", role = c("cre", "aut"),
Expand Down
15 changes: 13 additions & 2 deletions NEWS.md
@@ -2,16 +2,27 @@

## Changed

- Added `pkgdown` GHA workflow to automatically update documentation site with PRs & pushes to `main` and `dev`
- Generalized `merge_meta_with_data()` to allow user-defined processing functions.
- Added a feature where, in applicable tables, a user can navigate to a form by double-clicking a table row.
- Fixed warnings in `apply_edc_specific_changes` due to the use of a vector within `dplyr::select`.
- Gave users the ability to re-organize the column order in any table.
- Added form type as a class to be used in `create_table()` to display tables.
- Added a logging table to the DB for reviews.
- Simplified pulling data from the DB for reviews.

## Bug fixes

- When using the `shinyproxy` deployment configuration, the user name is now expected to be base64 encoded, and will now be base64 encoded by `clinsight` by default, so that the app can also handle non-ASCII signs in user names that are stored in HTTP headers. To display the user name correctly, use base64 encoding in the `application.yml` in ShinyProxy settings (for example: `http-headers.X_SP_USERNAME: "#{T(java.util.Base64).getEncoder().encodeToString(oidcUser.getFullName().getBytes())}"`).

# clinsight 0.1.1

## Changed

- Added `pkgdown` GHA workflow to automatically update documentation site with pushes to `main`

## Bug fixes

- Fixed inconsistencies in app messages when saving a review for a form with items with different review states (with some items reviewed previously by a different reviewer, and some items being completely new).
- Fixed a bug where clinsight deployed with `shinyproxy` would crash when a user with non-ASCII letters in their name would attempt to login. In this new version, when using the `shinyproxy` deployment configuration, the user name is now expected to be base64 encoded, and will now be base64 encoded by `clinsight` by default, so that the app can also handle non-ASCII signs in user names that are stored in HTTP headers. To display the user name correctly, use base64 encoding in the `application.yml` in ShinyProxy settings (for example: `http-headers.X_SP_USERNAME: "#{T(java.util.Base64).getEncoder().encodeToString(oidcUser.getFullName().getBytes())}"`).

## `devex` changes
- Added `Excel` download button to Queries table & patient listings that need review.
4 changes: 2 additions & 2 deletions R/app_server.R
@@ -9,7 +9,7 @@
#' the `header widgets` ([mod_header_widgets_server()]), and the `query page`
#' ([mod_queries_server()])
#'
#' @param input,output,session Internal parameters for {shiny}.
#' @param input,output,session Internal parameters for `shiny`.
#' @seealso [app_ui()], [run_app()]
#'
app_server <- function(
@@ -58,7 +58,7 @@ app_server <- function(
)
# think of using the pool package, but functions such as row_update are not yet supported.
r <- reactiveValues(
review_data = db_slice_rows(user_db, db_table = "all_review_data"),
review_data = db_get_table(user_db, db_table = "all_review_data"),
query_data = collect_query_data(user_db),
filtered_subjects = app_vars$subject_id,
filtered_data = app_data,
263 changes: 248 additions & 15 deletions R/fct_SQLite.R
@@ -91,17 +91,131 @@ db_create <- function(
status = status
)

new_data <- list(
new_pk_data <- list(
"all_review_data" = df,
"query_data" = query_data_skeleton,
"db_synch_time" = data.frame(synch_time = data_synch_time)
"query_data" = query_data_skeleton
)
idx_pk_cols <- list(
all_review_data = idx_cols
)
other_data <- list(
"db_synch_time" = data.frame(synch_time = data_synch_time),
"db_version" = data.frame(version = db_version)
)
con <- get_db_connection(db_path)
for(i in names(new_data)){
db_add_tables(con, new_pk_data, idx_pk_cols, other_data)
cat("Finished writing to database\n\n")
}

#' Add new tables to DB
#'
#' @param con A DBI Connection to the SQLite DB
#' @param pk_data A named list of data frames to add a primary key field to DB
#' table. Names will correspond to the DB table names.
#' @param unique_cols A named list of the fields defining unique records for a
#' table. Names will correspond to the table to apply the index constraint.
#' @param other_data A named list of other data frames to add to the DB. Names
#' will correspond to the DB table names.
#'
#' @keywords internal
db_add_tables <- function(con, pk_data, unique_cols, other_data) {
for(i in names(pk_data)){
cat("\nCreating new table: ", i, "\n")
DBI::dbWriteTable(con, i, new_data[[i]])
db_add_primary_key(con, i, pk_data[[i]], unique_cols[[i]])
}
cat("Finished writing to database\n\n")
for(i in names(other_data)){
cat("\nCreating new table: ", i, "\n")
DBI::dbWriteTable(con, i, other_data[[i]])
}
cat("\nCreating log table: all_review_data_log\n")
db_add_log(con)
}

#' Add primary key field
#'
#' @param con A DBI Connection to the SQLite DB
#' @param name The table name
#' @param value A data.frame to add to the table
#' @param keys A character vector specifying which columns define a unique row
#' for the table. If `NULL`, no unique index will be created.
#'
#' @keywords internal
db_add_primary_key <- function(con, name, value, keys = NULL) {
fields <- c(id = "INTEGER PRIMARY KEY AUTOINCREMENT", DBI::dbDataType(con, value))
DBI::dbCreateTable(con, name, fields)
if (!is.null(keys)) {
all_keys <- paste(keys, collapse = ", ")
rs <- DBI::dbSendStatement(
con,
sprintf("CREATE UNIQUE INDEX idx_%1$s ON %1$s (%2$s)", name, all_keys)
)
DBI::dbClearResult(rs)
}
DBI::dbAppendTable(con, name, value)
}
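The primary-key pattern above pairs an `AUTOINCREMENT` surrogate key with a `UNIQUE` index over the natural-key columns, so duplicate natural keys are rejected at the DB level. The package code is R (DBI/RSQLite); the sketch below illustrates the same SQL with Python's built-in `sqlite3`, using simplified, hypothetical column names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Surrogate primary key, as in db_add_primary_key():
con.execute(
    "CREATE TABLE all_review_data ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  subject_id TEXT, item_name TEXT, reviewed TEXT)"
)
# Unique index over the natural-key columns (the `keys` argument):
con.execute(
    "CREATE UNIQUE INDEX idx_all_review_data "
    "ON all_review_data (subject_id, item_name)"
)
con.execute(
    "INSERT INTO all_review_data (subject_id, item_name, reviewed) "
    "VALUES ('S1', 'weight', 'No')"
)
# A second row with the same natural key violates the unique index:
try:
    con.execute(
        "INSERT INTO all_review_data (subject_id, item_name, reviewed) "
        "VALUES ('S1', 'weight', 'Yes')"
    )
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The unique index is what later makes `ON CONFLICT` upserts possible: SQLite needs a declared uniqueness constraint to resolve conflicts against.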

#' Add Logging Table
#'
#' Creates both the logging table and the trigger that populates it for
#' all_review_data.
#'
#' @param con A DBI Connection to the SQLite DB
#' @param keys A character vector specifying which columns should not be updated
#' in a table. Defaults to 'id' and the package-defined index columns
#' (`idx_cols`).
#'
#' @keywords internal
db_add_log <- function(con, keys = c("id", idx_cols)) {
stopifnot(is.character(keys))
all_keys <- paste(keys, collapse = ", ")
stopifnot("'keys' parameter cannot be empty" = nchar(all_keys) > 0)

DBI::dbCreateTable(
con,
"all_review_data_log",
c(
id = "INTEGER PRIMARY KEY AUTOINCREMENT",
review_id = "INTEGER NOT NULL",
edit_date_time = "CHAR",
reviewed = "CHAR",
comment = "CHAR",
reviewer = "CHAR",
timestamp = "CHAR",
status = "CHAR",
dml_type = "CHAR NOT NULL",
dml_timestamp = "DATETIME DEFAULT CURRENT_TIMESTAMP"
)
)
  # This trigger fires before any UPDATE of the protected key columns on
  # all_review_data. Instead of allowing them to change, it raises an error.
rs <- DBI::dbSendStatement(con, paste(
"CREATE TRIGGER all_review_data_id_update_trigger",
sprintf("BEFORE UPDATE OF %s ON all_review_data", all_keys),
"BEGIN",
sprintf("SELECT RAISE(FAIL, 'Fields %s are read only');", all_keys),
"END"
))
DBI::dbClearResult(rs)
rs <- DBI::dbSendStatement(con, paste(
"CREATE TRIGGER all_review_data_update_log_trigger",
"AFTER UPDATE ON all_review_data FOR EACH ROW",
"BEGIN",
"INSERT INTO all_review_data_log (",
"review_id, edit_date_time, reviewed, comment, reviewer, timestamp, status, dml_type",
")",
"VALUES(",
"NEW.id,",
"OLD.edit_date_time,",
"OLD.reviewed,",
"OLD.comment,",
"OLD.reviewer,",
"OLD.timestamp,",
"OLD.status,",
"'UPDATE'",
");",
"END"
))
DBI::dbClearResult(rs)
}
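Two triggers do the work here: a `BEFORE UPDATE OF … RAISE(FAIL, …)` guard that makes the key columns read-only, and an `AFTER UPDATE` trigger that copies the `OLD` values into the log before they are lost. A minimal illustration of both mechanisms, again in Python's `sqlite3` with hypothetical table names (not the package's R code):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE t (id INTEGER PRIMARY KEY, reviewed TEXT);
CREATE TABLE t_log (review_id INTEGER, reviewed TEXT,
                    dml_timestamp DATETIME DEFAULT CURRENT_TIMESTAMP);
-- Read-only guard, like all_review_data_id_update_trigger:
CREATE TRIGGER t_id_guard BEFORE UPDATE OF id ON t
BEGIN SELECT RAISE(FAIL, 'id is read only'); END;
-- Audit trail, like all_review_data_update_log_trigger: OLD values
-- are copied into the log before being overwritten.
CREATE TRIGGER t_update_log AFTER UPDATE ON t FOR EACH ROW
BEGIN INSERT INTO t_log (review_id, reviewed)
      VALUES (NEW.id, OLD.reviewed); END;
""")
con.execute("INSERT INTO t (reviewed) VALUES ('No')")

# Normal update: succeeds, and the pre-update value lands in the log.
con.execute("UPDATE t SET reviewed = 'Yes' WHERE id = 1")
old = con.execute("SELECT reviewed FROM t_log").fetchone()[0]
print(old)  # the pre-update value, 'No'

# Touching the guarded column is rejected by RAISE(FAIL, ...):
try:
    con.execute("UPDATE t SET id = 99")
except sqlite3.DatabaseError as e:
    print("blocked:", e)
```

Because the guard uses `FAIL` rather than `ROLLBACK`, only the offending statement is aborted; earlier changes in the same transaction survive.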

#' Update app database
@@ -153,7 +267,7 @@ db_update <- function(
update_time = data_synch_time
)
cat("writing updated review data to database...\n")
DBI::dbWriteTable(con, "all_review_data", updated_review_data, append = TRUE)
db_upsert(con, updated_review_data, common_vars)
DBI::dbWriteTable(
con,
"db_synch_time",
@@ -163,6 +277,41 @@
cat("Finished updating review data\n")
}

#' UPSERT to all_review_data
#'
#' Performs an UPSERT on all_review_data. New records will be appended to the
#' table. Changed/updated records will be applied to the table based on the
#' index column constraint.
#'
#' @param con A DBI Connection to the SQLite DB
#' @param data A data frame containing the data to UPSERT into all_review_data
#' @param idx_cols A character vector specifying which columns define a
#' unique index for a row
#'
#' @return Invisibly returns `TRUE`. Run for its side effects on the DB.
#'
#' @keywords internal
db_upsert <- function(con, data, idx_cols) {
if ("id" %in% names(data))
data$id <- NULL
cols_to_update <- names(data)[!names(data) %in% idx_cols]
cols_to_insert <- names(data) |>
paste(collapse = ", ")
constraint_cols <- paste(idx_cols, collapse = ", ")
dplyr::copy_to(con, data, "row_updates")
rs <- DBI::dbSendStatement(con, paste(
"INSERT INTO",
"all_review_data",
sprintf("(%s)", cols_to_insert),
sprintf("SELECT %s FROM row_updates WHERE true", cols_to_insert),
"ON CONFLICT",
sprintf("(%s)", constraint_cols),
"DO UPDATE SET",
sprintf("%1$s = excluded.%1$s", cols_to_update) |> paste(collapse = ", ")
))
DBI::dbClearResult(rs)
}
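The generated statement is SQLite's `INSERT … ON CONFLICT … DO UPDATE` (UPSERT, available since SQLite 3.24): rows whose index columns are new are inserted, rows that collide with the unique index have their non-key columns overwritten from the `excluded` pseudo-table. A sketch of the same SQL via Python's `sqlite3` (illustration only; column names are simplified):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE all_review_data ("
    "subject_id TEXT, item_name TEXT, reviewed TEXT, "
    "UNIQUE (subject_id, item_name))"
)
con.execute("INSERT INTO all_review_data VALUES ('S1', 'weight', 'No')")

# UPSERT: new keys append, existing keys update in place. The
# 'WHERE true' is required by SQLite's grammar when the INSERT body is
# a SELECT feeding an ON CONFLICT clause (it disambiguates the parse),
# which is why db_upsert() emits it.
con.execute("""
INSERT INTO all_review_data (subject_id, item_name, reviewed)
SELECT 'S1', 'weight', 'Yes' WHERE true
ON CONFLICT (subject_id, item_name)
DO UPDATE SET reviewed = excluded.reviewed
""")
rows = con.execute("SELECT * FROM all_review_data").fetchall()
print(rows)  # still one row: [('S1', 'weight', 'Yes')]
```

In the package's R version, the candidate rows are first staged in a temporary `row_updates` table via `dplyr::copy_to()`, and the `SELECT` reads from that table instead of literal values.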


#' Save review in database
#'
@@ -175,7 +324,6 @@ db_update <- function(
#' @param db_path Character vector. Path to the database.
#' @param tables Character vector. Names of the tables within the database to
#' save the review in.
#' @param common_vars A character vector containing the common key variables.
#' @param review_by A character vector, containing the key variables to perform
#' the review on. For example, the review can be performed on form level
#' (writing the same review to all items in a form), or on item level, with a
Expand All @@ -189,8 +337,6 @@ db_save_review <- function(
rv_row,
db_path,
tables = c("all_review_data"),
common_vars = c("subject_id", "event_name", "item_group",
"form_repeat", "item_name"),
review_by = c("subject_id", "item_group")
){
stopifnot(is.data.frame(rv_row))
@@ -214,14 +360,21 @@
warning("Review state unaltered. No review will be saved.")
)}
new_review_rows <- new_review_rows |>
db_slice_rows(slice_vars = c("timestamp", "edit_date_time"), group_vars = common_vars) |>
dplyr::select(-dplyr::all_of(cols_to_change)) |>
# If there are multiple edits, make sure to only select the latest editdatetime for all items:
# dplyr::slice_max(edit_date_time, by = dplyr::all_of(common_vars)) |>
dplyr::bind_cols(rv_row[cols_to_change]) # bind_cols does not work in a db connection.
cat("write updated review data to database\n")
lapply(tables, \(x){DBI::dbWriteTable(db_con, x, new_review_rows, append = TRUE)}) |>
invisible()
dplyr::copy_to(db_con, new_review_rows, "row_updates")
rs <- DBI::dbSendStatement(db_con, paste(
"UPDATE",
tables,
"SET",
sprintf("%1$s = row_updates.%1$s", cols_to_change) |> paste(collapse = ", "),
"FROM",
"row_updates",
"WHERE",
sprintf("%s.id = row_updates.id", tables)
))
DBI::dbClearResult(rs)
cat("finished writing to the tables:", tables, "\n")
}

@@ -369,3 +522,83 @@ db_get_review <- function(
dplyr::as_tibble()
})
}

db_get_version <- function(db_path) {
stopifnot(file.exists(db_path))
con <- get_db_connection(db_path)
tryCatch({
DBI::dbGetQuery(con, "SELECT version FROM db_version") |>
unlist(use.names = FALSE)
},
error = \(e) {""}
)
}

update_db_version <- function(db_path, version = "1.1") {
stopifnot(file.exists(db_path))
version <- match.arg(version)
temp_path <- withr::local_tempfile(fileext = ".sqlite")
file.copy(db_path, temp_path)
con <- get_db_connection(temp_path)

current_version <- tryCatch({
DBI::dbGetQuery(con, "SELECT version FROM db_version") |>
unlist(use.names = FALSE)}, error = \(e){""})
if(identical(current_version, db_version)) return("Database up to date. No update needed")

review_skeleton <- DBI::dbGetQuery(con, "SELECT * FROM all_review_data LIMIT 0")
rs <- DBI::dbSendQuery(con, "ALTER TABLE all_review_data RENAME TO all_review_data_old")
DBI::dbClearResult(rs)
rs <- DBI::dbSendQuery(con, "ALTER TABLE query_data RENAME TO query_data_old")
DBI::dbClearResult(rs)

new_pk_data <- list(
"all_review_data" = review_skeleton,
"query_data" = query_data_skeleton
)
idx_pk_cols <- list(
all_review_data = idx_cols
)
other_data <- list(
"db_version" = data.frame(version = db_version)
)
db_add_tables(con, new_pk_data, idx_pk_cols, other_data)

query_cols <- paste(names(query_data_skeleton), collapse = ", ")
cat("\nInserting old query records into new table.\n")
rs <- DBI::dbSendStatement(con, sprintf("INSERT INTO query_data (%1$s) SELECT %1$s FROM query_data_old", query_cols))
DBI::dbClearResult(rs)

stopifnot(DBI::dbGetQuery(con, "SELECT COUNT(*) FROM query_data") ==
DBI::dbGetQuery(con, "SELECT COUNT(*) FROM query_data_old"))

rs <- DBI::dbSendStatement(con, "DROP TABLE query_data_old")
DBI::dbClearResult(rs)

cat("\nInserting old review records into new tables.\n")
cols_to_update <- names(review_skeleton)[!names(review_skeleton) %in% idx_pk_cols$all_review_data]
cols_to_insert <- names(review_skeleton) |>
paste(collapse = ", ")
upsert_statement <- paste(
"INSERT INTO",
"all_review_data",
sprintf("(%s)", cols_to_insert),
sprintf("SELECT %s FROM all_review_data_old WHERE true", cols_to_insert),
"ON CONFLICT",
sprintf("(%s)", paste(idx_pk_cols$all_review_data, collapse = ", ")),
"DO UPDATE SET",
sprintf("%1$s = excluded.%1$s", cols_to_update) |> paste(collapse = ", ")
)
rs <- DBI::dbSendStatement(con, upsert_statement)
DBI::dbClearResult(rs)

stopifnot(DBI::dbGetQuery(con, "SELECT COUNT(*) FROM all_review_data") +
DBI::dbGetQuery(con, "SELECT COUNT(*) FROM all_review_data_log") ==
DBI::dbGetQuery(con, "SELECT COUNT(*) FROM all_review_data_old"))

rs <- DBI::dbSendStatement(con, "DROP TABLE all_review_data_old")
DBI::dbClearResult(rs)

file.copy(temp_path, db_path, overwrite = TRUE)
cat("Finished updating to new database standard\n\n")
}
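`update_db_version()` follows the standard SQLite schema-migration recipe: work on a copy of the file, `ALTER TABLE … RENAME` the old table aside, create the new schema, copy rows across, sanity-check the row counts with `stopifnot()`, drop the old table, and only then overwrite the original file. A compact, hypothetical sketch of that recipe in Python's `sqlite3` (single table, simplified schema; not the package code):

```python
import os
import shutil
import sqlite3

def migrate(db_path):
    """Migrate `data` to a schema with a surrogate primary key, safely."""
    tmp = db_path + ".migrating"
    shutil.copy(db_path, tmp)          # never touch the original until done
    con = sqlite3.connect(tmp)
    con.execute("ALTER TABLE data RENAME TO data_old")
    con.execute(
        "CREATE TABLE data ("
        "id INTEGER PRIMARY KEY AUTOINCREMENT, value TEXT)"
    )
    con.execute("INSERT INTO data (value) SELECT value FROM data_old")
    new_n = con.execute("SELECT COUNT(*) FROM data").fetchone()[0]
    old_n = con.execute("SELECT COUNT(*) FROM data_old").fetchone()[0]
    assert new_n == old_n              # refuse to proceed if rows were lost
    con.execute("DROP TABLE data_old")
    con.commit()
    con.close()
    shutil.copy(tmp, db_path)          # atomic-ish swap back
    os.remove(tmp)
```

Doing everything on a temp copy (the R code uses `withr::local_tempfile()`) means a failed migration leaves the production database untouched; the original is only overwritten after every check passes.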
2 changes: 1 addition & 1 deletion R/fct_data_helpers.R
@@ -391,7 +391,7 @@ get_form_level_data <- function(
#' @param value_column A string containing the column name with the item values.
#' @param id_column The columns identifying a unique participant (subject_id).
#'
#' @return as data frame with an additional column named "base_{varname}".
#' @return a data frame with an additional column named "base_`varname`".
#' @export
#' @examples
#' library(dplyr)