Fix crash when there are no periodic segments to assign
This includes a simplification of how periodic segments are computed: they are now based on all local dates present in the data, independently of their time zones.
JulioV committed Apr 7, 2021
1 parent 9551669 commit 286d317
Showing 3 changed files with 22 additions and 11 deletions.
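The simplification described in the commit message can be sketched as follows (a toy illustration with hypothetical names, not the actual RAPIDS code): pool every distinct local date in the data, regardless of which time zone produced it, derive the periodic segment instances from that single pool, and return an empty result, rather than crashing, when the pool is empty.

```r
library(dplyr)
library(lubridate)

# Hypothetical helper: start date of every weekly (wday) segment that
# covers at least one local date in the data.
weekly_segment_starts <- function(local_dates){
  if(length(local_dates) == 0)
    return(as_date(character(0)))  # no dates -> no segments, instead of a crash
  tibble(local_date = as_date(local_dates)) %>%
    distinct(local_date) %>%
    # anchor each date to the Monday of its week
    mutate(segment_start = floor_date(local_date, unit = "week", week_start = 1)) %>%
    distinct(segment_start) %>%
    pull(segment_start)
}

# Dates collected under different time zones are pooled together:
# three dates collapse into two weekly segments (2021-04-05, 2021-04-12)
weekly_segment_starts(c("2021-04-05", "2021-04-06", "2021-04-12"))
```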
8 changes: 8 additions & 0 deletions docs/change-log.md
@@ -1,5 +1,13 @@
 # Change Log
 
+## v1.0.1
+- Fix crash in `chunk_episodes` of `utils.py` for multi time zone data
+- Fix crash in BT Doryab provider when the number of clusters is 2
+- Fix Fitbit multi time zone inference from phone data (simplify)
+- Fix missing columns when the input for phone data yield is empty
+- Fix wrong date time labels for event segments for multi time zone data (all labels are computed based on a single tz)
+- Fix periodic segment crash when there are no segments to assign (only affects wday, mday, qday, or yday)
+- Fix crash in Analysis Workflow with new suffix in segments' labels
 ## v1.0.0
 - Add a new [Overview](../setup/overview/) page.
 - You can [extend](../datastreams/add-new-data-streams/) RAPIDS with your own [data streams](../datastreams/data-streams-introduction/). Data streams are data collected with other sensing apps besides AWARE (like Beiwe, mindLAMP), and stored in other data containers (databases, files) besides MySQL.
6 changes: 4 additions & 2 deletions src/data/datetime/assign_to_multiple_timezones.R
@@ -148,7 +148,8 @@ multiple_time_zone_assignment <- function(sensor_data, timezone_parameters, devi
       mutate(data = map2(data, tzcode, function(nested_data, tz){
         nested_data %>% mutate(local_date_time = format(as_datetime(timestamp / 1000, tz=tz), format="%Y-%m-%d %H:%M:%S"))
       })) %>%
-      unnest(cols=everything())
+      unnest(cols=everything()) %>%
+      ungroup()
   }
 
   tz_intervals <- buils_tz_intervals(data_tz_codes, device_type)
@@ -159,7 +160,8 @@ multiple_time_zone_assignment <- function(sensor_data, timezone_parameters, devi
       group_by(device_id) %>%
       nest() %>%
       mutate(data = map2(data, device_id, assign_tz_code, tz_intervals, device_type)) %>%
-      unnest(cols = data)
+      unnest(cols = data) %>%
+      ungroup()
   }
 
   return(sensor_data)
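Both hunks in this file append `ungroup()` after `unnest()`. A minimal standalone sketch (toy data, not the RAPIDS schema) of why this matters: `group_by() %>% nest()` leaves the result grouped, and a grouped tibble can silently change the behavior of later verbs such as `mutate()` or `summarise()` downstream.

```r
library(dplyr)
library(tidyr)
library(purrr)

toy <- tibble(device_id = c("a", "a", "b"), timestamp = c(1, 2, 3))

processed <- toy %>%
  group_by(device_id) %>%
  nest() %>%
  mutate(data = map(data, ~ mutate(.x, ts_ms = timestamp * 1000))) %>%
  unnest(cols = data) %>%
  ungroup()   # without this, `processed` would still be grouped by device_id

is_grouped_df(processed)  # FALSE after ungroup()
```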
19 changes: 10 additions & 9 deletions src/data/datetime/readable_datetime.R
@@ -61,8 +61,8 @@ create_mising_temporal_column <- function(data, device_type){
       mutate(data = map2(data, local_timezone, function(nested_data, tz){
         return(nested_data %>% mutate(timestamp = as.numeric(ymd_hms(local_date_time, tz=tz)) * 1000) %>% drop_na(timestamp))
       })) %>%
-      unnest(cols = everything())) %>%
-      ungroup()
+      unnest(cols = everything()) %>%
+      ungroup())
   } else {
     # For the rest of devices we infere local date time from timestamp
     if(nrow(data) == 0)
@@ -73,8 +73,8 @@ create_mising_temporal_column <- function(data, device_type){
       mutate(data = map2(data, local_timezone, function(nested_data, tz){
         return(nested_data %>% mutate(local_date_time = format(as_datetime(timestamp / 1000, tz=tz), format="%Y-%m-%d %H:%M:%S")) %>% drop_na(local_date_time) )
       })) %>%
-      unnest(cols = everything())) %>%
-      ungroup()
+      unnest(cols = everything()) %>%
+      ungroup())
   }
 }

@@ -120,11 +120,12 @@ readable_datetime <- function(){
     most_common_tz <- get_participant_most_common_tz(timezone_parameters$MULTIPLE$TZCODES_FILE, participant_file) # in assign_to_multiple_timezones.R
   }
 
-  output <- create_mising_temporal_column(output, device_type)
-  output <- split_local_date_time(output)
-  output <- assign_to_time_segment(output, time_segments, time_segments_type, include_past_periodic_segments, most_common_tz)
-  output <- filter_wanted_dates(output, participant_file, device_type)
-  output <- output %>% arrange(timestamp)
+  output %<>%
+    create_mising_temporal_column(device_type) %>%
+    split_local_date_time() %>%
+    assign_to_time_segment(time_segments, time_segments_type, include_past_periodic_segments, most_common_tz) %>%
+    filter_wanted_dates(participant_file, device_type) %>%
+    arrange(timestamp)
 
   write_csv(output, snakemake@output[[1]])
 }
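The last hunk replaces five separate `output <- f(output, ...)` assignments with magrittr's compound assignment pipe `%<>%`, which pipes a variable through a chain and assigns the result back to it. A minimal sketch of the equivalence:

```r
library(magrittr)

x <- c(3, 1, 2)
x %<>% sort()   # shorthand for: x <- x %>% sort()
x               # c(1, 2, 3)
```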
