Make consistent across codebase when a child has "done" a study #1322

Open
msheskin opened this issue Dec 3, 2023 · 1 comment
msheskin commented Dec 3, 2023

Description
The rule we have shared with researchers is that a child is marked as having "done" a study when they click into it (via either the "Participate now!" button or the "Schedule a time to participate" button). However, it turns out that this is not consistent across the codebase! According to some quick investigating by Melissa, a child being "done" with a study is defined "slightly differently, in different parts of the codebase written by different people in different years". We should make the definition of "done" consistent across the codebase, and likely update the rule we shared with researchers (described above) to match.

How to reproduce
A good example is a child who has done "Introduction to Project GARDEN!" and "Journey to the GARDEN Library 1" on https://lookit.mit.edu/studies/garden/. These two "studies" are actually just watching a video (and do not contain other common study features; for example, they do not contain consent). No matter how many times a family does these studies, they are never treated as done. This leads to two problems. The smaller problem: the studies remain visible even when the parent selects "Hide Studies We've Done". The larger problem: families still receive recruitment emails for these studies, which confuses parents AND wastes recruitment emails that could instead go to studies that are new to the family.

Expected behavior
Consistent definition of "done" across the codebase. (For the specific example described above, it is very important that families no longer receive recruitment emails for "Introduction to Project GARDEN!" and "Journey to the GARDEN Library 1" once they have "done" those studies.)


becky-gilbert commented Dec 4, 2023

Thanks for reporting this issue @msheskin! I agree this is confusing, and we should decide on a single definition for having 'done' or 'participated in' a study.

Just to clarify the definitions:

  • The 'must (not) have participated' study criteria implements its own definition of 'participated', which is that a response object exists (i.e. the participant clicked on the 'schedule/participate now' button). Also, if the study is internal, then the frame sequence must not be empty.
  • The 'hide studies we've done' checkbox hides any studies where the child has a response for the study that contains a consent frame (or if the study is external, has a response).
  • Anything else on the site that checks study completion/participation is probably using the completed flag in the response. This flag is set false initially and becomes true when the participant completes the exit survey. (If the study does not use an exit survey then I think this value is never set to true!).
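To make the discrepancy concrete, the three definitions above can be sketched roughly as follows. This is a minimal illustration in Python; the `Response` fields and function names are assumptions for clarity, not the actual lookit-api schema:

```python
from dataclasses import dataclass, field

@dataclass
class Response:
    """Illustrative stand-in for a study response record (assumed fields)."""
    sequence: list = field(default_factory=list)  # frames shown (internal studies)
    consent_frame_shown: bool = False             # did the session include a consent frame?
    completed: bool = False                       # set True only on exit-survey submission

# 1. 'must (not) have participated' criteria: a response object exists,
#    and for internal studies the frame sequence must be non-empty.
def participated(responses, is_internal):
    if is_internal:
        return any(r.sequence for r in responses)
    return bool(responses)

# 2. 'hide studies we've done': a response that contains a consent frame
#    (for external studies, any response at all).
def done_for_hiding(responses, is_external):
    if is_external:
        return bool(responses)
    return any(r.consent_frame_shown for r in responses)

# 3. Everything else: the response's `completed` flag, which only becomes
#    True when the participant submits the exit survey.
def done_via_completed_flag(responses):
    return any(r.completed for r in responses)
```

For the GARDEN example, a session that clicked "participate" and watched the video (no consent frame, no exit survey) would count as "participated" under definition 1 but not as "done" under definitions 2 or 3, which is exactly the inconsistency at issue.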

There are problems with all of these definitions, but I agree that the discrepancy between them is causing unnecessary confusion and we should try to make them consistent. IMO, we should change the 'must (not) have participated' and 'hide studies we've done' definitions so that they use the response's completed flag.

The reason I think we should move away from the 'started study = participated' definition is because of a pretty common user scenario that is causing confusion:

  1. Parent clicks on 'participate/schedule now'
  2. For whatever reason they do not actually start (or get very far with) the study
  3. Family comes back to the study and finds that they get the red 'ineligible' warning on the page due to prior participation in that study

However, the completed flag is also problematic because:

  1. Responses to studies that do not use the exit survey frame are never marked as completed: true (see #1315, "Participants are sent automatic general recruitment emails for studies they have already completed").
  2. If a participant gets most of the way through a study but does not submit the exit survey frame, their response is still marked as completed: false, even if their response should be considered complete for practical purposes. (And the exact point at which a study should be considered 'complete' will vary across studies).

So if we decide to consistently use the completed flag to determine prior participation, we should also consider ways to improve it:

  • Consider adding a way for researchers to set a specific completion point (frame) in the form/protocol for internal studies, so that the responses are marked as 'complete' based on this rather than on the exit survey.
  • Consider adding something to the UI that allows researchers to manually mark a response session as 'complete' according to their definition (this would also be useful for external studies!).
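Either improvement could come down to checking the response's frame sequence against a researcher-specified completion point. A hypothetical sketch, assuming a new `completion_frame` protocol field that does not exist in lookit-api today:

```python
def response_is_complete(frame_sequence, completion_frame=None):
    """Return True once a response has reached the study's completion point.

    `completion_frame` is an assumed, researcher-configurable protocol field;
    when it is unset, fall back to the current convention of treating only
    exit-survey submission as completion.
    """
    if completion_frame is not None:
        return completion_frame in frame_sequence
    # Fallback: current behavior, where only the exit survey marks completion.
    return any(frame.endswith("exit-survey") for frame in frame_sequence)
```

A manual "mark as complete" action in the researcher UI could then simply set the same completed flag on the response, keeping a single source of truth for internal and external studies alike.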

Any improvements we make to the completed flag would also be useful for automatically capping responses at a certain number, which is a proposed feature (see #772). This feature requires some automated way of counting responses as they come in, and ideally we'd only count responses that are complete (as well as eligible and non-preview).
