More linting fix
benlansdell committed Jan 29, 2024
1 parent 1036ee6 commit 2ad4cf0
Showing 59 changed files with 3,529 additions and 3,529 deletions.
2 changes: 1 addition & 1 deletion CODE_OF_CONDUCT.md
@@ -70,4 +70,4 @@ members of the project's leadership.
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

-[homepage]: https://www.contributor-covenant.org
+[homepage]: https://www.contributor-covenant.org
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -13,7 +13,7 @@ The sections below outline the steps in each case.
## You have a question

1. use the search functionality [here](https://github.com/benlansdell/ethome/issues) to see if someone already filed the same issue;
-1. in some cases checking the documentation may help but if your issue search did not yield any relevant results, make a new issue;
+1. in some cases checking the documentation may help but if your issue search did not yield any relevant results, make a new issue;
1. apply the "Question" label; apply other labels when relevant.

## You think you may have found a bug
@@ -37,4 +37,4 @@ The sections below outline the steps in each case.
1. push your feature branch to (your fork of) the Python Template repository on GitHub;
1. create the pull request, e.g. following the instructions [here](https://help.github.com/articles/creating-a-pull-request/).

-In case you feel like you've made a valuable contribution, but you don't know how to write or run tests for it, or how to generate the documentation: don't let this discourage you from making the pull request; we can help you! Just go ahead and submit the pull request, but keep in mind that you might be asked to append additional commits to your pull request.
+In case you feel like you've made a valuable contribution, but you don't know how to write or run tests for it, or how to generate the documentation: don't let this discourage you from making the pull request; we can help you! Just go ahead and submit the pull request, but keep in mind that you might be asked to append additional commits to your pull request.
2 changes: 1 addition & 1 deletion MANIFEST.in
@@ -5,4 +5,4 @@ include ./ethome/data/sample_nwb_.nwb
include ./ethome/data/videos/m3v1mp4.mp4
recursive-include ./ethome/data/dlc *.csv
recursive-include ./ethome/data/boris *.csv
-recursive-include ./ethome/features/pretrained_models *
+recursive-include ./ethome/features/pretrained_models *
6 changes: 3 additions & 3 deletions Makefile
@@ -21,12 +21,12 @@ build:
python -m build

clean:
-rm -rf dist
+rm -rf dist
rm -rf ethome_ml.egg-info

#Upload built package to testpypi repository
deploytest:
-python -m twine upload --repository testpypi --skip-existing dist/*
+python -m twine upload --repository testpypi --skip-existing dist/*
#Then you can test install with:
#python3 -m pip install --index-url https://test.pypi.org/simple/ --no-deps ethome-ml==0.6.0
#or with version number
@@ -36,7 +36,7 @@ deploytest:
deploy:
python -m twine upload dist/*
#Then can install simply with:
-#pip install ethome
+#pip install ethome

#Pointer to demo script for testing/experimenting with functionality
demo:
44 changes: 22 additions & 22 deletions README.md
@@ -5,19 +5,19 @@

# Ethome

-Tools for machine learning of animal behavior.
+Tools for machine learning of animal behavior.

-This library interprets pose-tracking files and behavior annotations to create features, train behavior classifiers, interpolate pose tracking data and other common analysis tasks.
+This library interprets pose-tracking files and behavior annotations to create features, train behavior classifiers, interpolate pose tracking data and other common analysis tasks.

At present pose tracking data from DLC, SLEAP and NWB formats are supported, and behavior annotations from BORIS and NWB formats are supported.

-Full documentation is posted here: [https://benlansdell.github.io/ethome/](https://benlansdell.github.io/ethome/).
+Full documentation is posted here: [https://benlansdell.github.io/ethome/](https://benlansdell.github.io/ethome/).

## Features

* Read in animal pose data and corresponding behavior annotations to make supervised learning easy
* Scale data to desired physical units
-* Interpolate pose data to improve low-confidence predictions
+* Interpolate pose data to improve low-confidence predictions
* Create generic features for analysis and downstream ML tasks
* Create features specifically for mouse resident-intruder setup
* Quickly generate plots and movies with behavior predictions
@@ -28,13 +28,13 @@ Full documentation is posted here: [https://benlansdell.github.io/ethome/](https
pip install ethome-ml
```

-`ethome` has been tested with Python 3.7 and 3.8.
+`ethome` has been tested with Python 3.7 and 3.8.

### Conda environment

-Note that dependencies have tried to be kept to a minimum so that `ethome` can work easily alongside other programs that may be part of your behavior analysis pipeline (e.g. `DeepLabCut`) -- thus you can try running the `pip install` line above in an existing virtual environment.
+Note that dependencies have tried to be kept to a minimum so that `ethome` can work easily alongside other programs that may be part of your behavior analysis pipeline (e.g. `DeepLabCut`) -- thus you can try running the `pip install` line above in an existing virtual environment.

-That said, you may want a separate environment for running `ethome`. A conda environment can be created with the following steps:
+That said, you may want a separate environment for running `ethome`. A conda environment can be created with the following steps:

1. Download the conda environment yaml file [ethome-conda.yaml](www.google.com)
2. (From the location you downloaded the yaml file) Create the environment: `conda env create -f ethome-conda.yaml`
@@ -45,7 +45,7 @@ With both install methods, you may want to also install `tensorflow` if you want

## Quickstart

-It's easiest to start with an NWB file, which has metadata already connected to the pose data.
+It's easiest to start with an NWB file, which has metadata already connected to the pose data.

Import
```python
@@ -70,7 +70,7 @@ dataset.pose.body_parts
A key functionality of `ethome` is the ability to easily create features for machine learning. You can use pre-built featuresets or make your own. For instance:
```python
dataset.features.add('distances')
-```
+```
will compute all distances between all body parts (both between and within animals).

We can load pose data from DLC, and behavior annotation data from BORIS, provided we also provide a little metadata for context. E.g.:
@@ -80,11 +80,11 @@ metadata = create_metadata(pose_files, labels = behavior_files, fps = 30)
dataset = create_dataset(metadata)
```
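Spelled out, the snippet above might look like the following minimal sketch (not part of this diff): the file paths are hypothetical, and the top-level import of `create_metadata` and `create_dataset` from `ethome` is assumed to match the Quickstart's Import step.
```python
# Minimal sketch, not part of this commit. File paths are hypothetical.
from ethome import create_metadata, create_dataset  # import path assumed

pose_files = ["./dlc/video1_tracks.csv", "./dlc/video2_tracks.csv"]              # DLC pose CSVs
behavior_files = ["./boris/video1_annotations.csv", "./boris/video2_annotations.csv"]  # BORIS exports

# Bundle pose files with their annotations and frame rate, then build the dataset
metadata = create_metadata(pose_files, labels=behavior_files, fps=30)
dataset = create_dataset(metadata)
```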

-There are featuresets specifically tailored for social mice studies (the resident-intruder setup). For instance,
+There are featuresets specifically tailored for social mice studies (the resident-intruder setup). For instance,
```python
dataset.features.add('cnn1d_prob')
```
-Uses a pretrained CNN to output probabilities of 3 behaviors (attack, mount, social investigation). For this, you must have labeled your body parts in a certain way (refer to [How To](https://benlansdell.github.io/ethome/how-to/)). Other, more generic, feature creation functions are provided that work for any animal configuration.
+Uses a pretrained CNN to output probabilities of 3 behaviors (attack, mount, social investigation). For this, you must have labeled your body parts in a certain way (refer to [How To](https://benlansdell.github.io/ethome/how-to/)). Other, more generic, feature creation functions are provided that work for any animal configuration.

Now you can access a features table, labels, and groups for learning with `dataset.ml.features, dataset.ml.labels, dataset.ml.group`. From here it's easy to use some ML libraries to train a behavior classifier. For example:
```python
@@ -93,20 +93,20 @@ from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

cv = LeaveOneGroupOut()
model = RandomForestClassifier()
-cross_val_score(model,
-dataset.ml.features,
-dataset.ml.labels,
-groups = dataset.ml.group,
+cross_val_score(model,
+dataset.ml.features,
+dataset.ml.labels,
+groups = dataset.ml.group,
cv = cv)
```

Since the `dataset` object is just an extended Pandas dataframe we can manipulate it as such. E.g. we can add our model predictions to the dataframe:
```python
from sklearn.model_selection import cross_val_predict
-predictions = cross_val_predict(model,
-dataset.ml.features,
-dataset.ml.labels,
-groups = dataset.ml.group,
+predictions = cross_val_predict(model,
+dataset.ml.features,
+dataset.ml.labels,
+groups = dataset.ml.group,
cv = cv)
dataset['prediction'] = predictions
```
@@ -129,7 +129,7 @@ The following animal pose/behavior annotation data formats are supported.

From DLC documentation: The labels are stored in a MultiIndex Pandas Array, which contains the name of the network, body part name, (x, y) label position in pixels, and the likelihood for each frame per body part. These arrays are stored in an efficient Hierarchical Data Format (HDF) in the same directory, where the video is stored. However, if the flag save_as_csv is set to True, the data can also be exported in comma-separated values format (.csv), which in turn can be imported in many programs, such as MATLAB, R, Prism, etc.

-### BORIS
+### BORIS

[Main project page](https://www.boris.unito.it/)

@@ -153,7 +153,7 @@ Refer to `CONTRIBUTING.md` for guidelines on how to contribute to the project, a

## Animal data

-Sample data was obtained from resident-intruder open field recordings performed as part of on going social memory studies performed in the Zakharenko lab at St Jude Children's Research Hospital (e.g. [1,2]). All animal experiments were reviewed and approved by the Institutional Animal Care & Use Committee of St. Jude Children’s Research Hospital.
+Sample data was obtained from resident-intruder open field recordings performed as part of on going social memory studies performed in the Zakharenko lab at St Jude Children's Research Hospital (e.g. [1,2]). All animal experiments were reviewed and approved by the Institutional Animal Care & Use Committee of St. Jude Children’s Research Hospital.

[1] "SCHIZOPHRENIA-RELATED MICRODELETION GENE 2510002D24Rik IS ESSENTIAL FOR SOCIAL MEMORY" US Patent US20220288235A1. Stanislav S. Zakharenko, Prakash DEVARAJU https://patents.google.com/patent/US20220288235A1/en
[2] "A murine model of hnRNPH2-related neurodevelopmental disorder reveals a mechanism for genetic compensation by Hnrnph1". Korff et al. Journal of clinical investigation 133(14). 2023.
[2] "A murine model of hnRNPH2-related neurodevelopmental disorder reveals a mechanism for genetic compensation by Hnrnph1". Korff et al. Journal of clinical investigation 133(14). 2023.
2 changes: 1 addition & 1 deletion codemeta.json
@@ -20,4 +20,4 @@
"license": "MIT",
"title": "ethome",
"version": "v0.4.0"
-}
+}
2 changes: 1 addition & 1 deletion conda/ethome-conda.yaml
@@ -15,4 +15,4 @@ dependencies:
- umap-learn
- dill
- pynwb
-- ipywidgets
+- ipywidgets
10 changes: 5 additions & 5 deletions docs/api-docs/README.md
@@ -5,7 +5,7 @@
## Modules

- [`interpolation`](./interpolation.md#module-interpolation)
-- [`io`](./io.md#module-io): Loading and saving tracking and behavior annotation files
+- [`io`](./io.md#module-io): Loading and saving tracking and behavior annotation files
- [`utils`](./utils.md#module-utils): Small helper utilities
- [`video`](./video.md#module-video): Basic video tracking and behavior class that houses data

@@ -27,22 +27,22 @@
- [`interpolation.interpolate_lowconf_points`](./interpolation.md#function-interpolate_lowconf_points): Interpolate raw tracking points if their probabilities are available.
- [`io.create_behavior_labels`](./io.md#function-create_behavior_labels): Create behavior labels from BORIS exported csv files.
- [`io.get_sample_data`](./io.md#function-get_sample_data): Load a sample dataset of 5 mice social interaction videos. Each video is approx. 5 minutes in duration
-- [`io.get_sample_data_paths_dlcboris`](./io.md#function-get_sample_data_paths_dlcboris): Get path to sample data files provided with package.
+- [`io.get_sample_data_paths_dlcboris`](./io.md#function-get_sample_data_paths_dlcboris): Get path to sample data files provided with package.
- [`io.get_sample_nwb_paths`](./io.md#function-get_sample_nwb_paths): Get path to a sample NWB file with tracking data for testing and dev purposes.
- [`io.load_data`](./io.md#function-load_data): Load an object from a pickle file
- [`io.load_sklearn_model`](./io.md#function-load_sklearn_model): Load sklearn model from file
- [`io.read_DLC_tracks`](./io.md#function-read_dlc_tracks): Read in tracks from DLC.
- [`io.read_NWB_tracks`](./io.md#function-read_nwb_tracks): Read in tracks from NWB PoseEstimiationSeries format (something saved using the DLC2NWB package).
-- [`io.read_boris_annotation`](./io.md#function-read_boris_annotation): Read behavior annotation from BORIS exported csv file.
+- [`io.read_boris_annotation`](./io.md#function-read_boris_annotation): Read behavior annotation from BORIS exported csv file.
- [`io.read_sleap_tracks`](./io.md#function-read_sleap_tracks): Read in tracks from SLEAP.
- [`io.save_DLC_tracks_h5`](./io.md#function-save_dlc_tracks_h5): Save DLC tracks in h5 format.
- [`io.save_sklearn_model`](./io.md#function-save_sklearn_model): Save sklearn model to file
- [`io.uniquifier`](./io.md#function-uniquifier): Return a sequence (e.g. list) with unique elements only, but maintaining original list order
- [`utils.checkFFMPEG`](./utils.md#function-checkffmpeg): Check for ffmpeg dependencies
- [`utils.check_keras`](./utils.md#function-check_keras)
-- [`video.add_randomforest_predictions`](./video.md#function-add_randomforest_predictions): Perform cross validation of a RandomForestClassifier to predict behavior based on
+- [`video.add_randomforest_predictions`](./video.md#function-add_randomforest_predictions): Perform cross validation of a RandomForestClassifier to predict behavior based on
- [`video.create_dataset`](./video.md#function-create_dataset): Creates DataFrame that houses pose-tracking data and behavior annotations, along with relevant metadata, features and behavior annotation labels.
-- [`video.create_metadata`](./video.md#function-create_metadata): Prepare a metadata dictionary for defining a ExperimentDataFrame.
+- [`video.create_metadata`](./video.md#function-create_metadata): Prepare a metadata dictionary for defining a ExperimentDataFrame.
- [`video.get_sample_openfield_data`](./video.md#function-get_sample_openfield_data): Load a sample dataset of 1 mouse in openfield setup. The video is the sample that comes with DLC.
- [`video.load_experiment`](./video.md#function-load_experiment): Load DataFrame from file.

16 changes: 8 additions & 8 deletions docs/api-docs/interpolation.md
@@ -24,22 +24,22 @@ interpolate_lowconf_points(
) → DataFrame
```

-Interpolate raw tracking points if their probabilities are available.
+Interpolate raw tracking points if their probabilities are available.



**Args:**
-- <b>`edf`</b>: pandas DataFrame containing the tracks to interpolate
-- <b>`conf_threshold`</b>: default 0.9. Confidence below which to count as uncertain, and to interpolate its value instead
-- <b>`in_place`</b>: default True. Whether to replace data in place
-- <b>`rolling_window`</b>: default True. Whether to use a rolling window to interpolate
-- <b>`window_size`</b>: default 3. The size of the rolling window to use
+- <b>`edf`</b>: pandas DataFrame containing the tracks to interpolate
+- <b>`conf_threshold`</b>: default 0.9. Confidence below which to count as uncertain, and to interpolate its value instead
+- <b>`in_place`</b>: default True. Whether to replace data in place
+- <b>`rolling_window`</b>: default True. Whether to use a rolling window to interpolate
+- <b>`window_size`</b>: default 3. The size of the rolling window to use



**Returns:**
-Pandas dataframe with the filtered raw columns. Returns None if opted for in_place modification
+Pandas dataframe with the filtered raw columns. Returns None if opted for in_place modification
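A minimal usage sketch of the signature documented above (not part of this diff): the `dataset` variable and the `ethome.interpolation` import path are assumptions, and the argument values simply restate the documented defaults.
```python
# Minimal sketch, not part of this commit.
# Assumes `dataset` holds raw tracking columns plus per-point likelihoods,
# and that the function is importable from ethome.interpolation (path assumed).
from ethome.interpolation import interpolate_lowconf_points

# Interpolate any point whose confidence falls below 0.9, using a 3-frame rolling
# window; with in_place=True the DataFrame is modified and None is returned.
interpolate_lowconf_points(
    dataset,
    conf_threshold=0.9,
    in_place=True,
    rolling_window=True,
    window_size=3,
)
```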



