Categorical NB Food on Fork #104
Conversation
Mostly looks good; a few things should be changed, but after you do that I envision it being ready to merge.
`/food_on_fork` topic. More documentation for the webapp can be found on its repository.

There are a couple models that can be used to determine Food on Fork. They are as follows:
> Note that "frustum" is a 3D space around the fork
Not necessary, but consider linking a page explaining a frustum, e.g., https://en.wikipedia.org/wiki/Frustum
ParameterDescriptor(
    name="max_depth",
    type=ParameterType.PARAMETER_INTEGER,
    description="maximum depth to consider (note that 330 is approx distance "
Suggested change:

    - description="maximum depth to consider (note that 330 is approx distance "
    + description="maximum depth to consider (note that 330mm is approx distance "
ParameterDescriptor(
    name="min_depth",
    type=ParameterType.PARAMETER_INTEGER,
    description="minimum depth to consider (note that 330 is approx distance "
Suggested change:

    - description="minimum depth to consider (note that 330 is approx distance "
    + description="minimum depth to consider (note that 330mm is approx distance "
# Tuning the rectangle
# depth_img_copy = np.copy(depth_img)
# cv.rectangle(depth_img_copy, (317, 238), (445, 322), (255, 0, 0))
If you want to keep this code in the file, please explain what these numbers refer to. E.g., consider making each of them variables that correspond to the parameter names, and then commenting out the whole block of code including those variable assignments.
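For instance, the magic numbers could be lifted into named variables. This is a sketch of the suggested cleanup; the variable names below are illustrative assumptions, not from the PR:

```python
# Sketch: name the rectangle corners instead of passing literals, so they
# can be tied back to the node's rectangle parameters. Names are assumptions.

# Top-left and bottom-right corners (x, y) of the fork's region of
# interest in the depth image, in pixels.
rect_top_left = (317, 238)
rect_bottom_right = (445, 322)

# Debug-only visualization, kept commented out in the node itself:
# depth_img_copy = np.copy(depth_img)
# cv.rectangle(depth_img_copy, rect_top_left, rect_bottom_right, (255, 0, 0))
print(rect_top_left, rect_bottom_right)
```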
    return np.count_nonzero(mask_img)

def predict_food_on_fork(
I don't think this is resolved. You still have two different parameters you pass in here. Instead, you should pass in a single parameter, `x_test`, which is an ndarray of size (1, num_features). Then, this function should just consist of:

    prediction_prob = self.model.predict_proba(x_test)
    return float(prediction_prob[0][1])
print("Running Food On Fork Node")
rclpy.init(args=args)
food_on_fork = FoodOnFork()
executor = MultiThreadedExecutor()
Since you only have one subscriber, you don't need the MultiThreadedExecutor. Please change to the SingleThreadedExecutor.
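A hedged sketch of what the suggested change could look like, assuming the node's `main()` resembles the excerpt above (requires a ROS 2 environment; `FoodOnFork` is the node class from this PR):

```python
import rclpy
from rclpy.executors import SingleThreadedExecutor


def main(args=None):
    print("Running Food On Fork Node")
    rclpy.init(args=args)
    food_on_fork = FoodOnFork()
    # With a single subscriber, callbacks never need to run concurrently,
    # so a SingleThreadedExecutor is sufficient and lighter-weight.
    executor = SingleThreadedExecutor()
    executor.add_node(food_on_fork)
    try:
        executor.spin()
    finally:
        food_on_fork.destroy_node()
        rclpy.shutdown()
```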
# single feature or not
single_feature: False
# testing
test: False
It is convention for every file to end with a newline. Please add one here: https://stackoverflow.com/questions/729692/why-should-text-files-end-with-a-newline
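As an illustrative aside (not part of the PR), checking for a trailing newline programmatically is a one-liner on the file's bytes; the demo file here is created just for the example:

```python
import tempfile
from pathlib import Path


def ends_with_newline(path: Path) -> bool:
    """True if the file's last byte is a newline (or the file is empty-safe)."""
    data = path.read_bytes()
    return data.endswith(b"\n")


# Demo: a file written without a trailing newline fails the check.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".yaml") as f:
    f.write("test: False")
    demo = Path(f.name)

print(ends_with_newline(demo))  # False
```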
if __name__ == "__main__":
    rosbags_abs_location = (
        "/media/atharvak/oldFoodManip/rosbags_aligned_depth_10-17-23/"
5 years from now when someone is revisiting this code, they won't have access to this path to be able to check what the folder structure is.
What would be more helpful is: (a) allowing people to pass in an argument for the paths -- don't have it hardcoded into the file; (b) describe (either in a comment in this file, or in the README) what the expected directory structure is that this file needs (e.g., a folder with subfolders that each correspond to a single rosbag). That way, even if future programmers don't have access to the same dataset, they can recreate a dataset with the same directory structure, and then your code should work.
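Suggestion (a) could look roughly like this sketch using `argparse`; the flag name `--rosbags-dir` and the help text are assumptions for illustration:

```python
import argparse
from pathlib import Path


def parse_args(argv=None):
    """Parse CLI arguments; argv is injectable for testing."""
    parser = argparse.ArgumentParser(
        description="Process Food on Fork rosbags."
    )
    parser.add_argument(
        "--rosbags-dir",
        type=Path,
        required=True,
        help=(
            "Directory containing one subfolder per rosbag, matching the "
            "directory structure described in the README."
        ),
    )
    return parser.parse_args(argv)


# Demo with an explicit argv (a real script would call parse_args() with none):
args = parse_args(["--rosbags-dir", "/tmp/rosbags"])
print(args.rosbags_dir)
```

This keeps the dataset location out of the source file, so future users only need to recreate the documented directory structure.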
if __name__ == "__main__":
    rosbag_access_folder = (
        "/media/atharvak/oldFoodManip/rosbags_aligned_depth_10-17-23/"
Same comment here re. hardcoded paths and lack of transparency re. the expected directory structure.
Why do you need two separate files, one to parse images and the other to crop images? Why not have one script that does both?
Closed with #169
Description

This PR adds a Food on Fork feature node that automatically detects the presence of food on the fork, addressing the linked issue #43. It publishes a confidence value to the `/food_on_fork` topic, representing the confidence with which the model predicts food on the fork.

Design Decisions

The README outlines an overview of the models being used.
Testing procedure

Test the Training Procedure

1. Download the three datasets linked below (into `/ada_feeding_perception/ada_feeding_perception`). Don't unzip the files; the code will unzip and process them at runtime.
2. Navigate to `/ada_feeding_perception/ada_feeding_perception/`.
3. Run `food_on_fork_categorical_naive_bayes_training.py`:
   `python food_on_fork_categorical_naive_bayes_training.py "<boolean>" "<dataset_zip>" --model_save_file "<name>"`
   - The first argument (`boolean`) specifies whether or not to use the entire dataset for training; it also specifies whether or not a model should be saved.
   - The second argument (`dataset_zip`) specifies the location of the zip file you downloaded in the first step. Ideally, store the downloaded zip file in `ada_feeding_perception/ada_feeding_perception`.
   - The third argument (`name`) is optional and depends on the first argument: if the first argument is true, the third argument is required; otherwise it is not. This is the name you want to give the saved model.
4. For example: `python food_on_fork_categorical_naive_bayes_training.py "False" "./train_test_data_no_hand_8-30-23.zip"`, where `<dataset_zip>` specifies the dataset you downloaded.

Test the model on the robot
Now that you have confirmed that you can mimic the training procedure, train a model on the full dataset:
`python food_on_fork_categorical_naive_bayes_training.py "True" "./train_test_data_no_hand_8-30-23.zip" --model_save_file "categorical_naive_bayes_without_hand_8-30-23.pkl"`
This should create a .pkl file, which is the model. Either move this model into the `/ada_feeding_perception/model/` folder or use the model already present in that folder. To use this model, navigate to the `food_on_fork.yaml` file and change the location of the model.

To test this model, follow these steps:

1. Echo the `/food_on_fork` topic (`ros2 topic echo /food_on_fork`). If there is food on the fork, you should see values close to 1.0; if there is no food on the fork, you should see values close to 0.0.
2. Run `ros2 topic echo /food_on_fork` and then test visually by adding various food items onto the fork.

Before opening a pull request
- Run `python3 -m black .`
- In the `ada_feeding` directory, run `pylint --recursive=y --rcfile=.pylintrc .`
.Before Merging
Squash & Merge