
Input trajectories' coordinate system, and necessity to scale coordinates when working on another dataset? #30

PFery4 opened this issue Jun 29, 2023 · 0 comments


Hey,

I would like to know something about the format of the input trajectories as they are provided directly within the .txt files. Is any kind of coordinate preprocessing performed when generating those files? Specifically, is there any scaling applied to ensure reasonably similar scales across different datasets?

I have plotted the coordinates for some of the extracted .txt files (from within the preprocessed eth_ucy files).

[Three plots of the raw x–y coordinates extracted from the preprocessed eth_ucy .txt files]

From the looks of it, the coordinates seem to be expressed in meters, and no centering is applied at this stage. I recognise that centering here would be unnecessary anyway, since it is eventually done in the set_data method of the dataloader:

if scene_orig_all_past:
    self.data['scene_orig'] = self.data['pre_motion'].view(-1, 2).mean(dim=0)
else:
    self.data['scene_orig'] = self.data['pre_motion'][-1].mean(dim=0)
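For reference, the two centering branches above can be sketched in plain NumPy (a minimal sketch; the (timesteps, agents, 2) tensor layout is an assumption made for this illustration, not taken from the repository):

```python
import numpy as np

# Toy stand-in for self.data['pre_motion']: 4 past timesteps, 3 agents, (x, y).
rng = np.random.default_rng(0)
pre_motion = rng.normal(size=(4, 3, 2))

scene_orig_all_past = True
if scene_orig_all_past:
    # Mean over every past position of every agent,
    # mirroring pre_motion.view(-1, 2).mean(dim=0).
    scene_orig = pre_motion.reshape(-1, 2).mean(axis=0)
else:
    # Mean over the last observed position of each agent only,
    # mirroring pre_motion[-1].mean(dim=0).
    scene_orig = pre_motion[-1].mean(axis=0)

# Subtracting scene_orig makes the input translation-invariant,
# which is why no centering is needed in the .txt files themselves.
centered = pre_motion - scene_orig
```

With the all-past branch, the centered coordinates have zero mean by construction, regardless of where the scene sits in the original coordinate frame.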

The reason I ask about scaling conventions is that I would like to apply the model to another dataset, one that expresses its coordinates in pixels rather than meters.

From my current understanding, there is no need to apply any coordinate scaling here: the model will simply adapt its weights to account for a more widely or narrowly "stretched" version of the input data. However, I also found that the preprocessor does apply some scaling to the input trajectory data:

found_data = past_data[past_data[:, 1] == identity].squeeze()[[self.xind, self.zind]] / self.past_traj_scale

found_data = fut_data[fut_data[:, 1] == identity].squeeze()[[self.xind, self.zind]] / self.traj_scale
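To make the question concrete, here is what that division would do to pixel-based input (a hedged sketch: the 10 px/m calibration value and all variable names are assumptions for illustration, not values from the repository):

```python
import numpy as np

# Hypothetical calibration for a pixel-based dataset: 10 pixels per metre.
PIXELS_PER_METRE = 10.0
# If traj_scale were chosen as the pixels-per-metre factor,
# the divided coordinates would land on a metre-like scale.
traj_scale = PIXELS_PER_METRE

pixel_traj = np.array([[120.0, 80.0],
                       [130.0, 85.0],
                       [140.0, 90.0]])  # (x, y) positions in pixels

scaled_traj = pixel_traj / traj_scale  # same order of magnitude as eth_ucy data
```

So the practical question is whether traj_scale is meant to absorb exactly this kind of unit conversion, or whether it serves some other numerical purpose.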

Do I need to set this scaling factor to a particular value for the dataset I intend to use? And what is the purpose of this scaling factor here?

Thank you very much for your time and your work on the AgentFormer model!
