
Distance object Color not updating on change #45

Open
Birdasaur opened this issue Sep 5, 2023 · 11 comments

@Birdasaur
Collaborator

Related to #35

After a Distance object has been created, the color of the 3D line that represents it can be changed with a color picker. However, when the color value in the control is altered, the Distance object's color is not immediately updated. When a different Distance object is selected in the list view, the previously chosen color is then rendered. This implies that either the distance change event is not being fired, the listeners are not hooked up, or the global map of Distance objects is not being updated in a timely manner relative to the event handling.

This behavior has been observed in the Projections 3D pane but may also be present in the Hyperspace3D pane.
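A minimal sketch of the suspected wiring problem, using hypothetical names and a stripped-down stand-in for JavaFX's ObjectProperty (the actual Trinity classes are not shown in this thread): the color-picker listener must update the live 3D line as well as the global map, otherwise the new color only appears on the next redraw, e.g. when reselecting in the list view.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class DistanceColorSync {
    // Stand-in for JavaFX's ObjectProperty<Color>: holds a value, notifies listeners on change.
    static class Property<T> {
        private T value;
        private final List<Consumer<T>> listeners = new ArrayList<>();
        void addListener(Consumer<T> l) { listeners.add(l); }
        void set(T v) { value = v; listeners.forEach(l -> l.accept(v)); }
    }

    // Stand-in for a Distance object: mapColor is the entry in the global map,
    // renderedColor is what the 3D line actually shows on screen.
    static class Distance {
        String mapColor;
        String renderedColor;
    }

    static final Map<String, Distance> distanceMap = new HashMap<>();

    public static void main(String[] args) {
        Distance d = new Distance();
        distanceMap.put("d1", d);

        Property<String> colorPicker = new Property<>();
        colorPicker.addListener(newColor -> {
            Distance target = distanceMap.get("d1");
            target.mapColor = newColor;      // updating only the map reproduces the bug:
            target.renderedColor = newColor; // the rendered line must be refreshed too
        });

        colorPicker.set("RED");
        System.out.println(d.renderedColor); // RED: visible immediately, not on reselection
    }
}
```

If the listener only writes the map entry, the stale renderedColor persists until whatever code path reads the map next (here, the list-view selection), which matches the observed behavior.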

@czp13

czp13 commented Oct 9, 2023

Could you share a file and/or steps to reproduce the issue, @Birdasaur? It would help me fix this faster without having to dig deep and piece together all the context myself. Appreciated if you can help! 🙇

@Birdasaur
Collaborator Author

Ah, this brings up a great point... we don't have any files for testing!
We need to upload a few different files to facilitate reproduction. I will do so, but I will need a little time to determine which data files are good representative examples yet not sensitive (so I can share them).

@czp13

czp13 commented Oct 12, 2023

Oh, I see! Thank you for your response.
Yes, indeed it is much easier to fix a bug if you can actually see/reproduce it :D.
And I fully understand the concern and the problem with the data. Take your time; I have also had busy days/weeks with Devoxx and family/work stuff. I will take a second look at this issue once we have some files/data to spot the problem.

@Birdasaur
Collaborator Author

@samypr100 Does it make sense here to create a branch for this issue and add a "sample" folder or something like that?

@samypr100
Member

Agreed

@samypr100
Member

I'd name it example_data

@Birdasaur
Collaborator Author

https://github.com/Birdasaur/Trinity/tree/45-distance-object-color-not-updating-on-change

Added two data files (human vs chatgpt) and a label coloration configuration file.

@Birdasaur
Collaborator Author

Adding these files, while just JSON, adds over 50 MB to the repo. I didn't stop to think about that.
Is this the right thing to do?

@czp13

czp13 commented Oct 17, 2023

@Birdasaur: Thank you very much for working on this! Much appreciated!

I'm with you on this. I think GitHub has a 100 MB limit for larger files; after that you need to use Git LFS.
But 20-50 MB files are also not nice to add. Usually, from what I have seen and how we've resolved this at different companies, there are a couple of options:

  • Using a Google Drive link (we can add the dataset link to the README.md).
  • Potentially, we can upload it to an S3 bucket or another storage solution.
  • I vaguely remember that GitHub doesn't handle larger files well, which can make the project harder to clone. Also, any change to a large file increases the repo's size significantly, since Git stores every version (at least as far as I know). So it's better to keep it as small as possible.

Another idea is that you could narrow the dataset down to only what's relevant to a specific problem. However, in that case, we might need to do that every time, for each bug.

On the other hand, we can't write unit tests without a test dataset to verify if the problem is fixed and stays fixed in the long run.

So yeah, many thoughts :D, but in short, I am fine if you add it to Google Drive or anywhere else but GitHub. And it would still be nice to have it covered with a unit test, I feel.
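As a concrete sketch of the Git LFS route mentioned above (assuming the example_data folder name proposed earlier in this thread), tracking the sample JSON files would be a one-line .gitattributes entry:

```
example_data/*.json filter=lfs diff=lfs merge=lfs -text
```

This line is typically generated by running `git lfs track "example_data/*.json"`; LFS then commits small pointer files to the repo and stores the actual payload on the LFS server, so clones stay light.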

@samypr100
Member

samypr100 commented Oct 17, 2023

Might be easier to provide instructions on how to generate the data instead. Although if you already committed and pushed, the size increase is now permanent in the history, I think.

@czp13

czp13 commented Oct 19, 2023

@samypr100 I do not see any recent push or PR containing the big files.

@Birdasaur and @samypr100:

I think 50 MB is okay; GitHub should be able to handle it nicely. The downsides are the ones I listed earlier (harder cloning, permanent growth of the repo history).

Also, if you reset and remove the commit fully from the Git commit history, then run some kind of GitHub magic, or eventually garbage collection, the files will be removed completely and the space reclaimed (see the documentation on how to completely remove data from Git).

All in all, it is not the end of the world to commit 50 MB, but in the long run, for frequently used open-source libs/projects, it is not optimal. So I suggest the previous options (Google Drive or other storage; generating the data also seems a good idea, but covering the issues that way seems a bit more complex to me...).
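For completeness, a hedged sketch of the history-removal step mentioned above, using the built-in git filter-branch (git filter-repo or the BFG are faster alternatives, but must be installed separately); the example_data/*.json path is this thread's proposed sample folder, so adjust it to the actual files:

```shell
# Rewrite all refs so the large JSON files never existed in history.
# WARNING: this rewrites history; coordinate with collaborators first.
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch example_data/*.json' \
  --prune-empty --tag-name-filter cat -- --all

# Drop the backup refs and reclaim the space locally.
rm -rf .git/refs/original/
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

A force-push is then required, and every existing clone keeps the old objects until it is re-cloned or cleaned the same way, which is why doing this early (or avoiding the large commit entirely) is preferable.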
