feat: component wise digital clustering #4142
Conversation
Walkthrough: Configurations were modified across the C++ and Python examples. In GeometricConfig, the documentation was improved and a member variable was replaced. The digitization algorithm now handles weights and channel indices with per-direction (two-dimensional) arrays. In digitization.py, the particle multiplicity was reduced, a uniform eta distribution was added, and the digitization config file is now passed to the example's entry point.
Do I understand correctly that the digital clustering was weighting x and y against each other, which leads to funky positions if the pitches are not symmetric?
In any case could you elaborate a bit more on what the issue is and what the fix does?
Hi @paulgessinger, here's the example: imagine you do a simple weighted mean. On the left side you have two pixels/strips in the upper y row, so the mean weights the upper position with weight 2 and the lower one with weight 1, and hence pulls you to a too-high value in y. In digital clustering the weight of every hit cell is one, hence you get this problem. The second version is clearly the better estimate in this case, so what I introduced is an independent weight per direction. In the adapted version, blue is ignored, actually. But there are situations where it is only ignored in one direction. (Took me embarrassingly long to understand that.)
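To make this concrete, here is a minimal numerical sketch (illustration only, not the Acts implementation); it assumes the component-wise scheme averages the distinct occupied channel bins independently in each direction, so a channel covered by two cells still counts only once per direction.

# Minimal numerical sketch of the problem discussed above -- illustration only,
# not the Acts implementation. Assumption: the component-wise scheme averages
# the distinct occupied channel bins independently in each direction.
import numpy as np

# Hit cells of one cluster, given as (channel0, channel1) centers:
# two cells in the upper row, one in the lower row.
cells = np.array([
    [0.0, 1.0],  # upper row, left
    [1.0, 1.0],  # upper row, right
    [0.0, 0.0],  # lower row
])

# Naive digital mean: every hit cell enters with weight 1 in both directions,
# so the doubly occupied upper row pulls the cluster position up in y.
naive = cells.mean(axis=0)
print("naive     :", naive)        # -> [0.333 0.667]

# Component-wise digital mean: each direction is averaged over its distinct
# occupied channels, so the duplicated y channel counts only once.
component = np.array([np.unique(cells[:, 0]).mean(),
                      np.unique(cells[:, 1]).mean()])
print("component :", component)    # -> [0.5 0.5]

Under this assumption the position in one direction no longer depends on how many cells happen to share a channel in the other direction, which removes the asymmetric pull described above.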
Still need to fix issues and the CI failures.
Thanks @asalzburger! That clears it up.
ExaTrkX python tests are failing ... but I have no idea why?
@asalzburger Indeed very strange. Could be a transient error, let's rerun.
Restarted, let's see - it was pretty consistent on this PR though.
Yeah, it reappeared. I don't see it on other PRs at this time though. @benjaminhuth could the change in the clustering trigger a SEGFAULT in pytorch?
Test if this fixes ExaTrkX
Actionable comments posted: 0
🧹 Nitpick comments (1)
Examples/Scripts/Python/digitization.py (1)
Line 44: The multiplicity has been reduced from 2 to 1, so fewer particles are generated. Simpler input is preferable for testing component-wise digital clustering, but the reason for the change is not documented. A comment explaining the multiplicity reduction would be helpful:
- multiplicity=1,
+ multiplicity=1,  # Reduced to simplify component-wise digital clustering testing
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
Examples/Scripts/Python/digitization.py (2 hunks)
🧰 Additional context used
🧬 Code Definitions (1)
Examples/Scripts/Python/digitization.py (1)
Examples/Python/tests/test_examples.py (1)
field (39-40)
⏰ Context from checks skipped due to timeout of 90000ms (12)
- GitHub Check: CI Bridge / build_exatrkx
- GitHub Check: CI Bridge / build_exatrkx_cpu
- GitHub Check: CI Bridge / build_gnn_tensorrt
- GitHub Check: merge-sentinel
- GitHub Check: CI Bridge / clang_tidy
- GitHub Check: linux_ubuntu
- GitHub Check: macos
- GitHub Check: linux_ubuntu_extra (ubuntu2404_clang19, 20, clang++-19)
- GitHub Check: linux_ubuntu_extra (ubuntu2204, 20, g++)
- GitHub Check: unused_files
- GitHub Check: missing_includes
- GitHub Check: build_debug
🔇 Additional comments (2)
Examples/Scripts/Python/digitization.py (2)
Line 41: A uniform eta distribution has been added. This is a sensible change: a uniform distribution is needed to reach the endcaps with sufficient statistics, and it matches the PR objectives.
Lines 94-96: The function call has been updated with the digiConfigFile parameter. This change is necessary to match the new function signature and is consistent with the parameter added in the function definition.
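For illustration, a hedged sketch of the calling pattern this comment refers to; only digiConfigFile is taken from the review above, while the function name, the other parameters and the config path are assumptions rather than the actual signature in digitization.py.

from pathlib import Path

# Sketch only -- names other than digiConfigFile are illustrative assumptions,
# not the exact signature in Examples/Scripts/Python/digitization.py.
def runDigitization(trackingGeometry, field, outputDir: Path, digiConfigFile: Path):
    # ... set up the sequencer, particle gun, simulation and digitization here,
    # reading the per-detector digitization settings from digiConfigFile ...
    print(f"digitizing with config {digiConfigFile} into {outputDir}")

if __name__ == "__main__":
    runDigitization(
        trackingGeometry=None,                    # placeholder for the example geometry
        field=None,                               # placeholder for the magnetic field
        outputDir=Path.cwd(),
        digiConfigFile=Path("digi-config.json"),  # placeholder config path
    )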
CI debugging
reverting changes to digitization.py totally
@paulgessinger - the issue with CI is resolved, can you reapprove?
This PR introduces component-wise digital clustering (as a default for digital).
It removes the problem of wrong weighting.
The PR also fixes the issue that thresholds have not been taken into account correctly for digital clustering.
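As a hedged illustration of the threshold part of the fix (hypothetical helper, not the Acts API): cells below the activation threshold should be dropped before the digital cluster position is formed, otherwise they enter the mean with full weight.

# Illustration only -- hypothetical helper, not the Acts API. Cells below the
# activation threshold are removed before the digital cluster mean is formed.
def apply_threshold(cells, threshold):
    """cells: list of (channel0, channel1, activation); keep cells above threshold."""
    return [(c0, c1) for (c0, c1, act) in cells if act > threshold]

cells = [(0, 1, 0.9), (1, 1, 0.8), (0, 0, 0.03)]
kept = apply_threshold(cells, threshold=0.1)
print(kept)  # the 0.03-activation cell no longer biases the cluster position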
Before/after comparison plots are attached to the PR.
--- END COMMIT MESSAGE ---