
Use Monte-Carlo simulations for frame unwrapping and WFM #163

Merged · 50 commits merged into main from unwrap-tof on Jan 23, 2025
Conversation

@nvaytet (Member) commented on Jan 16, 2025

Moving frame unwrapping and WFM processing from scippneutron to essreduce

In this PR, we move away from analytically computing frame boundaries and fitting the polygons with single straight lines for frame unwrapping and WFM.

This method typically yielded errors towards the edges of the frames, where the polygon fit was less accurate (the lines would in some cases fall outside the polygons). Using a 3rd-order fit helped, but was still not optimal.

In this work, we instead use the tof package to illuminate the chopper cascade with a pulse of neutrons.
We then propagate the neutrons that exit the cascade to the distances of the pixels (this is just a single broadcast operation) and build a lookup table that gives us wavelength (or time-of-flight) as a function of distance and time-of-arrival.
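The propagate-and-bin step can be sketched with plain NumPy. This is a minimal, hypothetical stand-in for the actual tof/scipp implementation (all names and numbers are illustrative): neutrons with known wavelengths are broadcast to a range of pixel distances, and a table of mean wavelength per (distance, time-of-arrival) bin is built.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical neutrons exiting the chopper cascade: each has a wavelength
# and a birth time (values are illustrative only).
n = 10_000
wavelength = rng.uniform(1.0, 10.0, n)      # angstrom
speed = 3956.0 / wavelength                 # m/s, from v = h / (m_n * lambda)
birth_time = rng.uniform(0.0, 3.0e-3, n)    # s

# Propagate to pixel distances in a single broadcast: time-of-arrival
# at every distance for every neutron.
distances = np.linspace(20.0, 80.0, 61)     # m, shape (61,)
toa = birth_time + distances[:, None] / speed  # shape (61, n)

# Lookup table: mean wavelength per (distance, time-of-arrival) bin.
toa_edges = np.linspace(toa.min(), toa.max(), 201)
lut = np.empty((len(distances), len(toa_edges) - 1))
for i in range(len(distances)):
    wsum, _ = np.histogram(toa[i], bins=toa_edges, weights=wavelength)
    count, _ = np.histogram(toa[i], bins=toa_edges)
    # Empty bins become NaN rather than dividing by zero.
    lut[i] = np.divide(wsum, count, out=np.full_like(wsum, np.nan), where=count > 0)
```

In the real workflow the table is built with scipp binning on the output of the tof simulation; the sketch above only shows the shape of the computation.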

The results are better than with the previous method (tested on a DREAM simulation), and the code is simpler.
The computation is somewhat more expensive than before, but for now this is not very noticeable.

More importantly, this opens the door to other simulation methods. Instead of tof, one could in the future use McStas to make a much more realistic simulation of neutrons travelling through the cascade, including effects like non-instantaneous opening of choppers or neutron reflections along the beam guide.

See the frame-unwrapping and wfm-time-of-flight notebooks for a description of how the algorithm works.

Comparison with previous version:

Plots of dspacing vs 2theta:

  • true: using the true wavelengths of the neutrons from the simulation data
  • from_tof: the current version implemented in this PR
  • from_main: the version on scippneutron/main using linear fit to polygons
  • from_cubic: an (unpublished) improvement on scippneutron/main using a cubic fit to polygons (see https://github.com/scipp/scippneutron/tree/cubic-polygons)

[Screenshot: dspacing vs 2theta comparison plots]

1D dspacing plot:

[Figures: 1D dspacing comparison plots]

@jl-wynen (Member) left a comment:

I only looked at high level structure as I am unfamiliar with the algorithmic details.

radius=ch.radius,
)
for name, ch in wfm_choppers.items()
}
Member:

The module-level choppers only seem to be used in tests. Do they need to be in this module or can they be moved into the test module?

Member Author:

We could move them, but I would say those choppers are meant to be used with the FakeBeamline, so I would leave them there?

Member:

If they are useful outside of ESSreduce / ScippNeutron, yes. But can you actually use them in any other package?

Member Author:

I guess something that can create fake beamline data that looks like semi-realistic NeXus data could be used by other packages for testing? (even though I don't have anything specific in mind right now)

- Monitor event data including event_time_offset and event_time_zero
"""

# from __future__ import annotations
Member:

Suggested change: delete the commented-out line `# from __future__ import annotations`.

parameters).
"""
facilities = {"ess": sc.scalar(14.0, unit="Hz")}
return PulsePeriod(1.0 / facilities[facility])
Member:

I made an attempt to extract source information and functions into a class in ESSspectroscopy in scipp/essspectroscopy@ff52978
Does it make sense to have something like this in a separate module in ESSreduce?

Member Author:

Keeping a database of such source and instrument data would be handy: e.g. the wavelength range of the source, or a rough range of distances covered by Ltotal for each instrument.
But maybe we can delay that to another PR?

Member:

But maybe we can delay that to another PR?

Definitely. Just wanted to let you know about it.

Raw detector data loaded from a NeXus file, e.g., NXdetector containing
NXevent_data.
"""
return Ltotal[RunType](da.coords["Ltotal"])
Member:

Similarly to the source, does it make sense to have a Beamline class / module to encode common data? I am also thinking of metadata like the beamline's name, facility, etc, which would be useful for writing output files. See also scipp/scippneutron#473

@jokasimr (Contributor) left a comment:

Looks good to me!

)

# TODO: move toa resolution to workflow parameter
binned = data.bin(distance=ndist, toa=500)
Contributor:

Is there a reason why we are using distance=ndist here instead of distance=distances? (Not that I think it matters, just wondering if there's something subtle going on here that I'm missing.)

Member Author:

Thanks for pointing this out. I refactored the distances.
There is a bit of funny stuff going on with bin edges and midpoints when binning the data and then sending it to the bilinear interpolator, but I think I got it right now.
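The edges-vs-midpoints subtlety mentioned above can be illustrated with a tiny NumPy sketch (hypothetical values; not the PR's code): values binned onto a set of edges conceptually live at the bin midpoints, so an interpolator built from the table must use the midpoints as its grid coordinates.

```python
import numpy as np

# Hypothetical bin edges along the distance axis of the lookup table.
edges = np.linspace(20.0, 80.0, 7)          # 7 edges -> 6 bins
midpoints = 0.5 * (edges[:-1] + edges[1:])  # where the binned means actually live

# A bilinear interpolator built on `edges` instead of `midpoints` would
# shift every table value by half a bin.
```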

timeofflight = (sc.midpoints(wavelength.coords["distance"])) / velocity
out = timeofflight.to(unit=time_unit, copy=False)
# Include the variances computed above
out.variances = variance.values
Contributor:

Aren't the variances the variance of the wavelength?
It seems strange to set the time-of-flight variance equal to the wavelength variance.

Member Author:

Yes, good point.
I need to compute something dimensionless when applying the threshold, probably the standard deviation (sigma) over the mean (mu)?

I tried that, but for some reason, when computing sigma/mu on the wavelengths I don't get the same values as when computing it on the tofs. I thought they would be the same, because it's only a constant factor of distance * m_n / h everywhere.
What am I missing?

Member Author:

Ok. I figured out what my mistake was. I do get the same for both. I will use that instead.
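The equality discussed above can be checked with a small NumPy sketch (illustrative numbers only): tof is a linear function of wavelength through the origin at fixed distance, so the relative spread sigma/mu is identical for both quantities.

```python
import numpy as np

rng = np.random.default_rng(1)
wavelength = rng.normal(4.0, 0.1, 100_000)  # angstrom, hypothetical spread

# At fixed distance, tof = wavelength * distance * m_n / h: a linear map
# through the origin. `scale` stands in for distance * m_n / h (arbitrary).
scale = 2.528e-4 * 30.0
tof = wavelength * scale

rel_wav = wavelength.std() / wavelength.mean()
rel_tof = tof.std() / tof.mean()
# Both std and mean scale by the same positive factor, so sigma/mu matches.
```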


lookup_values = lookup.data.to(unit=elem_unit(toas), copy=False).values
# Merge all masks into a single mask
if lookup.masks:
Contributor:

Would it make sense to do this in the provider that creates MaskedTimeOfFlightLookup? It seems odd to modify the lookup table in a function that uses it rather than produces it.
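For reference, merging several boolean masks into one can be sketched with plain NumPy and a reduce over logical OR (hypothetical mask names; the PR itself operates on scipp masks):

```python
import numpy as np
from functools import reduce

# Hypothetical per-reason masks on the lookup table (True = masked).
masks = {
    "low_counts": np.array([True, False, False, False]),
    "bad_region": np.array([False, False, True, False]),
}

# Merge all masks into a single boolean mask: an element is masked if any
# individual mask flags it.
merged = reduce(np.logical_or, masks.values())
```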

@nvaytet nvaytet marked this pull request as draft January 17, 2025 16:15
Comment on lines 27 to 31
time_of_arrival:
Time of arrival of the neutrons at the position where the events were recorded.
For a ``tof`` simulation, this is just the position of the component (chopper or
detector) where the events are recorded. For a ``McStas`` simulation, this is
the position of the event monitor.
Member:

Was the second part meant for the distance: field?

Comment on lines 366 to 373
out = da.copy(deep=False)
if out.bins is not None:
parts = out.bins.constituents
out.data = sc.bins(**parts)
parts["data"] = tofs
out.bins.coords["tof"] = _bins_no_validate(**parts)
else:
out.coords["tof"] = tofs
Member:

Can use da.bins.assign_coords and da.assign_coords to avoid some back and forth?

@nvaytet (Member Author) commented Jan 22, 2025:

I changed it to

    if da.bins is not None:
        parts = da.bins.constituents
        parts["data"] = tofs
        out = da.bins.assign_coords(tof=_bins_no_validate(**parts))
    else:
        out = da.assign_coords(tof=tofs)

Seems to work. Is that what you meant?

My assumption is that parts["data"] = tofs does not modify the data inside da. Am I right?

Member:

Yes, parts is just a dict.
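Since `constituents` returns a plain dict, rebinding one of its keys does not touch the underlying data. A minimal stand-in (plain Python, not scipp):

```python
# `original` plays the role of the data inside `da`;
# `parts` plays the role of da.bins.constituents.
original = [1, 2, 3]
parts = {"data": original}

# Rebinding the key points it at a new object; `original` is untouched.
parts["data"] = [10, 20, 30]
```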

Member Author:

Hmm, I need scipp=25.01 to use da.bins.assign_coords :-(

Member:

Seems it has been released now.

@nvaytet nvaytet merged commit ddf196c into main Jan 23, 2025
4 checks passed
@nvaytet nvaytet deleted the unwrap-tof branch January 23, 2025 09:47
4 participants