@nutterb, I'm starting to come around to your camp on this topic, similar to how your batching occurs inside the function. (Also #51.)

When I was experimenting with eav exports, I concluded the batching has to be written inside that function, specifically before the `type.convert()` call. Even with metadata cues, a variable can still resolve to different data types across subsequent batches. This happens easily with sparse columns, where a variable is entirely `NA` within a batch, but it can also happen when one batch resolves to an integer and another to a double.
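A minimal illustration of the divergence (toy vectors, not redcapAPI code), showing why per-batch conversion is unsafe but converting once after stacking is consistent:

```r
# The same variable, converted per batch, resolves to different classes.
batch_1 <- c("1", "2", "3")                 # only integers appear in this batch
batch_2 <- c("2.5", "3.7")                  # a decimal appears in this batch
batch_3 <- c(NA_character_, NA_character_)  # variable entirely missing in this batch

class(type.convert(batch_1, as.is = TRUE))  # integer
class(type.convert(batch_2, as.is = TRUE))  # numeric
class(type.convert(batch_3, as.is = TRUE))  # logical

# Converting only after stacking all batches yields one consistent type.
stacked <- c(batch_1, batch_2, batch_3)
class(type.convert(stacked, as.is = TRUE))  # numeric
```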
Current ideas after experimenting with the eav playground:

- The eav oneshot function has an optional conversion flag (on by default).
- The batch version calls the oneshot repeatedly with the conversion turned off.
- Only after stacking all the oneshot results is `type.convert()` called.
- The eav function accepts a metadata dataset as a parameter. If it's missing, the function retrieves the metadata itself. Consequently, the metadata API should be called only once for a dataset with k batches.
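The ideas above can be sketched roughly as follows. All names are hypothetical, and `retrieve_metadata()` / `export_eav()` are stubs standing in for real REDCap API calls; this is a sketch of the proposed division of labor, not the package's actual implementation:

```r
# Stubs standing in for real API calls (illustration only).
retrieve_metadata <- function() {
  data.frame(field_name = c("record", "weight"), stringsAsFactors = FALSE)
}
export_eav <- function(records) {
  # A raw EAV export arrives as all-character columns.
  data.frame(record = as.character(records),
             weight = as.character(records * 1.5),
             stringsAsFactors = FALSE)
}

# Oneshot: conversion is optional and on by default.
read_eav_oneshot <- function(records, metadata = NULL, convert = TRUE) {
  if (is.null(metadata)) metadata <- retrieve_metadata()
  d <- export_eav(records)
  if (convert) d[] <- lapply(d, type.convert, as.is = TRUE)
  d
}

# Batch: call the oneshot repeatedly without conversion, stack the
# character results, then call type.convert() exactly once.
read_eav_batched <- function(batches) {
  metadata <- retrieve_metadata()  # metadata API hit once for k batches
  pieces <- lapply(batches, read_eav_oneshot,
                   metadata = metadata, convert = FALSE)
  stacked <- do.call(rbind, pieces)
  stacked[] <- lapply(stacked, type.convert, as.is = TRUE)
  stacked
}
```

Because every column stays character until the final `type.convert()`, a batch that is all `NA` (or all integers) cannot lock in the wrong class for the combined result.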
(Spun off from #138 and #133.)