Question
Depending on the data, I can have different granularity in the same column chart. My use case is that my data source sometimes returns the data in hourly mode, daily mode, or 15-minute mode.
So I have two questions:
Is there any way to make the chart auto-discover the granularity of the data? That is, auto-size the columns depending on the amount of data and the period requested. I haven't found this.
The workaround is to always treat the data as "hourly", which is OK for me. The problem comes with the 15-minute data, where the downsampling takes the biggest value within the hour. Is there any way to customize this aggregation? Could I configure it to be a sum, an average, or another mathematical function?
Here is an example where aggregating daily samples by week represents only the biggest value in each period: https://jsfiddle.net/1mtsLyak/1/
Is there any way to make the chart auto-discover the granularity of the data?
No, I'm afraid there's no built-in function to discover the granularity of the data.
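That said, since the possible granularities are known in advance (15-minute, hourly, daily), one option is to infer the grid step yourself from the timestamp deltas before configuring the chart. A minimal sketch in plain JavaScript, assuming data items of the shape `{ date: Date, value: Number }` sorted by date (the `detectGranularity` helper and its `{ timeUnit, count }` return shape are hypothetical conventions, not library API):

```js
// Hypothetical helper, not part of any charting library: infer granularity
// from the smallest gap between consecutive timestamps.
// Assumes `data` is sorted ascending by date.
function detectGranularity(data) {
  var minGapMs = Infinity;
  for (var i = 1; i < data.length; i++) {
    var gap = data[i].date.getTime() - data[i - 1].date.getTime();
    if (gap > 0 && gap < minGapMs) {
      minGapMs = gap;
    }
  }
  var MINUTE = 60 * 1000;
  if (minGapMs <= 15 * MINUTE) return { timeUnit: "minute", count: 15 }; // 15-minute mode
  if (minGapMs <= 60 * MINUTE) return { timeUnit: "hour", count: 1 };    // hourly mode
  return { timeUnit: "day", count: 1 };                                  // daily mode
}
```

The detected interval can then be fed into the axis configuration (for example, as the forced grouping interval) before the chart is built.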
Is there any way to customize this aggregation? Could I configure it to be a sum, an average, or another mathematical function?
Have you tried enabling data item grouping? I mean forcing hourly grouping, so that all 15-minute data items are aggregated into a single data point as per your rules.
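For reference, here is a sketch of what that could look like, assuming this is amCharts 4, where "data item grouping" corresponds to the DateAxis data-grouping feature (the container id "chartdiv" and the data field names "date"/"value" are placeholders, not from the original issue):

```js
// Sketch assuming amCharts 4 (recent versions support dateAxis.groupInterval).
var chart = am4core.create("chartdiv", am4charts.XYChart);

var dateAxis = chart.xAxes.push(new am4charts.DateAxis());
dateAxis.groupData = true;                               // enable data item grouping
dateAxis.groupInterval = { timeUnit: "hour", count: 1 }; // force hourly buckets

var valueAxis = chart.yAxes.push(new am4charts.ValueAxis());

var series = chart.series.push(new am4charts.ColumnSeries());
series.dataFields.dateX = "date";
series.dataFields.valueY = "value";

// Choose how grouped values are aggregated: "sum", "average",
// "high", "low", "open", or "close".
series.groupFields.valueY = "sum";
```

With `groupFields.valueY = "sum"` the 15-minute values within each hour are added together, instead of the grouping keeping only a single representative value per bucket.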