Solved

Flow Metrics: Dashboard vs. Custom Report



I saw a similar question on this but nothing was really resolved.

I am seeing a discrepancy between flow deliverability metrics in my Dashboard and a Custom Report for the same time period. 

The Open Rate in my Flows Performance Widget in my Dashboard is 25.7%, while the Open Rate from an Email Deliverability by Flow custom report is 30.85% - a gap of roughly 5 percentage points, or almost 20% in relative terms. The number of emails sent is the same in both, so they should be looking at the same underlying data.

If I pull the same information for Campaigns, it matches up, so I’m not sure why Flows is different.

Excluding archived flows does not change any numbers.

Any suggestions? Thanks in advance!

Best answer by retention (see the full reply below).

4 replies

MANSIR2094
Problem Solver IV
  • 175 replies
  • January 9, 2025

Hello ​@schwarze.cass 

The discrepancy might be due to how the metrics are calculated in the dashboard vs. the custom report. Dashboards often show aggregated metrics based on specific filters or time zones, while custom reports may have different calculation methods or data handling.

Double-check the following:

  • Ensure both the dashboard and the report are using the same time zone.
  • Verify if the custom report includes unique opens or other advanced filtering that may affect percentages.
  • Confirm that all flows, including paused or archived ones, are accounted for consistently in both data sources.

If this doesn't resolve it or seems too complex, feel free to reach out for tailored assistance with setup and review!

 


retention
Partner - Platinum
  • 2025 Champion
  • 920 replies
  • Answer
  • January 10, 2025

@schwarze.cass - agree with @MANSIR2094, the performance can vary because the custom report could be calculated on a timeframe of daily, weekly, or the entire range - so think of it this way:

If it's daily or weekly, the open rate is calculated for each day or week and then becomes a weighted average over the period. If it's over the entire range, an open counts whenever it happens within that range.

Here's an illustrative example. Say a flow message was sent to 100 people on day one.

  • 20 people open it on day one.
  • 10 people open it on day two.
  • 10 people open it in the second week after it was sent.

Here's how the open rates differ based on the timeframe:

  • If you do daily, the open rate on day one is 20%, and the later opens aren't counted against subsequent days since nothing was sent on those days.
  • If you do weekly, the open rate is calculated within the first week, so it's 30% (30 people opened during the week the message was sent), and opens in the second week aren't counted.
  • If you use the entire range, the open rate is 40% (assuming you picked a range wide enough to include all 40 opens), since every open falls inside the range.

So as you can see, the data can vary depending on how you choose the timeframe (and the time zone, etc., as mentioned earlier). It's hard to line the numbers up unless all of those variables line up as well.
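To make the bucketing concrete, here's a minimal Python sketch of the arithmetic described above. It's an illustration only, not how Klaviyo actually computes its reports; the dates and counts are just the hypothetical example numbers:

```python
from datetime import date

# Hypothetical example: one flow message sent to 100 recipients on day one,
# with 40 opens spread over the following two weeks.
send_day = date(2025, 1, 1)
sent_count = 100
opens = (
    [date(2025, 1, 1)] * 20    # 20 opens on day one
    + [date(2025, 1, 2)] * 10  # 10 opens on day two
    + [date(2025, 1, 9)] * 10  # 10 opens in the second week
)

# "Daily": only opens on the send date are attributed to that day's sends.
daily_opens = sum(1 for d in opens if d == send_day)
print(f"Daily open rate:        {daily_opens / sent_count:.0%}")   # 20%

# "Weekly": opens within the first seven days of the send count toward that week.
weekly_opens = sum(1 for d in opens if (d - send_day).days < 7)
print(f"Weekly open rate:       {weekly_opens / sent_count:.0%}")  # 30%

# "Entire range": any open inside the selected reporting window counts,
# assuming the window is wide enough to cover all 40 opens.
range_opens = len(opens)
print(f"Entire-range open rate: {range_opens / sent_count:.0%}")   # 40%
```

Same sends, same opens - three different open rates depending purely on how the timeframe slices the data.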

 


Timmy Solomon
Problem Solver III
  • 19 replies
  • January 10, 2025

Hi @schwarze.cass,

This discrepancy between the Flow Performance Widget and the Custom Report might be due to the way Klaviyo calculates metrics for flows versus campaigns. Here are a few points to consider:

  1. Attribution windows: The Flow Performance Widget might use a default attribution window, whereas the Custom Report could be using a different one. Verify that the same attribution window is applied in both cases.
  2. Metrics breakdown: Check whether the Custom Report is using unique opens versus total opens. The Flow Performance Widget typically defaults to unique opens, while reports can sometimes show cumulative data (see the short sketch after this list).
  3. Excluded data: Ensure that both the widget and the report include or exclude the same data, such as skipped recipients or suppressed profiles.
  4. Time zone settings: Confirm that both the widget and the report are aligned to the same time zone, as mismatches can affect reporting for flows that span multiple days.
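On point 2, here's a quick hypothetical sketch of why unique versus total (cumulative) opens can move the percentage when some recipients open the same email more than once - illustrative only, not Klaviyo's actual logic:

```python
# Hypothetical open events keyed by recipient ID; "a" and "c" opened more than once.
open_events = ["a", "a", "b", "c", "c", "c", "d"]
sent_count = 20

total_open_rate = len(open_events) / sent_count         # 7 / 20 = 35%
unique_open_rate = len(set(open_events)) / sent_count   # 4 / 20 = 20%

print(f"Total opens rate:  {total_open_rate:.0%}")   # 35%
print(f"Unique opens rate: {unique_open_rate:.0%}")  # 20%
```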

Since the sent email count matches, the discrepancy likely lies in how the open rate is calculated or filtered. If these steps don’t resolve the issue, it might help to reach out to Klaviyo support with screenshots of both the Flow Performance Widget and the Custom Report for deeper analysis.

Let us know if this helps or if you need further assistance!


Byrne C
Community Manager
  • 72 replies
  • January 10, 2025

Hey ​@schwarze.cass!

Have you determined the reason for this discrepancy yet?

I’d recommend checking the date range of your dashboard and the date range of your custom report. Are they both set to the same dates, in the same year? Discrepancies between the date ranges can often cause issues such as this.

Let me know if this helps, or if I can answer any follow up questions.

-Byrne