Just to be clear, we’re talking about Smart Send Time and not Smart Sending, right?
@jsappington - I can only speak to what we’ve encountered through some of our past client work, but we tend to think of Smart Send Time as just one of the tools for improving campaign email engagement when engagement (opens/clicks) is low or declining.
- Because it does require a larger List (12,000 recipients or more), by definition this is something we try only when a merchant has a sizable List with enough confirmed subscribers.
- With a bit of intuition mixed in with the data science here, we suggest trying Smart Send Time when the merchant has a strongly bifurcated “coastal” audience or an “international” audience (especially if they don’t split their marketing by region).
- Some email campaigns are naturally better suited to Smart Send Time - specifically ones that are not time sensitive! Better yet, if there’s an evergreen campaign that is timeless (e.g. staple products, educational content), you can repeat/retest the same campaign in the future with another cohort of subscribers to keep things as consistent as possible.
- Definitely some intuition or “gut” instinct goes along with the test - but the data might surprise you. In one instance I heard of, the optimal time was completely unexpected (very late in the evening, whereas the brand typically sends in the AM); the brand started sending a few subsequent campaigns at the new optimal time (evenings) and it worked out well!
@jsappington @retention Sorry I’m a little late to this Smart Send Time party, but I figured I’d chime in along with Joseph. Although my own personal experience is relatively limited, the Smart Send Time feature, as Joe suggests, is built on a foundation of data. There’s actually a pretty neat blog post (highlighting some data from the beauty and cosmetics industry) from one of the Klaviyo Data Science leads that digs into why Smart Send Time is optimized for open rate.
Although you may already know this, there are some suggested best practices to follow when conducting an exploratory send. For instance, you may consider turning off Smart Sending while using Smart Send Time to ensure that everyone on your list has an opportunity to open your emails. If you're sending other campaigns in a similar window, you may want to send the non-Smart Send Time campaign with Smart Sending enabled, or on a different day. This article in particular goes into pretty deep detail on how to analyze Smart Send Time tests, but I also think this blog post from one of our Senior Product Managers is insightful. One specific callout he mentions is:
If you’ve run multiple Smart Send Time tests, you can view and compare the data for each test. If those tests targeted the same segment of customers, you can look at the underlying data and compare the hourly open rates to more fully understand how your Smart Send Time was formed.
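If you want to do that hour-by-hour comparison outside the Klaviyo UI, here’s a minimal sketch of the idea in Python/pandas. It assumes you’ve exported recipient-level results for each test to a single CSV; the file name and columns (`campaign`, `send_time`, `opened`) are hypothetical placeholders for whatever your export actually contains.

```python
# Rough sketch: compare hourly open rates across multiple Smart Send Time tests.
# Assumes a recipient-level export with (hypothetical) columns:
#   'campaign' (which test), 'send_time' (timestamp), 'opened' (0/1).
import pandas as pd

df = pd.read_csv("smart_send_time_tests.csv", parse_dates=["send_time"])
df["send_hour"] = df["send_time"].dt.hour

# Open rate for each send hour, broken out by campaign/test.
hourly = (
    df.groupby(["campaign", "send_hour"])["opened"]
      .mean()
      .unstack("campaign")   # one column per test
      .rename_axis(columns=None)
)

print(hourly.round(3))       # eyeball which hours win consistently across tests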
Definitely curious to know what other marketers/business owners are doing to optimize campaigns via Smart Send Time, though, or how they’re building on the existing data they’ve gathered for more tests.
Thanks for asking such a great question, @jsappington!
-Cass.
Having gone through the full exploratory > focused sends and testing, I’m a little dubious of the results, and the “gut” feeling mentioned earlier in this thread just doesn’t stack up…
Our audience is US based; we advertise premium golf products for our Shopify store. The Smart Send Time campaigns start at 12am and run for the full 24-hour period. After multiple sends, it is reporting that the best time for opens is 2am! I find it very hard to believe that our audience is most active at that hour, and it appears the result is skewed by the fact that email sends begin at 12am.
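One way I’d pressure-test a result like that, assuming you can export recipient-level send and open timestamps from the exploratory campaign, is to check when the 2am cohort actually opened rather than just their open rate; if most of their opens land mid-morning, the “2am win” may say more about sitting at the top of the inbox overnight than about a genuinely nocturnal audience. A rough sketch (file and column names are hypothetical):

```python
# Rough sketch: sanity-check a surprising "best send hour" result.
# Assumes a recipient-level export with (hypothetical) columns:
#   'send_time' and 'open_time' (blank/NaT when the email was never opened).
import pandas as pd

df = pd.read_csv("exploratory_send.csv", parse_dates=["send_time", "open_time"])
df["send_hour"] = df["send_time"].dt.hour

# 1) Open rate by send hour -- roughly what the test is reporting on.
open_rate = df.groupby("send_hour")["open_time"].apply(lambda s: s.notna().mean())
print(open_rate.round(3))

# 2) For recipients sent at the "winning" hour (2am), when did they actually open?
#    A cluster of opens at 6-9am would suggest the inbox-position effect above.
cohort = df[(df["send_hour"] == 2) & df["open_time"].notna()]
print(cohort["open_time"].dt.hour.value_counts().sort_index())
```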
Has anyone else encountered what seems like a highly unlikely result?