
Hi there,

 

I’m able to export data for a given metric and a given campaign; however, there are 22 metrics and we have 54 campaigns, so exporting all the combinations takes ~1,200 API calls and takes hours.

 

Is there a better way to do this?

 

This is how I’m doing it at the moment;

Step 1

Get all the campaign ids and names;

curl https://a.klaviyo.com/api/v1/campaigns -G -d api_key=XXXXX | jq '.data[] | {id: .id, name: .name}'

Returns 54 campaigns;

{ "id": "UTYxxx", "name": "July Newsletter" }
{ "id": "R7Lxxx", "name": "May Newsletter" }
...
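If you save just the ids from this call, the later iteration step can read them from a file. A minimal sketch, assuming a scratch file called campaigns.txt (the function name and file name are hypothetical; the same pattern works for the metrics call in Step 2):

```shell
# Fetch campaign ids only; the API key is passed as the first argument.
fetch_campaign_ids() {
  curl -s https://a.klaviyo.com/api/v1/campaigns -G \
    -d api_key="$1" -d count=100 \
    | jq -r '.data[].id'
}
# Usage (needs a real key): fetch_campaign_ids XXXXX > campaigns.txt
```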

 

Step 2

Get all the metric ids and their names;

curl https://a.klaviyo.com/api/v1/metrics -G -d api_key=XXXXX | jq -r '.data[] | {id: .id, name: .name}'
 

Returns 22 metrics;
{ "id": "HGExxx", "name": "Clicked Email" }
{ "id": "N2vxxx", "name": "Dropped Email" }
...

 

Step 3

Get the metric data by iterating over the metric ids and campaign ids. E.g. for the "Clicked Email" metric, HGExxx, and the "May Newsletter" campaign, R7Lxxx;

 

curl https://a.klaviyo.com/api/v1/metric/HGExxx/export -G -d api_key=XXXXX -d unit='day' -d start_date='2021-07-01' -d end_date='2021-07-01' -d measurement='count' -d where='[["$campaign","=","R7Lxxx"]]'

Returns 1 metric for 1 campaign;
{
  "metric": {
    "object": "metric",
    "id": "HGExxx",
    "name": "Clicked Email",
    "integration": {
      "object": "integration",
      "id": "0rGxxx",
      "name": "Klaviyo",
      "category": "Internal"
    },
    "created": "2020-03-10 12:41:05",
    "updated": "2020-03-10 12:41:05"
  },
  "start_date": "2021-07-01 00:00:00",
  "end_date": "2021-07-01 00:00:00",
  "unit": "daily",
  "measurement": "c",
  "by": "",
  "where": [["$campaign", "=", "R7Lxxx"]],
  "results": [
    {
      "segment": "Everyone",
      "data": [
        {
          "date": "2021-07-01 00:00:00",
          "values": [0]
        }
      ]
    }
  ]
}

 

Now keep iterating over all 22 metrics and all 54 campaigns, making 1,188 API calls.
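The three steps above can be sketched as one nested loop. This is a sketch under the assumption that Steps 1 and 2 saved one id per line to metrics.txt and campaigns.txt (both file names, and the function names, are hypothetical):

```shell
# Build the where filter; "$campaign" is Klaviyo's literal property name,
# so it must not be expanded by the shell (hence the single quotes).
build_where() {
  printf '[["$campaign","=","%s"]]' "$1"
}

# Hypothetical driver for the full 22 x 54 iteration (~1,188 calls).
export_all() {
  while read -r mid; do
    while read -r cid; do
      curl -s "https://a.klaviyo.com/api/v1/metric/${mid}/export" -G \
        -d api_key=XXXXX \
        -d unit='day' \
        -d start_date='2021-07-01' \
        -d end_date='2021-07-01' \
        -d measurement='count' \
        -d where="$(build_where "$cid")" \
        > "export_${mid}_${cid}.json"
    done < campaigns.txt
  done < metrics.txt
}
```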

@Graham - If you don’t filter by the Campaign ID, I believe you’ll get all events in your time span across all Campaigns for that Metric. Then when you get the data locally, you can parse it out by Campaign in a simple ETL process. This reduces your API calls by a factor of 54 (for the 54 campaigns).

Alternatively, depending on the volume of events, another way is to ingest all the raw events from the event timeline (/v1/metric/timeline), store them locally (or on a secured server), and then make subsequent queries against the entire data set from your own repository. This avoids the multiple API round trips, but it does mean you need a batch process that pulls in event data on a regular basis to keep your repository up to date, and you’ll have to write your own aggregation queries based on how you store the data.
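A sketch of that ingestion, assuming the v1 timeline's cursor behaviour: the response carries a next token, which is passed back through the since parameter until it stops appearing. File names and function names here are hypothetical:

```shell
# Extract the paging cursor from a timeline response (empty when done).
next_token() { jq -r '.next // empty'; }

# Page through /v1/metrics/timeline, saving each page to a local file.
ingest_timeline() {
  token='2021-07-21'   # first call takes a date; later calls take the cursor
  page=0
  while [ -n "$token" ]; do
    out="timeline_page_${page}.json"
    curl -s https://a.klaviyo.com/api/v1/metrics/timeline -G \
      -d api_key=XXXXX -d count=100 -d since="$token" > "$out"
    token=$(next_token < "$out")
    page=$((page + 1))
  done
}
# Aggregation then happens locally, e.g. events per campaign:
# jq -r '.data[].event_properties."Campaign Name" // empty' timeline_page_*.json | sort | uniq -c
```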


Hi @retention, thank you for your reply. I started out with just the metric (i.e. no -d where='[["$campaign","=","R7LsBF"]]'), but it doesn’t seem to give a campaign breakdown I can parse out. For example, this returned data doesn’t include the campaign;

curl https://a.klaviyo.com/api/v1/metric/HGExxx/export -s -G \
-d api_key=XXXX \
-d unit='day' \
-d start_date='2021-07-01' \
-d end_date='2021-07-01' \
-d measurement='count'
{"metric": {"object": "metric", "id": "HGExxx", "name": "Clicked Email", "integration": {"object": "integration", "id": "0rGxxx", "name": "Klaviyo", "category": "Internal"}, "created": "2020-03-10 12:41:05", "updated": "2020-03-10 12:41:05"}, "start_date": "2021-07-01 00:00:00", "end_date": "2021-07-01 00:00:00", "unit": "daily", "measurement": "c", "by": "", "where": "", "results": [{"segment": "Everyone", "data": [{"date": "2021-07-01 00:00:00", "values": [13.0]}]}]}

I was hoping the ‘by’ query param to the api/v1/metric endpoint might help, but I can’t find documentation for it, and my guesses of values like “campaign” seem to have no effect.

The iterative approach I took seems to have a problem where the count “value” for every combination of metric and campaign is always 0.0, which doesn’t seem right.

I had a look at the /v1/metric/timeline endpoint to check whether the 0.0 values for the given day were correct, but it threw up some strange results about campaigns: it returns events for campaigns that don’t exist in the list returned by the /api/v1/campaigns endpoint. I double-checked that the /api/v1/campaigns endpoint only returns 54 entries with count=100, and there is no page=1 (i.e. there is only page 0). So either I’m doing something wrong, or the /api/v1/campaigns endpoint may be broken. If it is broken, then my original approach won’t work and timeline would be the only way to go.

Here are the curl commands I’m using that return conflicting data;

# Unique campaigns from all recent timeline events;
curl https://a.klaviyo.com/api/v1/metrics/timeline -s -G \
-d api_key=XXX \
-d since='2021-07-21' \
-d count='100' \
| jq -r '.data[] | .event_properties."Campaign Name"' | sort -u
 
# Unique campaigns from "Received Email" metric (Jf4xxx) recent timeline events;
curl https://a.klaviyo.com/api/v1/metric/Jf4xxx/timeline -s -G \
-d api_key=XXXX \
-d since='2021-07-21' \
-d count=100 \
| jq -r '.data[] | .event_properties."Campaign Name"' | sort -u
 
# All campaigns;
curl https://a.klaviyo.com/api/v1/campaigns -s -G \
-d api_key=XXXX \
-d count=100 \
| jq -r '.data[].name' | sort -u
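Once those lists are saved to files, a set difference shows which campaign names appear in timeline events but never in /v1/campaigns. A sketch with hypothetical file and function names (both inputs are already sorted thanks to sort -u):

```shell
# Names present in the first (timeline) list but absent from the second
# (/v1/campaigns) list; comm requires both inputs to be sorted.
missing_campaigns() { comm -23 "$1" "$2"; }
# Usage: missing_campaigns timeline_names.txt campaign_names.txt
```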

 

