Hello again everyone. Another day, another dashboard! The challenge for today was to use an API to gain access to our dataset. This API was quite nice in comparison to the monsters we’ve experienced in the past. The real challenge, though, was time boxing. You see, today was the Melbourne Cup, so we had to create this dashboard by 2pm to then take part in DSAU 10’s first ever in-person social event. So, the race was on… literally, with the horses, as well as against the clock.
About the data
What was the API? It’s called the Whiskey Hunter, and it essentially provides auction and distillery data about different whiskies. Very suited to the day, right? I mean, it’s no sangria, but it still does the job… We could focus on either the auction-related data or the distillery sales data. I couldn’t find a common joining field between the two, so I wasn’t able to join them, and I chose to use the auction data. To do this, we needed to create an Alteryx workflow.
How did we create the workflow?
Let’s start off with the workflow. In case you missed Anthony Wong’s awesome API session during the Learn what the Data School learns courses, here’s a brief recap. APIs in Alteryx are split into two main steps: 1. Get data and 2. Clean data. We use a Text Input tool to insert the link, then use the Download and JSON Parse tools to get the data. Now that we have extracted the data, we want to clean it into the format we need, so we get our values into columns using the Cross Tab tool.
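If you prefer code to canvases, the same two steps can be sketched in Python. This is a minimal stand-in for the Alteryx tools, not the real workflow: the sample payload below is invented for illustration, and the long-format record/key/value rows mimic what the JSON Parse tool emits before the Cross Tab tool pivots keys back into columns.

```python
import json
from collections import defaultdict

# Invented sample payload standing in for what the Download tool fetches --
# the real Whiskey Hunter response shape may differ.
raw = json.dumps([
    {"name": "Example Auction", "url": "https://example.com", "items": 120},
    {"name": "Another Auction", "url": "https://example.org", "items": 45},
])

# Step 1 (Get data): the Download tool would retrieve this string from the API.
records = json.loads(raw)

# Step 2 (Clean data): JSON Parse flattens the payload into long-format
# rows of (record id, key, value)...
long_rows = [
    (i, key, value)
    for i, rec in enumerate(records)
    for key, value in rec.items()
]

# ...and Cross Tab pivots those keys into columns, one row per record.
pivoted = defaultdict(dict)
for rec_id, key, value in long_rows:
    pivoted[rec_id][key] = value

table = [pivoted[i] for i in sorted(pivoted)]
```

Pivoting the long rows back together simply round-trips the original records, which is exactly what the JSON Parse + Cross Tab pairing achieves on the canvas.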
So how does this look?
So, if we look at the API page, there are three links related to the auctions data. What we can see is that to extract the auctions data we need to include a parameter. A parameter, when working with APIs, specifies what you want to view from the API. In this case we wanted to view the “slugs”. I’m not sure why they’re called slugs, but they’re basically the names or identifiers of the auctions. To get the slugs, we needed to extract them from the auctions info URL. Start by downloading the info URL (to get the slugs), then clean the data and add a Text Input with the data URL. Then append the fields so each row has a URL, and create a formula to update the URL so we can download the data specific to that auction name. You can see a screenshot of this below.
Now that we have created this workflow, we need to find our story and create the dashboard. My story looks at which auction a whiskey distributor might want to sell at if they got their hands on a priceless bottle of whiskey. This depends on a variety of factors, including a historic look at trade volume, fee comparisons and how frequently each auction occurs.
Here is my final product:
It was definitely a rush to get through this challenge, and timeboxing was key. But I made it, and I was able to capture some insights with this dashboard.
Day 2 is finally over!