After 13 weeks of analytics training and a variety of client projects, we are now officially entering week 14, which is going to be both challenging and rewarding. Yep, it is the one and only Dashboard Week. It means we are expected to work with the data we are given, build a dashboard, blog about the process, and present our analysis and insights the very next day. This repeats every day of the week, except on Friday, when we deliver our presentation on the same day. As you can see, proper time management and proficiency in both data cleaning and visualisation are key to surviving this week. Hopefully I won't have to stay up too late.
Getting The Data
Our cohort is given a bit more flexibility this time, as we have some freedom in choosing which data to work with. We are allowed to get data from any API query service. With this flexibility comes an extra challenge: the outputs from API requests are usually in JSON format, so we have to clean and re-format them before importing them into Tableau.
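Re-shaping a JSON response into flat rows takes only a few lines. Here is a rough sketch, assuming a marketstack-style payload that wraps daily price records in a `data` array (the field names are illustrative, not the exact API schema):

```python
import csv

def flatten_payload(payload):
    """Pull the daily price records out of a nested JSON response
    and return them as flat row dicts, ready for CSV or Tableau."""
    rows = []
    for record in payload.get("data", []):
        rows.append({
            "symbol": record.get("symbol"),
            "date": record.get("date"),
            "open": record.get("open"),
            "high": record.get("high"),
            "low": record.get("low"),
            "close": record.get("close"),
        })
    return rows

def write_csv(rows, path):
    """Write the flattened rows out to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

The same flattening happens inside my Alteryx workflow; this is just the idea expressed in code.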
After exploring a few different APIs, I decided to look into stock performance in Australia this year, since it would be interesting to see how Covid has impacted different sectors of the stock market. I used the marketstack API service to get price information for the top 100 stocks on the ASX by market cap. Below is the workflow I created to extract and manipulate the data.
Since each API request can only retrieve stock price info for one company, I would have to run the process 100 times to get all the data I needed. To streamline this, I created a batch macro in Alteryx to automate the requests. A batch macro runs the same process in the workflow multiple times, producing an output after each run. The outputs can then be unioned together to create the final output, which contains stock info for all top 100 companies. The batch macro looks like the following:
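In code terms, a batch macro is essentially a loop: run the same fetch once per company, then union the per-run outputs. A minimal sketch (here `fetch_one` is a hypothetical stand-in for whatever function actually makes the HTTP request to the API):

```python
def fetch_all(symbols, fetch_one):
    """Run the same fetch process once per symbol (one 'batch' per
    company) and union the per-run outputs into a single row list."""
    all_rows = []
    for symbol in symbols:
        rows = fetch_one(symbol)   # one API request per run
        all_rows.extend(rows)      # union this run's output with the rest
    return all_rows
```

Passing the fetch function in as a parameter keeps the loop easy to test; in practice it would wrap an HTTP GET against the price-history endpoint with your API key.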
Batch macros are actually fairly easy to build and can significantly reduce the time spent extracting data, so I strongly recommend mastering this skill. A small investment up front can pay off tremendously in your analytics journey.
Building the Dashboard
By saving the data in hyper format, I can open and explore it directly in Tableau. The cleaned data is pretty straightforward to work with: for each company, it contains the open, close, high, and low prices for every trading day from the beginning of the year up to the last trading day, along with each company's sector and market cap.
The purpose of my dashboard is to compare the different sectors, both by the number of companies they have in the Top 100 and by their average market cap. It then shows details for whichever stock the user selects, such as its price history since the beginning of the year and how much the price has changed since then. Here is a screenshot and a link to my dashboard.
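The "change since the beginning of the year" figure on the dashboard boils down to a percent difference between the first and latest closing prices. As a quick sketch:

```python
def pct_change_since_start(closes):
    """Percent change from the first closing price of the year to the
    most recent one. `closes` must be in chronological order."""
    first, last = closes[0], closes[-1]
    return (last - first) / first * 100
```

For example, a stock that fell from $100 to $30 gives -70.0, i.e. a 70% drop since the start of the year.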
Notably, most stocks experienced sharp falls around mid-to-late March, with some dropping by more than 70% compared with their price at the start of the year. Companies in the consumer discretionary sector were hit the hardest, and many still haven't quite recovered, which is to be expected since they rely heavily on tourism. Surprisingly, one particular company's stock price skyrocketed after bouncing back from its lowest point in March: Afterpay Ltd. In fact, if you had invested in this company back then, your money would have grown almost 10 times already. But all this is in hindsight. It takes great courage and faith to buy stocks when everyone in the market is selling. It reminds me of a famous quote by Warren Buffett: "We simply attempt to be fearful when others are greedy and to be greedy only when others are fearful."