DSAU17 Dashboard Week Day 1

Welcome to Day 1 of Cohort 17’s Dashboard Week, in our thirteenth week of training. With only a few short weeks left, we are excited to be undertaking the infamous dashboard week. Here is the first of four blog posts showcasing what I can do in a single day with an unknown dataset.

The dataset that we received this morning was the MealDB dataset available at themealdb.com. The MealDB is an open, crowd-sourced database of recipes from around the world, made accessible via an API.


Self-service analytics is about giving users the control and access to answer the questions that they have. Interactive dashboards are superior to simple spreadsheets because they give users the ability to easily slice and dice data.

With that in mind, I put myself in the shoes of a potential user of the MealDB dataset and thought about how it could help answer the age-old question: what should we eat?

Before going into more detail, I want to outline the 10-stage process I went through in trying to build a tool to achieve this:

10-Stage Process

  1. Explore the MealDB API
  2. Download Categories, Areas and Ingredients
  3. Scrape Meals Endpoint Using Filter Endpoint
  4. Reshape Ingredients Data For Visualisation
  5. Enrich Data Using YouTube Video Links and youtube-dl
  6. Export Data to Hyper Files and Load into Tableau
  7. Create Relationships between Data Sources
  8. Create Dashboard Template Design in Figma
  9. Create Filters and Parameters for Interactivity
  10. Implement Drill-down Using Dynamic Zoning

With that out of the way, let’s jump in and build this dashboard; people are hungry and they’re ready to eat.

Stage 1: Exploring the MealDB API

TheMealDB is a crowd-sourced dataset made available via a simple JSON API. The dataset contains meals, ingredients, and images for recipes sourced from contributors.

The API offers a small selection of endpoints to access the data; the most interesting for my project are the following:

  1. List All Categories, Areas, and Ingredients
  2. Filter by Category, Area, and Main Ingredient
  3. Lookup Full Meal Details by ID
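To make the shape of the API concrete, here is a minimal sketch of how those three endpoint families can be built as URLs. The base URL and query parameters (`c`, `a`, `i`) follow the patterns mentioned elsewhere in this post; the helper function names are my own, purely illustrative.

```python
BASE = "https://www.themealdb.com/api/json/v1/1"

def list_url(kind: str) -> str:
    # list.php enumerates all values of a field: c=categories, a=areas, i=ingredients
    return f"{BASE}/list.php?{kind}=list"

def filter_url(kind: str, value: str) -> str:
    # filter.php returns the meals matching a category (c), area (a), or ingredient (i)
    return f"{BASE}/filter.php?{kind}={value}"

def lookup_url(meal_id: int) -> str:
    # lookup.php returns the full detail record for a single meal ID
    return f"{BASE}/lookup.php?i={meal_id}"

print(filter_url("a", "Canadian"))
```

Each stage below is essentially a loop over one of these URL builders.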

API Limitations

Unfortunately, the API didn’t seem to offer the ability to download the full dataset in one query, so I needed to be creative about getting all the meals. There is a meal lookup endpoint, but enumerating meal IDs from 1 to 52772 would be inefficient, and borderline irresponsible given the stress it would place on their servers.

The next couple of stages address how I overcame this limitation.

Stage 2: Download Categories, Areas and Ingredients

The first step I took was to make use of the “list all …” endpoints to download all categories, areas, and ingredients. I started here so that I could feed this data into the “filter by …” endpoints and retrieve every meal ID.

Here’s a screen shot of the Alteryx workflow to achieve this.

This workflow makes a simple call to the https://www.themealdb.com/api/json/v1/1/categories.php endpoint using the “Download” tool.

The output from this tool is JSON data that is then run through a parsing process that is out of scope for this dashboard week overview.

As you can see above, what we end up with is a table of each category present in the database. We can then repeat this process for areas and ingredients.
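Outside of Alteryx, the download-and-parse step can be sketched in a few lines of Python. The field name `strCategory` matches the shape of the categories.php response; the sample payload below is an illustrative fragment, not the full API output.

```python
import json

def parse_categories(payload: str) -> list[str]:
    """Extract category names from a categories.php JSON response."""
    data = json.loads(payload)
    return [row["strCategory"] for row in data.get("categories", [])]

# Illustrative sample shaped like the API's response
sample = '{"categories": [{"idCategory": "1", "strCategory": "Beef"}, {"idCategory": "2", "strCategory": "Chicken"}]}'
print(parse_categories(sample))
```

The same parse applies to the areas and ingredients lists, with their respective field names.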

Stage 3: Scrape Meals Endpoint Using Filter Endpoint

The next step in the process is to take each category and append it as a query to the “filter” endpoint www.themealdb.com/api/json/v1/1/filter.php?c={CATEGORY_NAME}. Here’s a screenshot of that workflow.

In the screenshot below we can see that the request URL and query string are ready to be input into another download-and-parse process.

At the end of this part of the workflow we end up with the data we see below: a table of meal IDs, names, and thumbnails, but very little else.

With a list of Meal IDs we are now ready to query the meals endpoint to gather all the meals in the database for further preparation.
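The scrape described above amounts to looping the category list through the filter endpoint and collecting the unique meal IDs. Here is a hedged Python sketch of that loop; the `fetch` parameter is an illustrative seam (not part of the MealDB API) that makes the function easy to test without hitting the live server.

```python
import json
from urllib.request import urlopen

FILTER_URL = "https://www.themealdb.com/api/json/v1/1/filter.php?c={}"

def fetch_json(url: str) -> dict:
    with urlopen(url) as resp:
        return json.load(resp)

def collect_meal_ids(categories: list[str], fetch=fetch_json) -> list[str]:
    """Query the filter endpoint once per category and gather unique meal IDs."""
    meal_ids = set()
    for category in categories:
        payload = fetch(FILTER_URL.format(category))
        # the API returns {"meals": null} when a filter matches nothing
        for meal in payload.get("meals") or []:
            meal_ids.add(meal["idMeal"])
    return sorted(meal_ids)
```

Deduplicating with a set matters here because a meal can surface under more than one filter pass.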

Stage 4: Reshape Ingredients Data For Visualisation

The MealDB database provides meal ingredients in a wide format, with columns Ingredient1 through Ingredient20 for each meal.

For the purposes of the viz I wanted to reshape the data as you see below, because:

  • I’d like to filter meals by the number of ingredients in the recipe
  • I’d like to provide a tall ingredient list like a user would expect from a recipe.
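The reshape itself can be sketched as a simple pivot from the wide `strIngredient1..20` / `strMeasure1..20` columns of a meal record into one row per ingredient. The field names match the lookup endpoint's response shape; the function name is illustrative.

```python
def tall_ingredients(meal: dict) -> list[dict]:
    """Pivot strIngredient1..20 / strMeasure1..20 into one row per ingredient."""
    rows = []
    for i in range(1, 21):
        # unused slots are empty strings or null in the API response
        ingredient = (meal.get(f"strIngredient{i}") or "").strip()
        if not ingredient:
            continue
        rows.append({
            "idMeal": meal["idMeal"],
            "ingredient": ingredient,
            "measure": (meal.get(f"strMeasure{i}") or "").strip(),
        })
    return rows
```

A handy side effect: the row count per meal is exactly the ingredient count, which drives the complexity filter later in the dashboard.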

Stage 5: Enrich Data Using YouTube Video Links and youtube-dl

The MealDB database provides a great resource of recipes but not much depth or context. However, a large proportion of the recipes include a link to a YouTube video.

This gave me the idea of incorporating statistics from YouTube to enrich the data with view counts, likes, and subscribers. These statistics can serve as an estimate of a recipe’s popularity; with this data the dashboard can rank recipes by popularity.

Using an Open-source Terminal Application for Downloading Videos

youtube-dl has been a favourite tool of mine for many years; among other things, it is useful for programmatically downloading videos from around the internet. With all the fancy graphical interfaces modern computers have, the power of a text-based interface sometimes cannot be overstated.

youtube-dl is normally used for downloading video files, but it can also write JSON files containing information scraped from the webpage hosting the video. For YouTube videos this includes views, likes, and subscribers, as well as the channel and title, to name a few.

youtube-dl needs only the YouTube URL in order to work its magic, and it can take a list of URLs as input and process each one.
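For the metadata-only use described here, the invocation can be sketched as below. The flags (`--skip-download`, `--write-info-json`, `--batch-file`) are standard youtube-dl options; the helper function and file names are illustrative, and the output template is one reasonable choice among many.

```python
import subprocess
from pathlib import Path

def build_ytdl_command(batch_file: str, out_dir: str) -> list[str]:
    """Build a youtube-dl invocation that writes metadata JSON without downloading video."""
    return [
        "youtube-dl",
        "--skip-download",      # metadata only, no video files
        "--write-info-json",    # write an .info.json file per URL
        "--ignore-errors",      # keep going past dead or private links
        "--batch-file", batch_file,
        "--output", str(Path(out_dir) / "%(id)s.%(ext)s"),
    ]

# Uncomment to run against a urls.txt of YouTube links:
# subprocess.run(build_ytdl_command("urls.txt", "info"), check=False)
```

Running this against the list of recipe video URLs leaves a folder of `.info.json` files ready for the next step.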

Once we have a folder full of JSON files, we can use the “Directory” tool combined with a “Dynamic Input” tool to parse and join the video information back to the meals data.
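The equivalent of that Directory-plus-Dynamic-Input pattern in plain Python is a glob over the folder followed by a field pick. The keys (`id`, `title`, `view_count`, `like_count`) are standard fields in youtube-dl's info JSON; the function name is illustrative.

```python
import json
from pathlib import Path

def load_video_stats(info_dir: str) -> list[dict]:
    """Read every .info.json in a folder and keep the popularity fields."""
    stats = []
    for path in sorted(Path(info_dir).glob("*.info.json")):
        info = json.loads(path.read_text())
        stats.append({
            "video_id": info.get("id"),
            "title": info.get("title"),
            "view_count": info.get("view_count"),
            "like_count": info.get("like_count"),
        })
    return stats
```

The resulting rows join back to the meals table on the video ID parsed from each meal’s YouTube link.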

Stage 6: Export Data to Hyper Files and Load into Tableau

For anyone familiar with Tableau, this part should be fairly straightforward. Using Alteryx’s ability to output directly to a Hyper file, we can export the data and import it into Tableau.


Stage 7: Create Relationships between Data Sources

After that we create relationships by dragging tables onto the workspace. The data is trivial to link together using Meal ID.

Stage 8: Create Dashboard Template Design in Figma

I have a love-hate relationship with the design tools that Tableau provides in its dashboards. Coming from a design background, I’ve been spoilt by professional tools such as Adobe Illustrator and Photoshop that provide pixel perfect control over how your work looks. In Tableau you barely have any control over how text is rendered.

I like using Figma, a professional-grade tool used by UX designers to design interfaces for web, mobile, and desktop applications. For a free-to-use tool, it is extremely powerful.

As you can see above, the template for the dashboard was created in Figma with a modern gradient background, and a touch of branding showcasing the simple logo I put together for this project.

Stage 9: Create Filters and Parameters for Interactivity

The key to this project was providing interactivity to the user through parameters and filters. The screenshot above shows how a user can filter recipes by complexity, and exclude categories or areas, to show the most popular recipes that meet their criteria.

Stage 10: Implement Drill-down Using Dynamic Zoning

Dynamic Zoning is a powerful feature added in Tableau version 2022.3, and it’s a fantastic asset for drill-down, dynamically hiding and showing content based on a user’s selection.

From the video above you can see how I used dynamic zoning to give users a detailed view of a recipe; showing ingredients, instructions and a bit more context about the author.

Dynamic zoning gives visualisation authors the ability to embed huge amounts of data and show it progressively without creating too much visual clutter.


There you have it, Day One of Dashboard Week completed. I can’t wait to see what tomorrow’s challenge has in store; I will certainly carry some learnings from today into the rest of the week to avoid burning out.

With that said, please enjoy the viz and keep an eye out for tomorrow’s.

Author: Daniel Lawson

Right off the bat I can tell you that I’m not your average data analyst. I’ve spent most of my career running my own business as a photographer and videographer, with a sprinkling of web development and SEO work as well. My approach to life and work is very T-shaped, in that I have a small set of specific skills complemented by a very broad range of interests; I like to think of myself as a dedicated non-specialist. Data analytics and programming started as a hobby that quickly grew into a passion. The more I learned, the more I looked for opportunities to pull, manipulate, and join data from disparate sources in my life. I learned to interact with REST APIs for services I use, such as Spotify, and with health data captured by my devices. I learned SQL to create and query databases, as well as to analyse SQLite files containing my iMessages and Photos data on my Mac. Every technique I learned opened up more possibilities; now I’m hooked and there’s no turning back. Learn More About Me: https://danlsn.com.au