For the first challenge of Dashboard Week, we were tasked with building a dashboard on Cooper Hewitt, a museum in New York City. To begin, we had to pull data from the Cooper Hewitt API (link to website).

Pulling Data Using the API

In the first step, I connected to the API using an access token and fed the URLs into the Download and JSON Parse tools. The following is a screenshot of my Alteryx workflow, which pulls the data requests and processes the data into CSV files. The data has not been cleaned at this stage.
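Outside Alteryx, the same request-and-parse step can be reproduced in R with the httr and jsonlite packages. The sketch below is only an illustration of the idea: the base URL, method name, object id, and the COOPERHEWITT_TOKEN environment variable are placeholder assumptions, and the real values come from the Cooper Hewitt API documentation.

# Minimal sketch of the Download + JSON Parse step in plain R.
# The endpoint, method name, and object id are assumptions for illustration.
library(httr)
library(jsonlite)

access_token <- Sys.getenv("COOPERHEWITT_TOKEN")    # assumed environment variable

resp <- GET(
  "https://api.collection.cooperhewitt.org/rest/",  # assumed base URL
  query = list(
    method       = "cooperhewitt.objects.getInfo",  # assumed method name
    access_token = access_token,
    object_id    = "18382391",                      # hypothetical object id
    format       = "json"
  )
)
stop_for_status(resp)

# Parse the JSON body into an R list for inspection
parsed <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
str(parsed, max.level = 2)

In Alteryx, the Download tool plays the role of GET() and the JSON Parse tool plays the role of fromJSON(), so the workflow stays tool-based rather than code-based.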

For my dashboard, I wanted to create a word cloud of word frequencies based on the text descriptions of all objects in the museum. I used the R tool with a few library packages (tidytext, dplyr and tokenizers) to split the text into individual words. The R code is pretty simple.

R Code

library(tidytext)
library(tm)
library(dplyr)
library(tokenizers)

# Read the incoming Alteryx stream and pass it through unchanged to output 1
df <- read.Alteryx("#1", mode = "data.frame")
write.Alteryx(df, 1)

# Lower-case the descriptions, split them on spaces, and count word frequencies
df2 <- data.frame(table(unlist(strsplit(tolower(df$description), " "))))
write.Alteryx(df2, 2)
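The raw frequency table still contains punctuation and very common words, so a word cloud built straight from it would be dominated by terms like "the" and "of". A small optional clean-up step is sketched below, using the stop_words lexicon that ships with tidytext (one of the packages loaded above); Var1 and Freq are simply the default column names that data.frame(table(...)) produces, and df_clean is a hypothetical name for the filtered result.

# Optional clean-up before the word cloud: strip punctuation and drop stop words.
library(dplyr)
library(tidytext)   # provides the stop_words data frame

df_clean <- df2 %>%
  mutate(word = gsub("[[:punct:]]", "", as.character(Var1))) %>%  # remove punctuation
  filter(word != "") %>%
  anti_join(stop_words, by = "word") %>%                          # drop common stop words
  group_by(word) %>%
  summarise(Freq = sum(Freq), .groups = "drop") %>%               # re-aggregate merged tokens
  arrange(desc(Freq))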


Building the Tableau Dashboard

I exported the final outputs as CSV files and loaded them into Tableau. I built relationships to connect the Exhibition_Object.csv file to Department_Objects.csv as well as radial.csv. The radial.csv file is an additional output that enables me to create a radial chart later on.

Interactive Dashboard

This is how the final dashboard looks. Anyone can use the dashboard to view all types of objects acquired by the museum. The key insight is that the majority of the objects originated from Italy, and they were mainly drawings.

Thanks for reading my blog. Stay tuned for more blogs to come during Dashboard Week!

Author: Shaida Shamuri