Day 3 of Dashboard Week brought with it a new set of challenges – namely, spatial data. We were free to pick any datasets we liked from the City of Melbourne Open Data website. The aim was to build a dashboard that would help the user make a decision, and it had to involve spatial analysis.

See other posts in my Dashboard Week series:

Data exploration

To be honest, I’ve always had a love-hate relationship with spatial data. It’s amazing when everything is clean, with matching location names or proper latitude/longitude information. But when it comes to cleaning and processing spatial data for analysis, you’d ideally want all the time and resources in the world, because in my experience it tends to take a very, very long time.

As soon as I started looking through the datasets available, I had my sights set on parking data. I enjoy driving and I enjoy hanging around the city, but those two things do not go well together. Put simply, it’s been a nightmare every time I’ve had to look for on-street parking in the Melbourne CBD at the last minute before a dinner or a night at the theatre. (So, yes, usually I’d just book an indoor parking space in advance instead.)

And if I were to park in the city, it would most often be because I was eating out with family or friends. That’s why I decided to build a dashboard that would help the user first locate a restaurant within the City of Melbourne and then find on-street parking nearby. Interestingly, I also found a dataset on parking bay restrictions (e.g. parking allowed for two hours between 8 am and 6 pm on weekdays). I thought it would be a nice challenge to parse this data and visualise it, to further help the user decide on a suitable parking space depending on the time and day.
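To give a sense of what parsing a restriction like that involves, here is a minimal Python sketch. Note this is only illustrative: the actual column format in the City of Melbourne dataset may differ, and the pattern below assumes a hypothetical sign-style string such as "2P 08:00-18:00 Mon-Fri".

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Restriction:
    max_hours: float  # maximum stay, in hours (the "2" in "2P")
    start: str        # start of enforcement window, "HH:MM"
    end: str          # end of enforcement window, "HH:MM"
    days: str         # e.g. "Mon-Fri"

# Assumed sign format: "<hours>P <start>-<end> <days>", e.g. "2P 08:00-18:00 Mon-Fri"
PATTERN = re.compile(
    r"(?P<hours>\d+(?:\.\d+)?)P\s+"
    r"(?P<start>\d{1,2}:\d{2})-(?P<end>\d{1,2}:\d{2})\s+"
    r"(?P<days>\w{3}(?:-\w{3})?)"
)

def parse_restriction(text: str) -> Optional[Restriction]:
    """Parse a parking-sign string into its components; None if unrecognised."""
    m = PATTERN.match(text.strip())
    if m is None:
        return None
    return Restriction(
        max_hours=float(m.group("hours")),
        start=m.group("start"),
        end=m.group("end"),
        days=m.group("days"),
    )
```

Once the restrictions are in a structured form like this, it becomes straightforward to filter parking bays by the time and day the user plans to visit.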

Issues encountered

Some issues I encountered while working on my viz were:

  • Tableau struggled with the 42,000+ records in the combined dataset (built with a spatial join using a 500-metre buffer around each restaurant). Every time I made the tiniest change, Tableau would take around 30 to 60 seconds to update the viz, and sometimes became completely unresponsive. I took the precaution of saving a backup after every stable version of the viz and ended up with more than a dozen Tableau files in total. I got there in the end, but it was nerve-wracking nevertheless, constantly fearing that my dashboard could be lost forever if Tableau decided not to load it successfully the next time I reopened the file.
  • Because of the above, I restricted the parking data to a much smaller sample of 5,000 records instead. This meant my dashboard wouldn’t actually show all the parking bays available within the City of Melbourne, but at least everything now worked much faster.
  • I spent way too much time trying to show both restaurants and parking bays on a single map using set actions. This took a lot of Googling and many rabbit holes that ultimately led to a dead end, so I scrapped the idea and went with two separate maps instead.
  • Not all parking bays had parking restriction information. Combined with the fact that I used only a sample of the parking bay dataset, this meant the dashboard would often show nothing in the bottom half after the user clicked on a parking bay for more information. But, alas, time was my enemy and I had to leave it in that state for the presentation the next morning.
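In my workflow the 500-metre buffer and spatial join were done with Alteryx’s spatial tools, but the underlying idea can be sketched in plain Python: compute the great-circle distance between a restaurant and each parking bay, and keep the bays within the radius. The coordinates and bay IDs below are purely illustrative.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def bays_near_restaurant(restaurant, bays, radius_m=500):
    """Return IDs of parking bays within radius_m of a restaurant.

    restaurant: (lat, lon) tuple; bays: list of (bay_id, lat, lon) tuples.
    """
    lat, lon = restaurant
    return [
        bay_id
        for bay_id, blat, blon in bays
        if haversine_m(lat, lon, blat, blon) <= radius_m
    ]
```

A naive pairwise comparison like this is O(restaurants × bays), which hints at why the full join produced 42,000+ records and why dedicated spatial tools (with spatial indexing) are the better fit for the real dataset.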

Alteryx workflow

Here is my Alteryx workflow for processing the data:

Alteryx workflow for on-street parking data (limited to a sample of 5000 parking bays mostly with parking restriction information, since Tableau was running very slowly every time I made a change)

Alteryx workflow for restaurant data (part 1)

Alteryx workflow for restaurant data (part 2)

Alteryx workflow for parking restrictions data (part 1)

Alteryx workflow for parking restrictions data (part 2)

Tableau viz

See my viz on Tableau Public:


Author: Vincent Ging Ho Yim

Vincent has always enjoyed learning new things as well as finding elegant and efficient solutions to problems since childhood. He studied linguistics at university and has subsequently worked in theatre lighting and broadcast captioning. In his previous job he found his passion working with data and decided to pursue a change in career. In his spare time he likes reading, learning languages (both human and programming ones) and playing Pic-a-Pix and sudoku. He loves laksa, sushi and burritos.