Why I chose this API

Unless you study the classics or are a big history buff like me, chances are you have never come across this API before. Quoting from their website: “Pleiades is a community-built gazetteer and graph of ancient places.” This makes it a very valuable resource when looking to supplement shapefiles with reliable and accurate descriptors.

In my last blog, I wrote about the visualisation I created to compare the sizes and spread of the Roman and Macedonian empires. To improve the visualisation potential of that dataset and to access critical mapping information such as geo-coordinates more efficiently, I built an Alteryx workflow that requests data from the API, parses the result, and transforms and outputs it in a structured format.

Querying the Pleiades API

Below is a screenshot of the output generated by the Alteryx workflow that I built:

You can view my workflow in the screenshot below. Note that I did not use the Download tool because I was not able to configure it to retrieve the data. I therefore chose the Python tool as the next best option, given that I am familiar with Python syntax.

The Python Tool

The code below combines the functions of the Download tool, the JSON Parse tool, and the Cross Tab tool. In the snippet, I used Pandas’ JSON parser, which often requires additional parameters for more complex JSON structures (e.g. multi-layered/nested ones). Pandas’ JSON parser also converts the JSON file into Pandas’ native two-dimensional structure, the data frame. Incoming connections into the Python tool only accept the Pandas data frame format.
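As a rough sketch of this parsing step, the snippet below flattens a small hypothetical sample shaped like a Pleiades place response (the field names and values here are illustrative assumptions, not actual API output; in the workflow the raw JSON would come from the API request instead of a hard-coded dict):

```python
import pandas as pd

# Hypothetical sample standing in for a Pleiades place response fetched by
# the workflow; the keys mimic the kind of fields the API returns.
sample = {
    "id": "579885",
    "title": "Athenae",
    "reprPoint": [23.72, 37.97],  # representative [longitude, latitude]
    "names": [
        {"attested": "Athenai", "language": "grc"},
        {"attested": "Athenae", "language": "la"},
    ],
}

# Top-level fields flatten directly into a one-row data frame.
place_df = pd.json_normalize(sample)

# Nested record lists need extra parameters: record_path unnests the
# "names" list into one row per name, and meta carries parent fields along.
names_df = pd.json_normalize(sample, record_path="names", meta=["id", "title"])
```

The `record_path`/`meta` parameters are what handle the multi-layered structures mentioned above, and both results are already data frames, the format the Python tool’s connections require.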

Next Steps

Now that I have a workflow that can successfully query data from the API and output it in the correct format, the workflow is ready to be converted into a macro. Since the API only accepts queries for one location at a time, the macro input will accept a single location id or a list of location ids that a batch macro can loop through.
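The batch-macro pattern can be sketched in plain Python: run the single-id query once per location id and stack the one-row results. Here `fetch_place` is a hypothetical stand-in for the request-and-parse step done in the Python tool, and the ids are placeholder values:

```python
import pandas as pd

def fetch_place(place_id):
    # Placeholder for one API call per id; in practice this would request
    # the place's JSON from Pleiades and flatten it into a data frame.
    return pd.DataFrame([{"id": place_id, "title": f"place {place_id}"}])

# The batch-macro loop: one run of the single-id workflow per id,
# with the results stacked into a single output table.
place_ids = ["579885", "423025"]
result = pd.concat([fetch_place(pid) for pid in place_ids], ignore_index=True)
```

This mirrors what the batch macro will do in Alteryx: iterate the same workflow over each id in the input list and union the outputs.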

Author: Fabrice Joseph

Originally from Mauritius, Fabrice moved to Australia to complete a Bachelor of Commerce (Accounting) at the University of Queensland. Since graduating, Fabrice has accumulated 7+ years of experience, primarily in management accounting roles, along with a couple of entrepreneurial projects. Having encountered data in the accounting profession, Fabrice developed a passion for analysing data, extracting insights and finding ways to improve his workflow. Such is his passion for data that he pursues data projects as hobbies. Data School was the natural next step for him to launch his career in data. Besides data, Fabrice has other interests such as yoga and reading books on ancient history and philosophy.