History and Data
History is a very data-rich field. The narrative that we call history is based on the data that have survived, whether complete and accurate, partial, distorted, or biased. However, we can always find fresh perspectives, out-of-the-box thinking, time-tested strategies, or avenues that do not work, simply by getting more curious about the mega dataset that we call the past.
All Day Spatial Day
Earlier this week, we spent a whole day learning spatial tools in Alteryx: performing spatial joins, extracting properties of a spatial object, creating new spatial objects, and much more. The hands-on practice gave me the confidence and knowledge to start manipulating spatial objects.
I have decided to work with two shapefiles: the Roman Empire in AD 117 and the Macedonian Empire under Alexander the Great. These files can be downloaded from Project Mercury.
First, the Viz
After processing the shapefiles through an Alteryx workflow, I created a visualisation that illustrates the size and spread of these large empires, calculates their total surface areas and the surface area over which they intersect, and shows distances between selected cities within each respective empire.
The stars represent the capital city of each respective empire, whereas the black circles represent some of the major urban settlements that existed in antiquity. Click here to see the viz on Tableau Public.
The Alteryx Workflow
The visualisation uses a single data source in Tableau. However, this data source was built from at least two datasets and several input files that had to be processed and then combined. Below is an overview of the workflow:
At a high level, this workflow performs three key sets of tasks:
(1) Generate a single spatial object each for Rome and Macedon (combining smaller objects into one) and create a spatial object for the regions where they overlap.
(2) Combine all spatial objects from the two empires into one data source and add key information such as surface area and other spatial object descriptors.
(3) Turn a list of cities and geo-coordinates into spatial points, calculate the distance between each city and its capital, and join the result with the main dataset (which contains the spatial objects for each empire).
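Outside Alteryx, the core of step (3) can be sketched in plain Python. This is a minimal illustration, not the workflow itself: the haversine formula below approximates the great-circle distance between two lat/lon points, and the example coordinates for Rome and Athens are approximate modern values I have supplied for demonstration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate coordinates, for illustration only
rome = (41.89, 12.48)    # capital of the Roman Empire
athens = (37.98, 23.73)  # a major ancient settlement
print(f"{haversine_km(*rome, *athens):.0f} km")
```

In the actual workflow this calculation is done with Alteryx's spatial tools rather than by hand; the sketch just shows what "distance between a city and its capital" means computationally.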
And voilà! We have a Tableau-ready data source.
I thoroughly enjoyed working with this dataset. I am always looking for ways to combine my multiple interests. The raw data itself is not very insightful, so I supplemented the spatial files with additional inputs and small datasets (such as the list of cities and geo-coordinates). I believe there are more opportunities to enrich these datasets.
The next step for this project is to use the Pleiades Places API to query information about an ancient or historical city, such as geo-coordinates, founding dates, and other useful details. Furthermore, I can make this workflow more reusable by turning it into a macro, all using the tools taught this week at the Data School. Stay tuned for the sequel to this post!
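As a rough sketch of that next step, Pleiades exposes each place record as JSON at a predictable URL. The helper below builds that URL and fetches the record with the standard library; the place ID shown in the comment and the exact response fields (`title`, `reprPoint`) are assumptions based on my reading of the Pleiades documentation, so verify them against the live API before relying on them.

```python
import json
from urllib.request import urlopen

PLEIADES_BASE = "https://pleiades.stoa.org/places"

def place_url(place_id):
    """Build the JSON endpoint URL for a Pleiades place record."""
    return f"{PLEIADES_BASE}/{place_id}/json"

def fetch_place(place_id):
    """Fetch one place record as a dict (requires network access)."""
    with urlopen(place_url(place_id)) as resp:
        return json.load(resp)

# Example usage (not run here; assumes 423025 is the Pleiades ID for Roma):
# place = fetch_place("423025")
# print(place["title"], place.get("reprPoint"))  # title and (lon, lat) point
```

In a final version this fetch would feed an Alteryx Download tool or a batch macro, looping over the list of cities instead of a single ID.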