The third week at the Data School was full of learning and completing Alteryx challenges, and this week's project asked us to apply what we had learned: select and download data from an API of our choice, automate the repetitive process with a macro, and use Alteryx spatial tools to enrich our analysis.

After reviewing several APIs, I decided to use the API of the Domain real estate website. I thought it would be an easy task, but it wasn't: I had to read a lot of documentation to understand how to download data from the API. First, I created an account on the website. Then I created a project and added my desired API to it; this time I selected the “Properties and Location” API. The next step was to choose an authentication method, and I went with authentication by API key. After configuring the Download tool, I managed to pull data from the API, but it only returned data for one city. To download data for multiple cities, I could either run the same process for each city and union the results at the end, or automate it with a macro. I went with the second option and built a batch macro to tackle this, with a Python sketch of the same idea shown below. Here is the workflow of my macro:
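For readers who prefer to see the idea outside Alteryx, here is a minimal Python sketch of what the batch macro does: take a list of cities as the control input and call the API once per city, collecting the results for a union afterwards. The endpoint path, header name, and payload fields below are assumptions for illustration only; the exact schema comes from the Domain developer documentation.

```python
import requests

API_KEY = "YOUR_DOMAIN_API_KEY"  # key created on the Domain developer portal

# Assumed endpoint and payload fields for illustration; check the Domain docs for the real schema.
SEARCH_URL = "https://api.domain.com.au/v1/listings/residential/_search"


def fetch_listings(suburb, state):
    """Fetch residential listings for one suburb/state (one 'city' per call)."""
    payload = {
        "listingType": "Sale",
        "pageSize": 100,
        "locations": [{"state": state, "suburb": suburb, "includeSurroundingSuburbs": False}],
    }
    response = requests.post(
        SEARCH_URL,
        json=payload,
        headers={"X-Api-Key": API_KEY},  # key-based authentication
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


# The batch macro loops over a control list of cities; this loop plays the same role.
cities = [("Sydney", "NSW"), ("Melbourne", "VIC"), ("Brisbane", "QLD")]
all_results = {}
for suburb, state in cities:
    all_results[suburb] = fetch_listings(suburb, state)
print(f"Downloaded results for {len(all_results)} cities")
```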

After downloading the data for all cities, I cleansed and filtered it and exported the results to Tableau. A small sketch of those steps follows.
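As a rough illustration of the cleansing, filtering, and export steps, here is a short pandas sketch. The record fields are hypothetical stand-ins; the real field names come from the API response.

```python
import pandas as pd

# Hypothetical flattened records for illustration; real field names come from the API response.
records = [
    {"city": "Sydney", "price": 1250000, "property_type": "House", "latitude": -33.87, "longitude": 151.21},
    {"city": "Melbourne", "price": None, "property_type": "Apartment", "latitude": -37.81, "longitude": 144.96},
]
df = pd.DataFrame(records)

# Cleansing: drop rows missing a price or coordinates.
df = df.dropna(subset=["price", "latitude", "longitude"])
# Filtering: keep only houses, for example.
df = df[df["property_type"] == "House"]

# Tableau can read a CSV (or a Hyper extract) directly.
df.to_csv("domain_listings_clean.csv", index=False)
```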

For the second part, I used Alteryx spatial tools. I built spatial points from the latitude and longitude fields with the Create Points tool, then used the Spatial Match tool to find properties around a specific location, as sketched below.
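The same idea can be sketched outside Alteryx: turn each property's latitude and longitude into a point, then keep the points within a chosen radius of a target location, much like a Spatial Match against a trade area. The coordinates and field names below are made up for illustration.

```python
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/long points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


# Hypothetical target point and property list for illustration.
target = (-33.8708, 151.2073)  # roughly Sydney CBD
properties = [
    {"address": "1 Example St", "lat": -33.8650, "lon": 151.2094},
    {"address": "2 Sample Rd", "lat": -37.8136, "lon": 144.9631},
]

# Keep properties within 5 km of the target, like a Spatial Match on a 5 km radius.
nearby = [p for p in properties
          if haversine_km(target[0], target[1], p["lat"], p["lon"]) <= 5.0]
print(nearby)
```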


My third week at the Data School was full of learning, practising what I had learned in Alteryx and, of course, fun.

Saqib Saeed