Preamble

Parsing API responses in Alteryx is a staple of the data
analytics process, and the tool that enables it, the Download Tool, is key.

If you’ve had experience working with APIs then you might have
used the Formula Tool to compose URLs before downloading them.

If you have, then this blog post is for you, because today I
will introduce you to a simpler method for achieving the same thing.

This is a technique I use every time I use the Download Tool in Alteryx.

The WordPress API

The API we will be using for this blog post is the WordPress
REST API, which is available for all WordPress websites by default. It provides the following endpoints for
accessing data about a website:

Resource Base Route
Posts /wp/v2/posts
Post Revisions /wp/v2/posts/<id>/revisions
Categories /wp/v2/categories
Tags /wp/v2/tags
Pages /wp/v2/pages
Page Revisions /wp/v2/pages/<id>/revisions
Comments /wp/v2/comments
Taxonomies /wp/v2/taxonomies
Media /wp/v2/media
Users /wp/v2/users
Post Types /wp/v2/types
Post Statuses /wp/v2/statuses
Settings /wp/v2/settings
Themes /wp/v2/themes
Search /wp/v2/search
Block Types /wp/v2/block-types
Blocks /wp/v2/blocks
Block Revisions /wp/v2/blocks/<id>/autosaves/
Block Renderer /wp/v2/block-renderer
Block Directory Items /wp/v2/block-directory/search
Plugins /wp/v2/plugins

This blog, which you’re reading now, is made using
WordPress, and is therefore accessible using the endpoints above; you just have to add https://thedataschool.com.au/wp-json/ to the
front of each endpoint first.
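Outside Alteryx, the same prepending step can be sketched in a few lines of Python (using just a subset of the routes from the table above):

```python
# Combine the site's base URL with the WP REST API routes above to get
# full request URLs. Note: no trailing slash on the base, because each
# route already starts with one.
base = "https://thedataschool.com.au/wp-json"

routes = {
    "Posts": "/wp/v2/posts",
    "Pages": "/wp/v2/pages",
    "Categories": "/wp/v2/categories",
}

full_urls = {name: base + route for name, route in routes.items()}

for name, url in full_urls.items():
    print(f"{name}: {url}")
```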

Querying the WP API Using Alteryx

As part of our training we’re taught how to interact with APIs
using Alteryx. We use the Download Tool to do this, and construct the request URL using the
Formula Tool.

The request URL we need to give the Download Tool should be formatted like this:

https://thedataschool.com.au/wp-json/wp/v2/pages?page=1&per_page=100&orderby=id&order=asc
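As an aside, Python’s standard library can take that URL apart, which makes the query-string structure explicit (this is just an illustration, not part of the Alteryx workflow):

```python
# Split the request URL into its path and query-string parameters using
# only the standard library.
from urllib.parse import urlsplit, parse_qs

url = ("https://thedataschool.com.au/wp-json/wp/v2/pages"
       "?page=1&per_page=100&orderby=id&order=asc")

parts = urlsplit(url)
params = parse_qs(parts.query)

print(parts.path)   # the endpoint path
print(params)       # each parameter mapped to its value(s)
```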

A simple workflow using this approach looks something like
this:

This is pretty simple when we’re working with a couple of
parameters (everything after the ? in the URL) and it’s good enough for most purposes. The formula we use to add
the parameters looks like this:

[endpoint]
+ "?page=" + ToString([page])
+ "&per_page=" + ToString([per_page])
+ "&orderby=" + [orderby]
+ "&order=" + [order]

However, there is a simpler approach. Adding
parameters manually by concatenating them is prone to errors unless you’re very careful, and it isn’t
easy to edit if you need to add or remove parameters.
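To see why, here is a Python sketch (not Alteryx) of the same concatenation next to a dict-based version built with the standard library’s urlencode; the dict version is the analogue of the simpler approach below:

```python
# Manual concatenation (like the Formula Tool expression above) versus
# letting a library compose the query string from a dict.
from urllib.parse import urlencode

endpoint = "https://thedataschool.com.au/wp-json/wp/v2/pages"
params = {"page": 1, "per_page": 100, "orderby": "id", "order": "asc"}

# Error-prone: every '?', '&', '=' and string conversion is on you.
manual = (endpoint + "?page=" + str(params["page"])
          + "&per_page=" + str(params["per_page"])
          + "&orderby=" + params["orderby"]
          + "&order=" + params["order"])

# Simpler: adding or removing a parameter is a one-line change to the dict.
composed = endpoint + "?" + urlencode(params)

print(composed)
```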

Here’s the approach I like to use, which I believe is simpler
and more flexible:


As you can see, there is no Formula Tool at all in this workflow, because it simply isn’t necessary. Instead, we use the input defined below, making sure that the
column names match the names of the parameters the API is expecting:

Record endpoint page per_page order orderby
1 https://thedataschool.com.au/wp-json/wp/v2/pages 1 100 asc id
2 https://thedataschool.com.au/wp-json/wp/v2/posts 1 100 asc id
3 https://thedataschool.com.au/wp-json/wp/v2/categories 1 100 asc id
4 https://thedataschool.com.au/wp-json/wp/v2/tags 1 100 asc id
5 https://thedataschool.com.au/wp-json/wp/v2/users 1 100 asc id

If you look at the Payload tab of the Download Tool, you will see an option to add “values from these fields” under the Compose Query String/Body section of the tool, as below:


By simply checking the boxes for the parameters you need, you
can entirely side-step the process of composing your query strings manually.
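A rough Python sketch of what this does conceptually (my approximation of the behaviour, not Alteryx’s actual implementation): for each input row, the checked columns are encoded into the query string and appended to the endpoint.

```python
# For each input row, compose the query string from the parameter
# columns and append it to the endpoint, mirroring the input table above.
from urllib.parse import urlencode

rows = [
    {"endpoint": "https://thedataschool.com.au/wp-json/wp/v2/pages",
     "page": 1, "per_page": 100, "order": "asc", "orderby": "id"},
    {"endpoint": "https://thedataschool.com.au/wp-json/wp/v2/posts",
     "page": 1, "per_page": 100, "order": "asc", "orderby": "id"},
]

urls = []
for row in rows:
    # Everything except the endpoint column becomes a query parameter.
    params = {k: v for k, v in row.items() if k != "endpoint"}
    urls.append(row["endpoint"] + "?" + urlencode(params))

for u in urls:
    print(u)
```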

Closing Thoughts

Simpler is often better, and your solutions should only be as
complex as you need them to be. If you can avoid the laborious process of concatenating parameters together, you
can focus on the actual parsing of the output.

I hope that you can use this technique in your workflows, and
that you spend the time saved on even better analysis!

Love,
Dan

 

 

Author: Daniel Lawson

Right off the bat I can tell you that I’m not your average data analyst. I’ve spent most of my career running my own business as a photographer and videographer, with a sprinkling of Web Development and SEO work as well. My approach to life and work is very T-shaped, in that I have a small set of specific skills complemented by a very broad range of interests; I like to think of myself as a dedicated non-specialist. Data Analytics, and Programming, started as a hobby that quickly grew into a passion. The more I learned the more I looked for opportunities to pull, manipulate, and join data from disparate sources in my life. I learned to interact with REST APIs for services I used, personal data from services I use like Spotify, and health data captured by my devices. I learned SQL to create and query databases, as well as analyse SQLite files containing my iMessages and Photos data on my Mac. Every technique I learned opened up more possibilities; now I’m hooked and there’s no turning back. Learn More About Me: https://danlsn.com.au