Use the QuickBooks API with Alteryx to import GL reports into Tableau


QuickBooks Online delivers a variety of great reports, and the tools it offers to developers are top notch. The reports themselves could use some more visual zazz… Seriously, I need the accounting data available in a database to serve to a BI tool and take the data set to the next stage, with scenarios such as time series, forecasts or various predictive models. Extracting that data and normalizing it is still quite a challenge. I will go through the steps of importing a GL report from a sandbox all the way to Tableau, using the QuickBooks API, Alteryx, Snowflake and Tableau. The logic still applies to alternative tools.

From:

To:

Clearing the authentication hurdle

  1. Go to https://developer.intuit.com/app/developer/homepage and create a free developer account using the SIGN UP button at the top right.
  2. If you don’t have an actual instance of QuickBooks to connect to, you can create a sandbox here: https://developer.intuit.com/app/developer/sandbox
  3. To understand how the API handles the GL report, go to the API explorer here and see how the report data is composed of 3 levels:
    1. Headers
    2. Rows
    3. Columns
  4. Note also that there are two distinct Base URLs depending on what you will use:
    1. Sandbox: https://sandbox-quickbooks.api.intuit.com
    2. Production: https://quickbooks.api.intuit.com
  5. Now let’s focus on API access for your data. The full documentation is here, but we can take some shortcuts this way:
    1. Create a new App at:
       https://developer.intuit.com/app/developer/dashboard
    2. Go to the excellent OAuth Playground:
      https://developer.intuit.com/app/developer/playground
      1. Select your App
      2. Check the Accounting scope and click Get Auth Code
      3. Authorize your app
      4. Click Get tokens at Step 2
      5. Collect your credentials:
        1. Realm ID (= Company ID) to uniquely identify the data of your QuickBooks company. The Realm ID is assigned to a company by Intuit when a QuickBooks Online user creates a company. That Realm ID will be included in the URL endpoint.
        2. Client ID
        3. Client Secret
        4. Refresh Token. Note that it expires in 101 days, so if you don’t run your workflow within those 101 days, you will need to return to the OAuth Playground to regenerate a Refresh Token
    3. Use those credentials in the QB connector
      1. Download Alteryx QB Connector from here. Documentation is here.
      2. Configure the connector with your credentials, highlighted is the Realm ID:

You need to check the first-time retrieving option to write the Refresh Token locally to your machine and obtain an access token that will be used to run the query; the Refresh Token will be replaced and overwritten by a new value, also written locally. You can uncheck the first-time option once you have successfully run the first query.
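For a sanity check outside Alteryx, the same token exchange can be sketched in a few lines of Python. This is a minimal sketch against Intuit’s OAuth 2.0 token endpoint; the function only builds the request (send it yourself with an HTTP client), and all credential values are placeholders:

```python
import base64

# Intuit's OAuth 2.0 token endpoint (shared by sandbox and production).
TOKEN_URL = "https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Build the POST request that trades a Refresh Token for an Access Token."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": TOKEN_URL,
        "headers": {
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
            "Accept": "application/json",
        },
        "data": {"grant_type": "refresh_token", "refresh_token": refresh_token},
    }
```

Sending this with, e.g., `requests.post(req["url"], headers=req["headers"], data=req["data"])` returns JSON carrying both a short-lived `access_token` and a new `refresh_token`; persist the new one, since the old one is invalidated. That is exactly the overwrite behavior the connector performs.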

Putting together the URL for the query

Whichever tool you end up using to extract the raw data out of the QB API, you will need to customize the proper URL to get what you need. Here is the structure:

  1. Base URL: as mentioned earlier, either sandbox-quickbooks.api.intuit.com or quickbooks.api.intuit.com for Production
  2. Realm ID: also called Company ID in some parts of the documentation
  3. Parameters: described here for the General Ledger report, they let you pick which columns to include, which time frame and, most important, which amount indicators, depending on the multi-currency settings of your QB instance. If you don’t see your Amount or Balance in the extract, you probably need to adjust those parameters to pick the other set of indicators. For instance, the amount indicator can be subt_nat_amount, subt_nat_home_amount, subt_nat_amount_nt or subt_nat_amount_home_nt depending on your instance, and the QB API will neither help you nor return an error message if you pick the wrong one… Trial & error is your friend…
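To make the structure concrete, here is a hedged Python sketch that assembles the General Ledger report URL from those three pieces. The column list shown is just one plausible combination; as noted above, the right amount indicator depends on your instance’s multi-currency settings:

```python
from urllib.parse import urlencode

SANDBOX_BASE = "https://sandbox-quickbooks.api.intuit.com"
PRODUCTION_BASE = "https://quickbooks.api.intuit.com"

def gl_report_url(realm_id, start_date, end_date,
                  columns="account_name,subt_nat_amount", base=SANDBOX_BASE):
    """Assemble the GL report URL: base URL + realm ID + query parameters."""
    params = {"start_date": start_date, "end_date": end_date, "columns": columns}
    return f"{base}/v3/company/{realm_id}/reports/GeneralLedger?{urlencode(params)}"
```

The resulting URL is what you would paste into the connector (or call with an Authorization: Bearer header carrying the access token).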

Raw API Output

The Alteryx Connector will output what looks like a simple JSON file, with only 2 columns and 7,050 rows in my case:

This is very misleading, as this is in fact a VERY nested JSON data structure comprising several tables, as the number of dot-separated segments in the key names illustrates:

We will use the Text to Columns tool to convert the content of the JSON into several tables, break that structure, and ultimately join them into a single flat table:
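For readers following along outside Alteryx, the same dot-path breakdown can be reproduced with a small recursive helper. This sketch flattens any nested JSON into dot-separated key/value pairs, mirroring the two-column output above:

```python
def flatten(obj, prefix=""):
    """Flatten nested JSON into dot-separated key/value pairs,
    one pair per leaf value, like the two-column JSON parse output."""
    rows = {}
    if isinstance(obj, dict):
        for key, val in obj.items():
            rows.update(flatten(val, f"{prefix}.{key}" if prefix else key))
    elif isinstance(obj, list):
        # List positions become numeric segments in the dot path.
        for i, val in enumerate(obj):
            rows.update(flatten(val, f"{prefix}.{i}" if prefix else str(i)))
    else:
        rows[prefix] = obj
    return rows
```

Splitting those dot paths on the separator is then the moral equivalent of the Text to Columns step.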

Flattening Strategy

Using filters, I will break down the data source into multiple tables, each reflecting a different level, starting with Report Headers, then Column Headers and then Rows:

The next step is to obtain a single row per record with multiple columns, which can be done simply with a Cross Tab tool:
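The Cross Tab step amounts to a simple pivot: group the long key/value output by record, then spread the field names into columns. A minimal stdlib Python equivalent, with illustrative field names:

```python
from collections import defaultdict

def cross_tab(triples):
    """Pivot (record_id, field, value) triples into one dict per record,
    i.e. one row per record with one column per field."""
    wide = defaultdict(dict)
    for record_id, field, value in triples:
        wide[record_id][field] = value
    return dict(wide)
```

Feeding it the parsed key/value pairs yields the single-row-per-record shape the rest of the workflow expects.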

However, looking further at the output of the Rows (6,997 records in this case), you will realize it contains more levels of columns. This will require further parsing of the headers from the rows, filtering, and rebuilding rows with columns, following this logic:

In my QB sandbox, the Chart of Accounts comprises 3 levels of accounts. I will therefore parse the rows up to 3 times before getting to the transactions with $ indicators. There could be more levels in other instances, so the parsing will have to be adjusted to the structure of the Chart of Accounts being extracted. Building that parsing logic can be quite tedious, but it can be repeated with some copy and paste and should be tested right away. I highly recommend using the Alteryx workflow cache feature to avoid hitting the QB API too often.
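Rather than copying the parsing logic once per account level, the same walk can be expressed recursively, which handles any depth of Chart of Accounts. This is a sketch based on the Section/Data row types the GL report JSON returns; treat the exact field names as assumptions to verify against your own extract:

```python
def walk_rows(rows, path=()):
    """Recursively walk the report's Rows hierarchy, yielding one flat
    record per 'Data' row with the full account path it sits under."""
    for row in rows.get("Row", []):
        if row.get("type") == "Section":
            # The section header holds the account name at this level.
            header = row.get("Header", {}).get("ColData", [{}])[0].get("value", "")
            yield from walk_rows(row.get("Rows", {}), path + (header,))
        elif row.get("type") == "Data":
            yield {"account_path": " > ".join(path),
                   "values": [col.get("value") for col in row.get("ColData", [])]}
```

Each yielded record is already flat, so the output can go straight into the join that builds the final table.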

Note that some tools like Precog can ease the challenge of flattening those JSON arrays, even within Alteryx with a dedicated macro. Here is a great explanation of the flattening process: https://precog.com/introduction-to-json-and-nosql-data/.

This entry was posted in Alteryx, Automation, Tableau.