Have you checked out our Syncing Flat-File Data to Transactions tutorial yet? If not, it may be helpful to read through before beginning this one.
App creators can leverage the Sync Data to Transactions Action to support a full data integration between an external system and Onit.
For example, if your organization’s LDAP system maintains a list of all employees, you may want to leverage this data to manage users inside of Onit. If your LDAP system can export information about each user into a delimited flat-file (e.g., a comma- or pipe-delimited file like the sample shown below), you could pass this file to Onit to ingest. When doing so, Onit could do the following:
- Create new users in Onit that don’t already exist.
- Update existing users. For example, a user’s email address and/or last name may have changed.
- Delete users from Onit that no longer exist in your LDAP system.
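For reference, a hypothetical comma-delimited export from an LDAP system might look like the following. The column names and values here are purely illustrative; your export's layout will depend on your own system:

```
EmployeeID,FirstName,LastName,Email,Department
10042,Jane,Doe,jane.doe@acme.com,Legal
10057,Raj,Patel,raj.patel@acme.com,Finance
10113,Ana,Garcia,ana.garcia@acme.com,IT
```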
This tutorial's example will walk through configuring the Sync Data to Transactions Action for an HR-based integration.
Note, however, that you'll need a way to automatically upload your flat-files to Onit. Typically, an SFTP job is configured for this; you should contact Onit for assistance in setting up an SFTP job if you want to use the Action for an integration.
Before We Start ...
This tutorial will assume you understand the following concepts:
Let's Get Started!
In this example, we'll walk through setting up the Sync Data to Transactions Action for an HR-based integration.
1. Create an orchestrator app for the Sync Data to Transactions Action
For an integration, you should set up a stand-alone orchestrator app, whose only job is to create a transaction to hold each flat-file it receives from the SFTP job and run the Sync Data to Transactions Action with that file.
We'll create an app and name it File Importer. We'll expect it to create, update and delete records in a separate User Profiles app.
A note on User Profiles apps:
A User Profiles App is commonly found in many Onit environments. It is configured to include a record for every user in your system, and each record tracks any properties you care about referencing or building workflow off of in other apps, such as approval authorities or departments.
For flat-file integrations, the app whose transactions your file will affect will need a property with a static value to use as the record's key. This will allow the Sync Data to Transactions Action to differentiate between existing records that it needs to update and brand new records that it needs to create. For an HR integration, employee IDs often make good keys, whereas email addresses make bad keys as they sometimes change (e.g., when a user’s last name changes).
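To make the key discussion concrete, here is a hypothetical pair of rows describing the same employee in two successive exports. The email address changes after a last-name change, but the EmployeeID stays constant, which is exactly what makes it usable as a key (column names are illustrative only):

```
EmployeeID,FirstName,LastName,Email
10042,Jane,Doe,jane.doe@acme.com          (earlier export)
10042,Jane,Smith,jane.smith@acme.com      (later export, after a last-name change)
```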
2. Add an Attachment Field to the orchestrator App
We need to provide our orchestrator app with an Attachment Field where the SFTP job will store each flat-file. We'll name this Attachment Field File.
You may be wondering: How do I get each new flat-file stored in the Attachment Field? This is where the SFTP job we mentioned above comes into play. Its job is to create a new Record in our File Importer App for each file it receives from the SFTP server and place the file in the Attachment Field we're creating here.
Reach out to Onit to learn how to set up an SFTP job.
In the meantime, you can either manually add files to your Records' Attachment Fields or use an API call to place each flat-file there. While explaining the specific API call is outside the scope of this tutorial, below is a sample call that would upload a local file (shown here as my_file.csv) into an Attachment Field named my_attachment_field:
curl -X PUT --header "Accept: application/json" -F "atom[p_my_attachment_field]=@my_file.csv" -F "_attachment_fields[]=atom[p_my_attachment_field]" -k https://acme.onit.com/api/atoms/59930731ed404f5dce00023s.json?auth_token=foo
This API call could be performed directly after a separate API call that creates a new Onit record. (You can find documentation on this type of call here.) These API calls could be combined into a scheduled ETL job or CI job, as sketched below.
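To illustrate how these calls might be combined, here is a minimal sketch of such a scheduled job as a shell script. It reuses the PUT call shape from the sample above and makes a few assumptions that are not part of this tutorial: the Record has already been created via the record-creation call referenced above, the orchestrator app's Attachment Field is named File (so its property is p_file), and the auth token, Record ID, and users.csv file name are placeholders you would replace with real values.

```
#!/bin/sh
# Hypothetical scheduled job: push the latest LDAP export into a File Importer Record.
# Assumptions (replace with real values): the Record was created beforehand via the
# record-creation API call, and the orchestrator app's Attachment Field is named File,
# so its property name is p_file.

AUTH_TOKEN="foo"                          # your API auth token
RECORD_ID="59930731ed404f5dce00023s"      # ID of the newly created File Importer Record
EXPORT_FILE="users.csv"                   # flat-file exported from your LDAP system

# Upload the flat-file into the Record's Attachment Field.
curl -X PUT \
  --header "Accept: application/json" \
  -F "atom[p_file]=@${EXPORT_FILE}" \
  -F "_attachment_fields[]=atom[p_file]" \
  -k "https://acme.onit.com/api/atoms/${RECORD_ID}.json?auth_token=${AUTH_TOKEN}"
```

Running a script like this on a schedule leaves you in the same place an SFTP job would: one Record per flat-file, with the file sitting in the Attachment Field.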
Regardless of how you get a file into this Attachment Field, you should expect each flat-file to have its own Record in your orchestrator App.
3. Configure the Sync Data to Transactions Action
Follow the steps from the Syncing Flat-File Data to Transactions tutorial to turn on the Action's Beta feature (Step 2), navigate to the new React Advanced Designer (Step 3), and create a new Sync Data to Transactions Action (Step 5).
You'll want to configure a few of the Action's properties differently for a data integration:
- Target App: This should be the App containing the Records that need to be updated. In our example, that's our User Profiles App.
- Change Phase on Completion: If you're using this Action for an integration with an SFTP job, you must configure the Action to change the transaction's phase on completion so that the server can process future files.
Once your Action is fully configured, make sure you hit Save.
4. Assign the Action to a Phase Change Business Rule
For integrations, your SFTP job will be responsible for creating a new transaction for each flat-file it receives from the SFTP server and changing the phase of these new transactions after creation. Therefore, you'll want to assign your Action to a Phase Change Business Rule.
Error Handling
You can check the Batch Processes grid (see Checking for Errors) to see if any of the rows from your flat-files are throwing errors. However, since data integrations are automated processes, it is preferable to be automatically notified of any errors when they occur. Liquid contains a few hooks that come in handy here.
To receive an email if errors occur, create a Condition for a Send Notification Action that references the batch_process_statuses property. Each batch process status provides the following Integer properties:
- total: The total number of rows processed.
- processed: The number of rows successfully processed.
- error_count: The number of rows that hit errors during processing.
Each batch process status also provides a details property containing one entry per error. The following variables are available on each entry:
- row_num: The row number that had an error.
- message: The message provided by the error.
For example, the following Condition finds the last batch processed by a given transaction (you can also find the first batch with .first), checks whether there are any errors, and outputs the row number and message of each error found:
{% assign last_status = atom.batch_process_statuses.last %} {% if last_status.error_count > 0 %}{% for detail in last_status.details %}{{ detail.row_num }}: {{ detail.message }} {% endfor %}{% endif %}
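If you also want the notification itself to summarize the batch, the same properties can be dropped into the notification's body. The snippet below is a minimal, illustrative sketch that assumes the batch_process_statuses structure described above:

```
{% comment %}Hypothetical notification body summarizing the most recent batch.{% endcomment %}
{% assign last_status = atom.batch_process_statuses.last %}
Processed {{ last_status.processed }} of {{ last_status.total }} rows ({{ last_status.error_count }} error(s)).
{% for detail in last_status.details %}
Row {{ detail.row_num }}: {{ detail.message }}
{% endfor %}
```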