
SAP Integration with Matillion Data Loader

The following article shows how to create a custom connector in Matillion Data Loader that loads SAP data via Xtract Universal into Snowflake. Matillion Data Loader is a cloud-based data loading platform that extracts data from popular sources and loads it into cloud destinations, see Official Website: Matillion Data Loader.

Prerequisites

Setup in Xtract Universal

  1. Create an extraction in Xtract Universal, see Getting Started: Create an Extraction.
    The depicted example scenario extracts the SAP table KNA1 (General Data in Customer Master).
    KNA1
  2. Assign the http-json destination to the extraction, see Documentation: Assign Destinations.

Create a Custom Connector in Matillion

To extract SAP data via Xtract Universal you must define a custom connector that contains the connection details of Xtract Universal, see Matillion Documentation: Matillion Custom Connector Overview.

  1. Open the website https://create-connector.matillion.com/ and log in to create the custom connector.
  2. Click [Add Connector] to create a new custom connector.
    matillion-add-connector
  3. Click to change the name of the connector.
  4. Copy the URL of the extraction into the designated input field and select GET as the http method. The URL has the following format:
    <Protocol>://<HOST or IP address>:<Port>/?name=<Name of the Extraction>{&<parameter_i>=<value_i>}
    Example: the URL https://6606-185-114-89-133.eu.ngrok.io/?name=kna1 calls the extraction "kna1" via ngrok. For more information about calling extractions via web services, see Web API.
    matillion-test-connector
  5. To test the connection, enter your authentication details and click [Send]. If the connection is successful, the http response contains the SAP customer data extracted by Xtract Universal.
  6. Click to edit the structure (names and data types) of the http response.
    The structure is used when loading data into your destination. This example scenario only extracts the KNA1 columns City_ORT01, Name 1_NAME1, Country Key_LAND1 and Customer Number_KUNNR.
    matillion-structure
  7. Optional: If your extraction uses parameters, open the Parameters tab and define the parameters.
    matillion-parameters
  8. Click [Save] to save the connector.

The custom connector can now be used in a Matillion Data Loader pipeline.
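
The URL format shown above can also be assembled programmatically, e.g., for testing the endpoint outside Matillion. A minimal sketch in Python; the host, port, and the extraction parameter "country" below are placeholder assumptions, not values from this article:

```python
# Sketch: build an Xtract Universal extraction URL in the format
# <Protocol>://<HOST or IP address>:<Port>/?name=<Name of the Extraction>{&<parameter_i>=<value_i>}.
from urllib.parse import urlencode

def extraction_url(host, port, name, protocol="https", **params):
    """Return the URL that triggers the named Xtract Universal extraction."""
    query = urlencode({"name": name, **params})
    return f"{protocol}://{host}:{port}/?{query}"

# Placeholder host/port and an assumed extraction parameter "country":
print(extraction_url("xu.example.com", 8165, "kna1", country="DE"))
# -> https://xu.example.com:8165/?name=kna1&country=DE
```

Sending a GET request to the resulting URL (with your authentication details) returns the same http response that Matillion receives in step 5.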

Note

The Matillion Custom Connector must be set to the same region as Matillion Data Loader, e.g., US (N. Virginia).
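
Step 6 above maps the http response onto named columns. As an illustration of what that selection does, assuming the http-json destination returns a JSON array of row objects keyed by the SAP field names (verify this against the actual response of your extraction), filtering down to the four example columns might look like:

```python
# Illustration only: reduce an http-json style response to the four example
# columns (KUNNR, NAME1, ORT01, LAND1). The response shape is an assumption;
# compare it with the actual output of your extraction.
import json

COLUMNS = ["KUNNR", "NAME1", "ORT01", "LAND1"]

def select_columns(body, columns=COLUMNS):
    rows = json.loads(body)
    return [{col: row.get(col) for col in columns} for row in rows]

sample = '[{"KUNNR": "0000001000", "NAME1": "Becker Berlin", "ORT01": "Berlin", "LAND1": "DE", "TELF1": "030-1234567"}]'
print(select_columns(sample))
```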

Create a Pipeline in Matillion Data Loader

Create a pipeline that triggers the extraction and writes the data to a destination, see Matillion Documentation: Create a pipeline with custom connectors.

  1. Open the Matillion Data Loader dashboard.
  2. Click [Add Pipeline] to create a new pipeline.
    matillion-pipelines
  3. Open the Custom Connectors tab and select the custom connector that contains the connection settings for Xtract Universal.
    matillion-source
  4. Select the endpoint that calls the Xtract Universal extraction and use the arrow buttons to add the endpoint to the list Endpoints to extract and load. Note that a custom connector can have multiple endpoints.
  5. Click [Continue with x endpoint].
    matillion-endpoints
  6. In the General tab enter a name for the target table under Data warehouse table name.
    matillion-configure-endpoints
  7. Open the Authentication tab and enter the authentication details for the Xtract Universal webservice.
  8. Open the Behaviour tab and select the elements you want to include as columns in the target table. By default, all elements are selected.
  9. Optional: If your endpoint uses parameters, open the Parameters tab to define the parameters.
  10. Open the Keys tab and select a key column that is used to match existing data and prevent duplicates, e.g., Customer Number_KUNNR.
  11. Click [Continue].
    matillion-configure-endpoints-key
  12. Select the destination to which the data is written, e.g., Snowflake. For more information on how to connect to Snowflake, see Matillion Documentation: Connect to Snowflake.
    matillion-destination
  13. Configure the destination, see Matillion Documentation: Configure Snowflake.
  14. Click [Continue].
  15. Enter a name for the pipeline.
    matillion-frequency
  16. Select the interval at which the pipeline is to be executed. The pipeline first runs after it is created and then continues at the specified frequency.
  17. Click [Create pipeline] to create and run the pipeline. The pipeline is now listed in your dashboard.
    matillion-pipeline-done
  18. Check if the data was successfully uploaded to the destination.
    matillion-matillion-snowflake

The pipeline now runs automatically at the specified frequency.
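
The key column selected in step 10 is what keeps repeated pipeline runs from duplicating rows: incoming rows replace existing rows that share the same key value. Matillion performs this matching inside the destination warehouse; the following standalone Python sketch only illustrates the idea, using Customer Number (KUNNR) as the key:

```python
# Sketch of key-based matching: new rows replace existing rows with the
# same key column value, so reruns do not create duplicates.
# Illustration only -- not Matillion's actual implementation.

def upsert(existing, incoming, key="KUNNR"):
    """Merge incoming rows into existing rows, matching on the key column."""
    table = {row[key]: row for row in existing}
    for row in incoming:
        table[row[key]] = row  # replace on key match, insert otherwise
    return list(table.values())

existing = [{"KUNNR": "1000", "ORT01": "Berlin"}]
incoming = [{"KUNNR": "1000", "ORT01": "Hamburg"},  # updated city
            {"KUNNR": "2000", "ORT01": "Munich"}]   # new customer
print(upsert(existing, incoming))
```

Without a key column, every run would append all extracted rows again; with it, a rerun only updates changed customers and adds new ones.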



Last update: October 20, 2024