
FAQ - Smart Connector

Create a Smart Connector with PostgreSQL Connection type

Currently, TapClicks pulls in data through direct integrations with a variety of data sources. Smart Connectors complement this by providing an option to bring in data using .csv, .tsv, .xlsx, and .ods file formats. Clients typically export data in these formats from various platforms and import the files using Smart Connectors, either through a manual upload or by using FTP / SFTP / Google Drive / Dropbox for data storage. Smart Connectors can be used either as a long-term solution or as a workaround until an integration with the required data source is created. Recently, TapClicks added an option to connect directly to SQL databases such as MySQL and PostgreSQL, as well as Salesforce via SOQL.

The PostgreSQL connection type for Smart Connectors enables a new source of data in the system. Data is fetched based on a predefined SQL query, working much like a pgAdmin SQL client but allowing only read operations. The requirements and functionality are similar to the other Smart Connector Connection Type / Delivery Type options.

To create a Smart Connector:
  1. Open TapClicks and, in the left pane, click Administrator.
  2. Click Smart Connector under Data Settings.
  3. Click the icon next to Smart Connector at the top of the screen.
  4. Enter a name and select PostgreSQL from the Delivery Type drop-down.
  5. After selecting the PostgreSQL option, you will be prompted to connect to the database using credentials and other necessary information. Please enter the information as given in the table below.
    PostgreSQL connection details
    • Host : IP address of the host on which the database is running / hosted
    • Port : Port number on which the database can be accessed (default port: 5432)
    • Database Name : Name of the database from which data is to be fetched
    • Username : Username used to access the database
    • Password : Password used to access the database
Note: Only PostgreSQL databases that allow external connections over TCP/IP can be connected. The server on which the database resides must allow external connections.
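A quick way to confirm these server-side prerequisites from any SQL client (such as pgAdmin) is sketched below. This is generic PostgreSQL guidance rather than a TapClicks-specific procedure, and the pg_hba.conf entry is only a placeholder:
    -- 'listen_addresses' must cover the interface external clients reach (for example '*'), not just 'localhost'
    SHOW listen_addresses;
    -- Confirm the port you will enter in the Smart Connector (default 5432)
    SHOW port;
    -- pg_hba.conf on the server must also allow the connecting IP range, for example:
    -- host    your_database    your_user    203.0.113.0/24    md5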
  6. Before creating the connection, check whether the database is accessible by clicking Test Connection. Once the connection is established, the check mark on the Test Connection button turns green.
Unverified Connection
Verified Connection (indicated by the green check mark next to the TEST CONNECTION button)

  7. If the connection is unsuccessful, an error message is displayed in the top-right corner of the screen. Some common things to check if this occurs are listed below (a privilege check you can run directly in PostgreSQL is sketched after the list):
  • Required fields: Host, Port, Database Name, Username, Password
  • Permissions on the host that allow the database to be accessed externally from other sources
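If the required fields are filled in but the connection still fails, it can help to confirm the account's access directly in PostgreSQL. This is a generic sketch; your_user and your_database are placeholders for your own values:
    -- Should return true for the account the Smart Connector will use
    SELECT has_database_privilege('your_user', 'your_database', 'CONNECT');
    -- If it returns false, a database administrator can grant access, for example:
    -- GRANT CONNECT ON DATABASE your_database TO your_user;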
  8. After you enter the connection information for the Smart Connector, fields to create a Query request become visible. You need to create a Query request to get the required data from the remote PostgreSQL database. Smart Connectors support both basic and advanced read queries using the statements below (an example follows the note under this list).
  • SELECT 
  • CALL (this command is permitted, but please inform your Customer Success Manager if you intend to use it, to ensure that the Smart Connector meets this advanced use case)

Note: Only columns selected in the query will be fetched by the Smart Connector. Therefore, users need to make sure they are including all necessary columns in the query.
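For example, a basic read query that names only the columns needed for reporting might look like the sketch below; the table and column names are placeholders, and '%START_DATE%' / '%END_DATE%' are the same date placeholders used in the example further down:
    SELECT campaign_id, campaign_name, impressions, clicks, spend
    FROM campaign_stats
    WHERE stat_date BETWEEN '%START_DATE%' AND '%END_DATE%'
    ORDER BY stat_date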

  9. The commands that are not supported are INSERT, UPDATE, and DELETE. To optimize memory, the system automatically paginates the data in batches of rows. Adding a LIMIT at the end of your query tells the system to use that many rows per batch. However, because the order in which rows are fetched from the database can vary from query to query, it is best to add an ORDER BY clause to your SQL statement to ensure data integrity.
    For example: 
    SELECT * FROM customer_value WHERE (effective_to_date BETWEEN '%START_DATE%' AND '%END_DATE%') OR 1=1 ORDER BY effective_to_date
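    The same query with an explicit batch size is sketched below; the LIMIT value of 5000 is only an illustration of setting how many rows are fetched per batch:
    SELECT * FROM customer_value WHERE (effective_to_date BETWEEN '%START_DATE%' AND '%END_DATE%') OR 1=1 ORDER BY effective_to_date LIMIT 5000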
  10. Data types of all columns are detected automatically. It is recommended that users check all the data types before saving the configuration to avoid incorrect data types. Please see the Field Type column in the image below:

    For example, auto-detection may mark a column as a Number based on the value in the first row of data, even though that is not how you intend the field to be used. An Id column, for instance, may be detected as a Number because it comes through as 3 in the sample data, but it should be stored as Text, since no mathematical operations such as Sum or Average are expected on an Id.
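    Assuming the detected type follows the values returned by the query (an assumption worth confirming with your Customer Success Manager), one way to force a column such as an Id to come through as text is to cast it in the SQL itself; the table and the other column names below are placeholders:
    SELECT id::text AS id, customer_name, lifetime_value FROM customer_value ORDER BY id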
  11. Frequency of fetch
    The Smart Connector performs a historic fetch and then daily fetches:
    • Full : data within the last 6 months is fetched
    • Partial : data within the last 3 days is fetched
    Note: This will depend on whether your Smart Connector is configured to fetch data based on Date or based on Last Modified Date.
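    If your Smart Connector fetches data based on Last Modified Date, filtering on a modification timestamp is one way to use the date placeholders from the earlier example; the last_modified_at column below is a placeholder for your own column, and the exact behaviour of the full and partial fetch windows is worth confirming with your Customer Success Manager:
    SELECT * FROM customer_value WHERE last_modified_at BETWEEN '%START_DATE%' AND '%END_DATE%' ORDER BY last_modified_at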
  12. Data Update
    If certain fields in the query need to be modified (unique fields, dates, and assigning configurations), the user needs to delete the data loaded to date through this Smart Connector before making the modifications. This can be done by clicking Delete, which is next to the Smart Connector name in the Smart Connector list.
    However, a user can include additional fields in the query, just as when adding a new field from a .csv file to a Smart Connector. This can be done without deleting the previously loaded data.
    Partial / full fetches apply to existing and new assignments when it comes to filling the database with these new fields. This functionality is in line with all other Connection Type / Delivery Type options in the system today.
    After creating the Smart Connector, users will have to map their Smart Connector data to clients in TapAnalytics and click Update Dashboard before the data starts flowing into the system.
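    As a sketch of adding a field without deleting previously loaded data, the query below simply appends one more column to the SELECT list; all table and column names are placeholders:
    -- Original query
    SELECT customer_id, effective_to_date, value FROM customer_value ORDER BY effective_to_date
    -- Updated query: 'region' is appended, and the previously loaded data can stay in place
    SELECT customer_id, effective_to_date, value, region FROM customer_value ORDER BY effective_to_date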