Data Export (Export Definition)

From Grooper Wiki
{{Migrated2023}}
{{AutoVersion}}
{{2023:{{PAGENAME}}}}
 
<blockquote>{{#lst:Glossary|Data Export}}</blockquote>
 
The '''Data Export''' is an [[Export Definition]] that enables you to map extracted data from Grooper documents to one or more tables in an external database using a configured {{IconName|Data Connection}} [[Data Connection]]. This Export Definition is added to an [[Export Behavior]] on a {{IconName|Content Model}} [[Content Model]] to automate the export of structured data to a database during the {{IconName|Export}} [[Export]] step in a {{IconName|Batch Process}} [[Batch Process]].
*<li class="fyi-bullet">Most typically Export Behaviors are added to '''Content Models''' but can be added to '''Content Categories''' and '''Document Types''' for more Document Type-specific export scenarios.
 
== Overview ==
 
Data Export provides a flexible and powerful way to send structured data from Grooper to relational databases. It supports:
 
* Exporting to multiple tables in a single operation
* Executing custom SQL statements for advanced scenarios
* Dynamically selecting alternate databases at runtime
* Mapping data from [[Data Model]]s, [[Data Section]]s, and [[Data Table]]s to database columns
 
This Export Definition is ideal for integrating Grooper with accounting systems, ERP platforms, document management systems, and other enterprise applications that rely on relational databases.
 
== How it works ==
 
When a Data Export is executed (typically as part of a Batch Process via an Export activity), it performs the following steps:
 
# Connects to the target database using the configured Data Connection
# (''Only if configured'') Evaluates the "Alternate Database" expression to determine the target database
# Executes each "Table Mapping" in sequence, exporting rows to the specified tables
# (''Only if configured'') Executes any SQL statements defined in the "Custom Statements" property
# (''Only if configured'') Saves data read back via "Return Values" to the document
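The sequence above can be sketched in code. The following is an illustrative Python sketch using sqlite3, not Grooper's actual implementation; the mapping structure and table names are hypothetical stand-ins for the configured Table Mappings:

```python
import sqlite3

def run_data_export(conn, doc_data, table_mappings, custom_statements=()):
    """Illustrative sketch of the Data Export sequence (not Grooper internals)."""
    # Step 3: execute each Table Mapping in the order listed
    for mapping in table_mappings:
        cols = mapping["columns"]
        sql = "INSERT INTO {} ({}) VALUES ({})".format(
            mapping["table"], ", ".join(cols), ", ".join("?" for _ in cols))
        # One INSERT per source instance (document, section, or table row)
        for row in mapping["rows"](doc_data):
            conn.execute(sql, [row[c] for c in cols])
    # Step 4: custom statements run only after all table mappings complete
    for stmt in custom_statements:
        conn.execute(stmt)
    conn.commit()

# Hypothetical usage: one header mapping producing one row per document
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoices (InvoiceNumber TEXT, Total REAL)")
doc = {"InvoiceNumber": "INV-1001", "Total": 99.5}
mappings = [{"table": "Invoices",
             "columns": ["InvoiceNumber", "Total"],
             "rows": lambda d: [d]}]
run_data_export(conn, doc, mappings)
```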
 
== Configuration basics ==
 
Data Export is primarily configured using the '''Connection''' and '''Table Mappings''' settings.
 
More advanced scenarios are accommodated by the '''Alternate Database''', '''Match Columns''' (and '''Match Action'''), and '''Custom Statements''' settings.
 
=== Primary configuration settings ===
 
These settings are required for all database export scenarios.
 
==== Connection ====
 
The "Connection" property specifies the Data Connection used for export. This determines the target database server, authentication method, and connection settings.
 
* Select a connection to an MS SQL Server, Oracle, MySQL, or other supported database
* The connection must be configured in the repository before assigning it to the Data Export
* If the connection is not set, export will not be performed
 
==== Table Mappings ====
 
The "Table Mappings" property defines the list of table export operations to perform. Each Table Mapping specifies:
 
* The source data element (from the document's Data Model) to export
* The target database table
* Column mappings between Grooper fields and database columns
* Optional duplicate detection and match actions
* Optional return values to read back after insert
 
===== Example =====
 
For an invoice document, you might configure two table mappings:
 
# Header Mapping: Maps invoice header fields ("Invoice Number", "Date", "Total", etc.) to an "Invoices" table
# Line Items Mapping: Maps the "Line Items" Data Table to an "InvoiceItems" table, exporting one row per line item
 
Table mappings are executed in the order listed. At least one table mapping or custom statement must be defined for export to occur.
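In effect, the two mappings above amount to one header INSERT per document and one INSERT per line item. This is a minimal sqlite3 sketch of that behavior; the table and column names are hypothetical, and the flat dictionary stands in for a document's extracted Data Model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoices (InvoiceNumber TEXT, InvoiceDate TEXT, Total REAL)")
conn.execute("CREATE TABLE InvoiceItems (InvoiceNumber TEXT, Description TEXT, Amount REAL)")

# Extracted document data: header fields plus a "Line Items" Data Table
doc = {
    "Invoice Number": "INV-1001", "Date": "2024-01-15", "Total": 250.0,
    "Line Items": [
        {"Description": "Widget", "Amount": 100.0},
        {"Description": "Gadget", "Amount": 150.0},
    ],
}

# Header Mapping: one row per document
conn.execute("INSERT INTO Invoices VALUES (?, ?, ?)",
             (doc["Invoice Number"], doc["Date"], doc["Total"]))

# Line Items Mapping: one row per Data Table row
for item in doc["Line Items"]:
    conn.execute("INSERT INTO InvoiceItems VALUES (?, ?, ?)",
                 (doc["Invoice Number"], item["Description"], item["Amount"]))
conn.commit()
```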
 
=== Advanced configuration settings ===
 
These settings are optional and allow for more advanced database export scenarios.
 
==== Match Columns and Match Actions ====
 
These settings are configured in the '''Table Mapping''' settings of a Data Export. '''Match Columns''' and '''Match Actions''' work together to control how Data Export handles duplicate records during the export process.
 
===== Match Columns =====
 
The "Match Columns" property specifies one or more database columns to check for existing records before inserting new data. When Match Columns are configured, the export process queries the target table to see if a row already exists with matching values.
 
'''How it works:'''
 
# Before exporting each row, a SELECT query is executed against the target table
# The query searches for existing rows where the specified Match Columns match the current record's values
# If a match is found, the "Match Action" determines what happens next
# If no match is found, a new row is inserted normally
 
'''Configuration:'''
 
* Select one or more columns from the target database table
* Multiple columns create a composite match key (combined using AND logic)
* The columns you select should uniquely identify records or represent your duplicate detection criteria
 
'''Example:'''
 
To prevent duplicate invoices from being inserted, set "Match Columns" to <code>InvoiceNumber</code>. The system will check if an invoice with that number already exists before inserting.
 
For a more complex example, you might match on multiple columns like <code>CustomerID</code> and <code>OrderDate</code> to identify duplicate orders.
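The duplicate check described above boils down to a SELECT whose WHERE clause ANDs the Match Columns together. A hypothetical sketch of the composite-key lookup (sqlite3 standing in for the target database):

```python
import sqlite3

def find_match(conn, table, match_columns, record):
    """Composite match key: all Match Columns must equal the record's values (AND logic)."""
    where = " AND ".join(f"{c} = ?" for c in match_columns)
    sql = f"SELECT COUNT(*) FROM {table} WHERE {where}"
    return conn.execute(sql, [record[c] for c in match_columns]).fetchone()[0] > 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (CustomerID TEXT, OrderDate TEXT, Total REAL)")
conn.execute("INSERT INTO Orders VALUES ('C-1', '2024-03-01', 50.0)")

# Same CustomerID AND OrderDate -> duplicate; different date -> new record
dup = find_match(conn, "Orders", ["CustomerID", "OrderDate"],
                 {"CustomerID": "C-1", "OrderDate": "2024-03-01"})
new = find_match(conn, "Orders", ["CustomerID", "OrderDate"],
                 {"CustomerID": "C-1", "OrderDate": "2024-03-02"})
```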
 
===== Match Action =====
 
The "Match Action" property determines what happens when a duplicate record is detected. This property is only visible when "Match Columns" are specified.
 
'''Available options:'''
 
; Skip
: No changes are made for matching records. Use this option to prevent overwriting existing data—only new records are inserted.
: '''Example:''' When exporting invoices, skip any that already exist in the database to avoid duplicates.
 
; Update
: The existing row is updated with new values from the current record. Only columns in "Column Mappings" (excluding Match Columns and Return Values) are updated.
: '''Example:''' When exporting customer information, update the address and phone number for existing customers while inserting new customers.
 
; Error
: An exception is raised and the export process is halted immediately. Use this option to enforce strict uniqueness and catch unexpected duplicates.
: '''Example:''' When exporting employee records, raise an error if a duplicate SSN is detected to prevent data quality issues.
 
===== Examples =====
 
'''Preventing duplicate invoices:'''
* '''Match Columns''': <code>InvoiceNumber</code>
* '''Match Action''': Skip
* '''Result''': Only new invoices are inserted; existing invoices are ignored
 
'''Updating customer information:'''
* '''Match Columns''': <code>CustomerID</code>
* '''Match Action''': Update
* '''Result''': Updates address, phone, and other fields for existing customers; inserts new customers
 
'''Enforcing uniqueness:'''
* '''Match Columns''': <code>SSN</code>, <code>DateOfBirth</code>
* '''Match Action''': Error
* '''Result''': Stops export if a person with the same SSN and date of birth already exists
 
 
*<li class="fyi-bullet">'''Note:''' When Match Columns are used, the "Rows Updated" statistic is tracked instead of "Rows Exported" to reflect the number of existing records processed.
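The three Match Actions can be sketched as a dispatch around the duplicate check. This is an illustrative sqlite3 sketch, not Grooper's implementation; the table and columns are hypothetical:

```python
import sqlite3

def export_row(conn, table, columns, match_columns, match_action, record):
    """Insert a record, applying the Match Action when a duplicate is found."""
    where = " AND ".join(f"{c} = ?" for c in match_columns)
    key = [record[c] for c in match_columns]
    exists = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {where}", key).fetchone()[0]
    if exists:
        if match_action == "Skip":    # leave the existing row untouched
            return "skipped"
        if match_action == "Error":   # halt the export with an exception
            raise ValueError(f"Duplicate found in {table}")
        if match_action == "Update":  # update non-key, non-return-value columns only
            updates = [c for c in columns if c not in match_columns]
            sets = ", ".join(f"{c} = ?" for c in updates)
            conn.execute(f"UPDATE {table} SET {sets} WHERE {where}",
                         [record[c] for c in updates] + key)
            return "updated"
    conn.execute("INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(columns), ", ".join("?" for _ in columns)),
        [record[c] for c in columns])
    return "inserted"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (CustomerID TEXT, Phone TEXT)")
cols, mcols = ["CustomerID", "Phone"], ["CustomerID"]
r1 = export_row(conn, "Customers", cols, mcols, "Update",
                {"CustomerID": "C-1", "Phone": "555-1111"})
r2 = export_row(conn, "Customers", cols, mcols, "Update",
                {"CustomerID": "C-1", "Phone": "555-2222"})
```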
 
==== Custom Statements ====
 
The "Custom Statements" property allows you to define custom SQL statements to execute during export. These are useful for:
 
* Inserting log entries after export
* Updating status flags or timestamps in related tables
* Calling stored procedures for downstream processing
* Performing calculations or transformations not supported by standard table mappings
 
Custom statements are executed '''after''' all table mappings have completed. They can reference data from the document's Data Model using expressions, allowing for dynamic SQL generation.
 
===== Example =====
 
After exporting an invoice, you might execute a custom statement to update a "LastProcessed" timestamp in a tracking table:
 
<pre>
UPDATE ProcessingLog
SET LastProcessed = GETDATE()
WHERE InvoiceNumber = '@InvoiceNumber'
</pre>
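The equivalent effect can be sketched with a bound parameter instead of inline substitution. This sketch assumes a hypothetical "ProcessingLog" table; sqlite's <code>datetime('now')</code> stands in for SQL Server's <code>GETDATE()</code>, and the Python variable stands in for the field value Grooper would substitute for <code>@InvoiceNumber</code>:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ProcessingLog (InvoiceNumber TEXT, LastProcessed TEXT)")
conn.execute("INSERT INTO ProcessingLog VALUES ('INV-1001', NULL)")

# Field value extracted from the document's Data Model
invoice_number = "INV-1001"

# Equivalent of the custom statement above, using a bound parameter
conn.execute("UPDATE ProcessingLog SET LastProcessed = datetime('now') "
             "WHERE InvoiceNumber = ?", (invoice_number,))
conn.commit()
last = conn.execute("SELECT LastProcessed FROM ProcessingLog "
                    "WHERE InvoiceNumber = ?", (invoice_number,)).fetchone()[0]
```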
 
==== Alternate Database ====
 
The "Alternate Database" property allows you to specify an expression that calculates the name of an alternate database to which data will be exported. This is useful for:
 
* Multi-tenant environments where each customer has a separate database
* Partitioned databases based on date, region, or other criteria
* Dynamic routing of exports based on document metadata
 
'''Important notes:'''
 
* This feature is only supported for MS SQL Data Connections
* The alternate database must have the same schema as the primary database
* The expression can reference fields from the document's Data Model
* If left blank, the default database from the Data Connection is used
 
===== Example =====
 
To route exports to a customer-specific database:
 
<pre>
"Customer_" & CustomerID
</pre>
 
This would export to a database named "Customer_12345" if the "CustomerID" field contains "12345".
 
== Validation and error handling ==
 
Data Export includes validation to ensure proper configuration:
 
* A Data Connection must be set
* At least one Table Mapping or Custom Statement must be defined
* If "Alternate Database" is set, the Data Connection must utilize an MS SQL database.
 
During export, errors such as connection failures, SQL errors, or constraint violations are logged and will halt the export for the affected document. The document may be flagged for review depending on your Batch Process configuration.
 
== Use cases ==
 
=== Exporting invoice data to an accounting system ===
 
Configure a Data Export on an "Invoice" Content Model with:
 
* '''Connection''': Set to your accounting database connection
* '''Table Mappings''':
** Map invoice header fields from Grooper to an "Invoices" database table
** Map the line items Data Table from Grooper to an "InvoiceLineItems" database table
* '''Return Values''': Configure the header mapping to read back an auto-generated "InvoiceID" column
 
=== Writing extracted form data to multiple related tables ===
 
For a loan application document:
 
* Map applicant information to a "Borrowers" table
* Map employment history to an "EmploymentHistory" table
* Map references to a "References" table
* Use "Match Columns" to prevent duplicate entries
 
=== Executing custom SQL after export ===
 
After exporting a document, execute a stored procedure to trigger downstream processing:
 
<pre>
EXEC ProcessNewDocument @DocumentID = '@DocumentID', @BatchID = '@BatchID'
</pre>
 
=== Dynamically routing exports to different databases ===
 
For a multi-tenant SaaS application, use the "Alternate Database" property to route each customer's data to their dedicated database:
 
<pre>
"TenantDB_" & TenantID
</pre>
 
== Best practices ==
 
* '''Use Table Mappings for standard scenarios''': For straightforward data export, configure Table Mappings rather than writing custom SQL
* '''Use Custom Statements for advanced logic''': Reserve custom statements for scenarios not supported by table mappings, such as calling stored procedures or updating multiple tables
* '''Test with small batches first''': Before deploying to production, test your Data Export configuration with a small test batch to ensure mappings and SQL are correct
* '''Monitor export logs''': Review diagnostic logs and error messages to troubleshoot connection issues, SQL errors, or data validation problems
* '''Ensure database schemas match''': If using "Alternate Database", verify that all target databases have identical schemas to avoid export failures
* '''Use Return Values for generated keys''': When exporting to tables with auto-generated identity columns, configure "Return Values" to capture and store the generated IDs for use in subsequent mappings or activities
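The last practice can be illustrated with a generated key: capture the identity value from the parent insert and reuse it in the child mapping to maintain referential integrity. In this hedged sketch, sqlite3's <code>lastrowid</code> stands in for a SQL Server identity column read back as a "Return Value"; the tables are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoices (InvoiceID INTEGER PRIMARY KEY, InvoiceNumber TEXT)")
conn.execute("CREATE TABLE InvoiceItems (InvoiceID INTEGER, Description TEXT)")

# Parent mapping: insert the header and read back the generated InvoiceID
cur = conn.execute("INSERT INTO Invoices (InvoiceNumber) VALUES (?)", ("INV-1001",))
invoice_id = cur.lastrowid   # plays the role of the "Return Value"

# Child mapping: use the captured key so each item references its parent row
for desc in ["Widget", "Gadget"]:
    conn.execute("INSERT INTO InvoiceItems VALUES (?, ?)", (invoice_id, desc))
conn.commit()
```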
 
== How to add a Data Export to an Export Behavior ==
 
This guide explains how to configure a '''Data Export''' within an '''Export Behavior''' on a Content Type, enabling automated export of extracted data to a database as part of your batch processing workflow.
 
=== Prerequisites ===
 
Before configuring a Data Export, ensure you have:
 
* A '''Content Type''' with a '''Data Model''' containing the fields, sections, or tables you want to export
**<li class="fyi-bullet">Most typically Export Behaviors are added to '''Content Models''' but can be added to '''Content Categories''' and '''Document Types''' for Document Type-specific export scenarios in larger Content Models with a wide variety of Document Types.
* A '''Data Connection''' configured and tested for your target database
* Appropriate database permissions to insert, update, or execute custom SQL statements
 
=== Step 1: Add an Export Behavior to your Content Type ===
 
# In the Grooper Repository's Node Tree, navigate to your Content Type
# If the Content Type does not already have an '''Export Behavior''', add one:
#* Right-click the Content Type and select '''Add Behavior'''
#* Choose '''Export Behavior''' from the list of available behaviors
#* Click '''OK'''
 
The Export Behavior will appear in the "Behaviors" collection in the Content Type's property grid.
 
=== Step 2: Add a Data Export definition ===
 
# Select the Content Type in the Node Tree.
# In the Property Grid, locate the '''Export Behavior''' section
# Open the "Export Definitions" editor (Click the '''...''' button)
# In the '''Object Collection Editor''' dialog that appears:
#* Click the '''Add''' dropdown button
#* Select '''Data Export''' from the list of available export types
#* Click '''OK'''
 
A new Data Export definition will be added to the collection.
 
=== Step 3: Configure the Data Connection ===
 
# With the Data Export definition selected in the Object Collection Editor, locate the "Connection" property
# Click the dropdown or ellipsis button to select your Data Connection
# Choose the connection that points to your target database
# Click '''OK''' to confirm
 
*<li class="fyi-bullet">Tip: If you haven't created a [[Data Connection]] yet, you'll need to add one to your Project first. Right-click the Project or any of its subfolders, add a new Data Connection, and configure the server, database, and authentication settings.
 
=== Step 4: Configure Table Mappings ===
 
Table Mappings define how data from your Data Model is exported to database tables.
 
# In the Data Export properties, open the "Table Mappings" editor (Click the '''...''' button)
# In the Table Mappings Editor:
#* Click '''Add''' to create a new Table Mapping
#* Configure the following properties for each mapping:
#** '''Source Scope''': Select the Data Model (or Data Section or Data Table from your Data Model) to export
#** '''Table Name''': Enter the name of the target database table
#** '''Column Mappings''': Map each field to a corresponding database column
#** '''Match Columns''' (optional): Define columns used to detect duplicates
#** '''Match Action''' (optional): Specify what to do when a duplicate is found (Skip, Update, or Error)
#** '''Return Values''' (optional): Specify columns whose values should be read back after insert (e.g., auto-generated IDs)
# Click '''OK''' when all mappings are configured
 
*<li class="fyi-bullet">Note: You can add multiple Table Mappings to export data to several tables. For example, export invoice headers to an "Invoices" table and line items to an "InvoiceLineItems" table.
 
===== Choosing the right Source Scope =====
 
The "Source Scope" property in a Table Mapping determines which Data Elements from your Data Model will be exported and how many rows will be created in the database. Selecting the correct scope is critical to achieving the desired export structure.
 
'''Available scope options:'''
 
; Data Model
: Exports '''one row per document''' to the database table. Choose this scope when exporting document-level information that appears once per document.
: '''Examples:''' Invoice header data (invoice number, date, vendor, total amount), loan application details, purchase order summary information.
 
; Data Section
: Exports '''one row per section instance''' to the database table. Choose this scope when exporting repeating sections that may appear multiple times within a document.
: '''Examples:''' Multiple addresses on a form, repeating claim details in an insurance document, multiple signatories on a contract.
 
; Data Table
: Exports '''one row per table row''' to the database table. Choose this scope when exporting line item data organized in a table structure.
: '''Examples:''' Invoice line items, purchase order details, transaction histories, itemized charges.
 
===== Decision guide =====
 
Ask yourself these questions to determine the appropriate scope:
 
'''Question 1: How many times does this data appear in a document?'''
* '''Once per document''' → Use '''Data Model'''
* '''Multiple instances, but not in a table format''' → Use '''Data Section'''
* '''Multiple rows in a table structure''' → Use '''Data Table'''
 
'''Question 2: What does each row in my database table represent?'''
* '''A document''' → Use '''Data Model'''
* '''A repeating section or entity within a document''' → Use '''Data Section'''
* '''A line item or table row''' → Use '''Data Table'''
 
===== Common scenarios =====
 
'''Scenario 1: Exporting invoice headers and line items'''
 
You have an invoice with header information and multiple line items. You need two [[Table Mapping]]s:
 
# '''Header Mapping'''
#* Source Scope: Data Model (the root invoice Data Model)
#* Target Table: "Invoices"
#* Result: One row per invoice document
# '''Line Items Mapping'''
#* Source Scope: LineItems (a Data Table in your Data Model)
#* Target Table: "InvoiceLineItems"
#* Result: Multiple rows per invoice, one for each line item
 
'''Scenario 2: Exporting loan applications with multiple borrowers'''
 
You have a loan application that can have one or more borrowers (co-applicants). You need two Table Mappings:
 
# '''Application Mapping'''
#* Source Scope: Data Model (the root loan application Data Model)
#* Target Table: "LoanApplications"
#* Result: One row per loan application document
# '''Borrower Mapping'''
#* Source Scope: Borrowers (a multi-instance Data Section in your Data Model)
#* Target Table: "Borrowers"
#* Result: One or more rows per application, one for each borrower section
 
'''Scenario 3: Exporting simple form data'''
 
You have a single-page form with customer information (name, address, phone, etc.) and no repeating elements. You need one Table Mapping:
 
# '''Customer Mapping'''
#* Source Scope: Data Model (the root customer Data Model)
#* Target Table: "Customers"
#* Result: One row per document
 
*<li class="fyi-bullet">Tip: When exporting to multiple related tables, use Return Values to capture auto-generated IDs from parent tables and include them in child table mappings to maintain referential integrity.
 
=== Step 5: Configure Custom Statements (Optional) ===
 
Use custom statements to execute additional SQL logic after table mappings are processed.
 
# In the Data Export properties, open the "Custom Statements" editor (Click the '''...''' button)
# In the Custom Statements Collection Editor:
#* Click '''Add''' to create a new Custom Statement
#* Enter your SQL statement in the "Statement" property
#* Use expressions to reference fields from your Data Model (e.g., <code>@InvoiceNumber</code>)
# Click '''OK''' when finished
 
'''Example custom statement:'''
 
<pre>
UPDATE ProcessingLog
SET LastProcessed = GETDATE()
WHERE InvoiceNumber = '@InvoiceNumber'
</pre>
 
=== Step 6: Configure Alternate Database (Optional) ===
 
If you need to dynamically route exports to different databases based on document data, configure the "Alternate Database" property.
*<li class="attn-bullet">Be aware: This feature is only supported for MS SQL Data Connections. The alternate database must have the same schema as the primary database.
 
# In the Data Export properties, locate the "Alternate Database" property
# Open the expression editor (Click the '''...''' button)
# Enter a code expression that evaluates to the target database name
# Click '''OK'''
 
'''Example expression:'''
 
<pre>
"Customer_" & CustomerID
</pre>
 
This would export to a database named "Customer_12345" if the "CustomerID" field contains "12345".
 
=== Step 7: Set the Link Name (Optional) ===
 
The "Link Name" property determines the label for the Content Link created after export.
 
# In the Data Export properties, locate the "Link Name" property
# Enter a descriptive name (e.g., "Database Export" or "ERP Export")
# If left blank, the default name "Export" will be used
 
=== Step 8: Save and test ===
 
# Click '''OK''' to close all dialogs and save your Data Export configuration
# Save the Content Type
# Test the export:
#* Create or open a Batch with documents assigned to this Content Type
#* Ensure the documents have been extracted and contain valid data
#* Run the Export activity (either manually or as part of a Batch Process)
#* Verify that data appears correctly in your target database tables
 
=== Troubleshooting ===
 
* '''Connection errors''': Verify that your Data Connection is configured correctly and that the database server is accessible
* '''SQL errors''': Check that table and column names are correct and that your database user has appropriate permissions
* '''No data exported''': Ensure that your Data Model has been extracted and that the mapped fields contain values
* '''Duplicate key violations''': Configure "Match Columns" and "Match Action" to handle duplicates appropriately
 
== See also ==
 
* [[Export Definition]]
* [[Export Behavior]]
* [[Data Connection]]
* [[Export]] activity

Latest revision as of 17:42, 6 February 2026

This article is about the current version of Grooper.

Note that some content may still need to be updated.

20252024 20232021

Data Export is an Export Definition available when configuring an Export Behavior. It exports extracted document data over a database Data Connection, allowing users to export data to a Microsoft SQL Server or ODBC compliant database.

The Data Export is an Export Definition that enables you to map extracted data from Grooper documents to one or more tables in an external database using a configured database Data Connection. This Export Definition is added to an Export Behavior on a stacks Content Model to automate the export of structured data to a database during the output Export step in a settings Batch Process.

  • Most typically Export Behaviors are added to Content Models but can be added to Content Categories and Document Types for more Document Type-specific export scenarios.

Overview

Data Export provides a flexible and powerful way to send structured data from Grooper to relational databases. It supports:

  • Exporting to multiple tables in a single operation
  • Executing custom SQL statements for advanced scenarios
  • Dynamically selecting alternate databases at runtime
  • Mapping data from Data Models, Data Sections, and Data Tables to database columns

This Export Definition is ideal for integrating Grooper with accounting systems, ERP platforms, document management systems, and other enterprise applications that rely on relational databases.

How it works

When a Data Export is executed (typically as part of a Batch Process via an Export activity), it performs the following steps:

  1. Connects to the target database using the configured Data Connection.
  2. (Only if configured) Evaluates the "Alternate Database" expression to determine the target database
  3. Executes each "Table Mapping" in sequence, exporting rows to the specified tables
  4. (Only if configured) Executes any "Custom Statements" defined in the "Custom Statements" property
  5. (Only if configured) If any Table Mapping has "Return Values" configured, saves the updated data back to the document

Configuration basics

Data Export is primarily configured using the Connection and Table Mappings settings.

More advanced scenarios are accommodated by the Alternate Databases, Match Columns (and Match Actions) and Custom Statements settings.

Primary configuration settings

These settings are required for all database export scenarios.

Connection

The "Connection" property specifies the Data Connection used for export. This determines the target database server, authentication method, and connection settings.

  • Select a connection to an MS SQL Server, Oracle, MySQL, or other supported database
  • The connection must be configured in the repository before assigning it to the Data Export
  • If the connection is not set, export will not be performed

Table Mappings

The "Table Mappings" property defines the list of table export operations to perform. Each Table Mapping specifies:

  • The source data element (from the document's Data Model) to export
  • The target database table
  • Column mappings between Grooper fields and database columns
  • Optional duplicate detection and match actions
  • Optional return values to read back after insert
Example

For an invoice document, you might configure two table mappings:

  1. Header Mapping: Maps invoice header fields ("Invoice Number", "Date", "Total", etc.) to an "Invoices" table
  2. Line Items Mapping: Maps the "Line Items" Data Table to an "InvoiceItems" table, exporting one row per line item

Table mappings are executed in the order listed. At least one table mapping or custom statement must be defined for export to occur.

Advanced configuration settings

These settings are optional and allow for more advanced database export scenarios.

Match Columns and Match Actions

These settings are configured in the Table Mapping settings of a Data Export. Match Columns and Match Actions work together to control how Data Export handles duplicate records during the export process.

Match Columns

The "Match Columns" property specifies one or more database columns to check for existing records before inserting new data. When Match Columns are configured, the export process queries the target table to see if a row already exists with matching values.

How it works:

  1. Before exporting each row, a SELECT query is executed against the target table
  2. The query searches for existing rows where the specified Match Columns match the current record's values
  3. If a match is found, the "Match Action" determines what happens next
  4. If no match is found, a new row is inserted normally

Configuration:

  • Select one or more columns from the target database table
  • Multiple columns create a composite match key (combined using AND logic)
  • The columns you select should uniquely identify records or represent your duplicate detection criteria

Example:

To prevent duplicate invoices from being inserted, set "Match Columns" to InvoiceNumber. The system will check if an invoice with that number already exists before inserting.

For a more complex example, you might match on multiple columns like CustomerID and OrderDate to identify duplicate orders.

Match Action

The "Match Action" property determines what happens when a duplicate record is detected. This property is only visible when "Match Columns" are specified.

Available options:

Skip
No changes are made for matching records. Use this option to prevent overwriting existing data—only new records are inserted.
Example: When exporting invoices, skip any that already exist in the database to avoid duplicates.
Update
The existing row is updated with new values from the current record. Only columns in "Column Mappings" (excluding Match Columns and Return Values) are updated.
Example: When exporting customer information, update the address and phone number for existing customers while inserting new customers.
Error
An exception is raised and the export process is halted immediately. Use this option to enforce strict uniqueness and catch unexpected duplicates.
Example: When exporting employee records, raise an error if a duplicate SSN is detected to prevent data quality issues.
Examples

Preventing duplicate invoices:

  • Match Columns: InvoiceNumber
  • Match Action: Skip
  • Result: Only new invoices are inserted; existing invoices are ignored

Updating customer information:

  • Match Columns: CustomerID
  • Match Action: Update
  • Result: Updates address, phone, and other fields for existing customers; inserts new customers

Enforcing uniqueness:

  • Match Columns: SSN, DateOfBirth
  • Match Action: Error
  • Result: Stops export if a person with the same SSN and date of birth already exists


  • Note: When Match Columns are used, the "Rows Updated" statistic is tracked instead of "Rows Exported" to reflect the number of existing records processed.

Custom Statements

The "Custom Statements" property allows you to define custom SQL statements to execute during export. These are useful for:

  • Inserting log entries after export
  • Updating status flags or timestamps in related tables
  • Calling stored procedures for downstream processing
  • Performing calculations or transformations not supported by standard table mappings

Custom statements are executed after all table mappings have completed. They can reference data from the document's Data Model using expressions, allowing for dynamic SQL generation.

Example

After exporting an invoice, you might execute a custom statement to update a "LastProcessed" timestamp in a tracking table:

UPDATE ProcessingLog 
SET LastProcessed = GETDATE() 
WHERE InvoiceNumber = '@InvoiceNumber'

Alternate Database

The "Alternate Database" property allows you to specify an expression that calculates the name of an alternate database to which data will be exported. This is useful for:

  • Multi-tenant environments where each customer has a separate database
  • Partitioned databases based on date, region, or other criteria
  • Dynamic routing of exports based on document metadata

Important notes:

  • This feature is only supported for MS SQL Data Connections
  • The alternate database must have the same schema as the primary database
  • The expression can reference fields from the document's Data Model
  • If left blank, the default database from the Data Connection is used
Example

To route exports to a customer-specific database:

"Customer_" & CustomerID

This would export to a database named "Customer_12345" if the "CustomerID" field contains "12345".

Validation and error handling

Data Export includes validation to ensure proper configuration:

  • A Data Connection must be set
  • At least one Table Mapping or Custom Statement must be defined
  • If "Alternate Database" is set, the Data Connection must utilize an MS SQL database.

During export, errors such as connection failures, SQL errors, or constraint violations are logged and will halt the export for the affected document. The document may be flagged for review depending on your Batch Process configuration.

Use cases

Exporting invoice data to an accounting system

Configure a Data Export on an "Invoice" Content Model with:

  • Connection: Set to your accounting database connection
  • Table Mappings:
    • Map invoice header fields from Grooper to an "Invoices" database table
    • Map line items Data Table from Grooper to an "InvoiceLineItems" databse table
  • Return Values: Configure the header mapping to read back an auto-generated "InvoiceID" column

Writing extracted form data to multiple related tables

For a loan application document:

  • Map applicant information to a "Borrowers" table
  • Map employment history to a "EmploymentHistory" table
  • Map references to a "References" table
  • Use "Match Columns" to prevent duplicate entries

Executing custom SQL after export

After exporting a document, execute a stored procedure to trigger downstream processing:

EXEC ProcessNewDocument @DocumentID = '@DocumentID', @BatchID = '@BatchID'
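
The stored procedure itself must already exist in the target database; Grooper only calls it. A minimal sketch of what such a procedure might look like (the body is purely illustrative):

```sql
CREATE PROCEDURE ProcessNewDocument
    @DocumentID NVARCHAR(50),
    @BatchID    NVARCHAR(50)
AS
BEGIN
    -- Queue the exported document for downstream processing (illustrative logic)
    INSERT INTO ProcessingQueue (DocumentID, BatchID, QueuedAt)
    VALUES (@DocumentID, @BatchID, GETDATE());
END
```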

Dynamically routing exports to different databases

For a multi-tenant SaaS application, use the "Alternate Database" property to route each customer's data to their dedicated database:

"TenantDB_" & TenantID

Best practices

  • Use Table Mappings for standard scenarios: For straightforward data export, configure Table Mappings rather than writing custom SQL
  • Use Custom Statements for advanced logic: Reserve custom statements for scenarios not supported by table mappings, such as calling stored procedures or updating multiple tables
  • Test with small batches first: Before deploying to production, test your Data Export configuration with a small test batch to ensure mappings and SQL are correct
  • Monitor export logs: Review diagnostic logs and error messages to troubleshoot connection issues, SQL errors, or data validation problems
  • Ensure database schemas match: If using "Alternate Database", verify that all target databases have identical schemas to avoid export failures
  • Use Return Values for generated keys: When exporting to tables with auto-generated identity columns, configure "Return Values" to capture and store the generated IDs for use in subsequent mappings or activities

How to add a Data Export to an Export Behavior

This guide explains how to configure a Data Export within an Export Behavior on a Content Type, enabling automated export of extracted data to a database as part of your batch processing workflow.

Prerequisites

Before configuring a Data Export, ensure you have:

  • A Content Type with a Data Model containing the fields, sections, or tables you want to export
    • Export Behaviors are most typically added to Content Models, but they can also be added to Content Categories and Document Types for more Document Type-specific export scenarios in larger Content Models with a wide variety of Document Types.
  • A Data Connection configured and tested for your target database
  • Appropriate database permissions to insert, update, or execute custom SQL statements

Step 1: Add an Export Behavior to your Content Type

  1. In the Grooper Repository's Node Tree, navigate to your Content Type
  2. If the Content Type does not already have an Export Behavior, add one:
    • Right-click the Content Type and select Add Behavior
    • Choose Export Behavior from the list of available behaviors
    • Click OK

The Export Behavior will appear in the "Behaviors" collection in the Content Type's property grid.

Step 2: Add a Data Export definition

  1. Select the Content Type in the Node Tree.
  2. In the Property Grid, locate the Export Behavior section
  3. Open the "Export Definitions" editor (Click the ... button)
  4. In the Object Collection Editor dialog that appears:
    • Click the Add dropdown button
    • Select Data Export from the list of available export types
    • Click OK

A new Data Export definition will be added to the collection.

Step 3: Configure the Data Connection

  1. With the Data Export definition selected in the Object Collection Editor, locate the "Connection" property
  2. Click the dropdown or ellipsis button to select your Data Connection
  3. Choose the connection that points to your target database
  4. Click OK to confirm
  • Tip: If you haven't created a Data Connection yet, you'll need to add one to your Project first. Right-click the Project or any of its subfolders, add a new Data Connection, and configure the server, database, and authentication settings.

Step 4: Configure Table Mappings

Table Mappings define how data from your Data Model is exported to database tables.

  1. In the Data Export properties, open the "Table Mappings" editor (Click the ... button)
  2. In the Table Mappings Editor:
    • Click Add to create a new Table Mapping
    • Configure the following properties for each mapping:
      • Source Scope: Select the Data Model (or Data Section or Data Table from your Data Model) to export
      • Table Name: Enter the name of the target database table
      • Column Mappings: Map each field to a corresponding database column
      • Match Columns (optional): Define columns used to detect duplicates
      • Match Action (optional): Specify what to do when a duplicate is found (Insert, Update, Skip, or Error)
      • Return Values (optional): Specify columns whose values should be read back after insert (e.g., auto-generated IDs)
  3. Click OK when all mappings are configured
  • Note: You can add multiple Table Mappings to export data to several tables. For example, export invoice headers to an "Invoices" table and line items to an "InvoiceLineItems" table.

Choosing the right Source Scope

The "Source Scope" property in a Table Mapping determines which Data Elements from your Data Model will be exported and how many rows will be created in the database. Selecting the correct scope is critical to achieving the desired export structure.

Available scope options:

  • Data Model: Exports one row per document to the database table. Choose this scope when exporting document-level information that appears once per document.
    • Examples: Invoice header data (invoice number, date, vendor, total amount), loan application details, purchase order summary information.
  • Data Section: Exports one row per section instance to the database table. Choose this scope when exporting repeating sections that may appear multiple times within a document.
    • Examples: Multiple addresses on a form, repeating claim details in an insurance document, multiple signatories on a contract.
  • Data Table: Exports one row per table row to the database table. Choose this scope when exporting line item data organized in a table structure.
    • Examples: Invoice line items, purchase order details, transaction histories, itemized charges.

Decision guide

Ask yourself these questions to determine the appropriate scope:

Question 1: How many times does this data appear in a document?

  • Once per document → Use Data Model
  • Multiple instances, but not in a table format → Use Data Section
  • Multiple rows in a table structure → Use Data Table

Question 2: What does each row in my database table represent?

  • A document → Use Data Model
  • A repeating section or entity within a document → Use Data Section
  • A line item or table row → Use Data Table
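
As a concrete illustration, a single invoice document with three line items would produce rows like the following under the Data Model and Data Table scopes (table and column names are assumptions):

```sql
-- Source Scope = Data Model: one row per document
INSERT INTO Invoices (InvoiceNumber, Vendor) VALUES ('INV-1001', 'Acme');

-- Source Scope = Data Table (LineItems): one row per table row
INSERT INTO InvoiceLineItems (InvoiceNumber, Description, Quantity) VALUES ('INV-1001', 'Widget', 10);
INSERT INTO InvoiceLineItems (InvoiceNumber, Description, Quantity) VALUES ('INV-1001', 'Gadget', 2);
INSERT INTO InvoiceLineItems (InvoiceNumber, Description, Quantity) VALUES ('INV-1001', 'Gizmo', 5);
```
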

Common scenarios

Scenario 1: Exporting invoice headers and line items

You have an invoice with header information and multiple line items. You need two Table Mappings:

  1. Header Mapping
    • Source Scope: Data Model (the root invoice Data Model)
    • Target Table: "Invoices"
    • Result: One row per invoice document
  2. Line Items Mapping
    • Source Scope: LineItems (a Data Table in your Data Model)
    • Target Table: "InvoiceLineItems"
    • Result: Multiple rows per invoice, one for each line item

Scenario 2: Exporting loan applications with multiple borrowers

You have a loan application that can have one or more borrowers (co-applicants). You need two Table Mappings:

  1. Application Mapping
    • Source Scope: Data Model (the root loan application Data Model)
    • Target Table: "LoanApplications"
    • Result: One row per loan application document
  2. Borrower Mapping
    • Source Scope: Borrowers (a multi-instance Data Section in your Data Model)
    • Target Table: "Borrowers"
    • Result: One or more rows per application, one for each borrower section
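
With the Borrower Mapping's Source Scope set to the Borrowers Data Section, a two-borrower application would conceptually produce inserts like these (table, column, and data values are illustrative):

```sql
-- One row per Data Section instance: two borrowers on one application
INSERT INTO Borrowers (ApplicationID, FullName) VALUES (7, 'Jane Doe');
INSERT INTO Borrowers (ApplicationID, FullName) VALUES (7, 'John Doe');
```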

Scenario 3: Exporting simple form data

You have a single-page form with customer information (name, address, phone, etc.) and no repeating elements. You need one Table Mapping:

  1. Customer Mapping
    • Source Scope: Data Model (the root customer Data Model)
    • Target Table: "Customers"
    • Result: One row per document
  • Tip: When exporting to multiple related tables, use Return Values to capture auto-generated IDs from parent tables and include them in child table mappings to maintain referential integrity.
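
The pattern described in the tip is conceptually equivalent to the following T-SQL sequence. This is a sketch of the idea, not Grooper's internal implementation, and the names and values are assumptions:

```sql
-- Parent insert: capture the generated key (what Return Values does conceptually)
INSERT INTO Invoices (InvoiceNumber, Vendor)
OUTPUT INSERTED.InvoiceID          -- read back the identity value
VALUES ('INV-1001', 'Acme');

-- Child inserts then reuse the captured key to maintain referential integrity
INSERT INTO InvoiceLineItems (InvoiceID, Description)
VALUES (42, 'Widget');             -- 42 = the InvoiceID returned above (illustrative)
```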

Step 5: Configure Custom Statements (Optional)

Use custom statements to execute additional SQL logic after table mappings are processed.

  1. In the Data Export properties, open the "Custom Statements" editor (Click the ... button)
  2. In the Custom Statements Collection Editor:
    • Click Add to create a new Custom Statement
    • Enter your SQL statement in the "Statement" property
    • Use expressions to reference fields from your Data Model (e.g., @InvoiceNumber)
  3. Click OK when finished

Example custom statement:

UPDATE ProcessingLog 
SET LastProcessed = GETDATE() 
WHERE InvoiceNumber = '@InvoiceNumber'
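
Grooper replaces the @InvoiceNumber token with the field's value before the statement is sent to the database. Assuming the InvoiceNumber field contains "INV-1001", the database would receive:

```sql
UPDATE ProcessingLog
SET LastProcessed = GETDATE()
WHERE InvoiceNumber = 'INV-1001'
```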

Step 6: Configure Alternate Database (Optional)

If you need to dynamically route exports to different databases based on document data, configure the "Alternate Database" property.

  • Be aware: This feature is only supported for MS SQL Data Connections. The alternate database must have the same schema as the primary database.
  1. In the Data Export properties, locate the "Alternate Database" property
  2. Open the expression editor (Click the ... button)
  3. Enter a code expression that evaluates to the target database name
  4. Click OK

Example expression:

"Customer_" & CustomerID

This would export to a database named "Customer_12345" if the "CustomerID" field contains "12345".

Step 7: Set the Link Name (Optional)

The "Link Name" property determines the label for the Content Link created after export.

  1. In the Data Export properties, locate the "Link Name" property
  2. Enter a descriptive name (e.g., "Database Export" or "ERP Export")
  3. If left blank, the default name "Export" will be used

Step 8: Save and test

  1. Click OK to close all dialogs and save your Data Export configuration
  2. Save the Content Type
  3. Test the export:
    • Create or open a Batch with documents assigned to this Content Type
    • Ensure the documents have been extracted and contain valid data
    • Run the Export activity (either manually or as part of a Batch Process)
    • Verify that data appears correctly in your target database tables
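
A quick way to verify the last step is to query the target tables directly, for example (using the illustrative table names from the scenarios above):

```sql
-- Inspect the most recently exported header rows (illustrative names)
SELECT TOP 10 * FROM Invoices ORDER BY InvoiceID DESC;

-- Confirm the related line items were written for a given invoice
SELECT * FROM InvoiceLineItems WHERE InvoiceNumber = 'INV-1001';
```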

Troubleshooting

  • Connection errors: Verify that your Data Connection is configured correctly and that the database server is accessible
  • SQL errors: Check that table and column names are correct and that your database user has appropriate permissions
  • No data exported: Ensure that your Data Model has been extracted and that the mapped fields contain values
  • Duplicate key violations: Configure "Match Columns" and "Match Action" to handle duplicates appropriately
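
For permission-related SQL errors, confirm the account used by the Data Connection has the needed rights in the target database. A sketch, assuming a database user named GrooperExportUser and the illustrative objects from the examples above:

```sql
-- Minimum rights a typical Data Export needs (user and object names are assumptions)
GRANT SELECT, INSERT, UPDATE ON dbo.Invoices TO GrooperExportUser;
GRANT SELECT, INSERT, UPDATE ON dbo.InvoiceLineItems TO GrooperExportUser;
GRANT EXECUTE ON dbo.ProcessNewDocument TO GrooperExportUser;
```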

See also