...

  • Execute and manage batch processes.
  • Roll back single entities if a batch process is interrupted.
  • Archive data to control database size.
  • Mark data associated with archived data for reference. 
  • Use import files to import hardware directly into a warehouse and to create rewards participants.
  • Export vouchers.
  • Customize imports and exports with Pentaho Kettles. Schedule imports and exports to run once or on a recurring basis.

...

Info

Foundation Application > Utilities > Archiving Utility > Perform Data Archiving Runs

Foundation Application > Utilities > Archiving Utility > Archived Data

Data used and generated by businesses increases exponentially, and insufficient storage space on database servers translates directly into slower performance. Consequently, the need to archive unused data (data created up to a certain date) keeps growing.

The archiving utility:

  • Compresses data that no longer has financial or marketing impact.
  • Saves the data in an archived structure in the database.
  • Updates the entities still available in the system that are associated with the archived data.
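
Conceptually, an archiving run selects rows entered up to a cutoff date, copies them into the archive structure, and removes them from the live tables. The following is a minimal sketch of that idea in Java over plain JDBC; the connection details and table names (events, events_archive) are hypothetical, and CRM.COM's actual archive schema, compression and entity updates are internal to the product.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.time.LocalDateTime;

public class ArchiveSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; the real utility runs inside CRM.COM.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/crm", "crm_user", "secret")) {
            con.setAutoCommit(false); // copy and delete in one transaction

            // Cutoff: data created up to this date is eligible for archiving.
            Timestamp cutoff = Timestamp.valueOf(LocalDateTime.now().minusMonths(12));

            // Copy eligible rows into the archive structure...
            try (PreparedStatement copy = con.prepareStatement(
                    "INSERT INTO events_archive SELECT * FROM events WHERE created_date <= ?")) {
                copy.setTimestamp(1, cutoff);
                copy.executeUpdate();
            }

            // ...then remove them from the live table.
            try (PreparedStatement purge = con.prepareStatement(
                    "DELETE FROM events WHERE created_date <= ?")) {
                purge.setTimestamp(1, cutoff);
                purge.executeUpdate();
            }

            con.commit();
        }
    }
}
```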

...

Designated system users can execute the archiving utility and delete archived data; both processes are password-protected.

Note

Non-archived data that refers to archived data (such as awarded events, spend requests, customer events and spend reward transactions) can be accessed without restrictions, but it cannot be modified, and no actions (other than deleting, where allowed) can be applied to it.

...

To archive data, a data archiving run definition must be set up.

  1. Navigate to Foundation > Utilities > Archiving Utility > Perform Data Archiving Runs and click on NEW from the Actions menu.
  2. Provide the mandatory information and criteria.
    1. Data Archiving Entity: Select an entity to archive from the drop-down menu.
    2. Archive Data Created X Months Ago: Identify the data to be archived by the date on which it was entered into the system. The minimum (and default) value for new definitions is 12 months (see the cutoff-date sketch after these steps).
  3. SAVE the configured definition and SUBMIT it to the scheduler to be executed.
  4. A list of previously executed runs is available under Process Runs; a run is created each time the definition is submitted.
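
The Archive Data Created X Months Ago criterion boils down to a cutoff-date calculation. A few lines of Java illustrate the arithmetic; this is a sketch of the concept only, not CRM.COM's internal implementation.

```java
import java.time.LocalDate;

public class ArchiveCutoff {
    public static void main(String[] args) {
        int monthsAgo = 12; // the minimum (and default) value for new definitions

        // Everything entered into the system on or before this date is eligible.
        LocalDate cutoff = LocalDate.now().minusMonths(monthsAgo);
        System.out.println("Archiving data created up to " + cutoff);
    }
}
```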

...

 

Info

Foundation > Utilities > Integrate Using Pentaho

A Kettle is a data integration tool implemented by Pentaho and embedded in CRM.COM.
The Pentaho utility can be used to create custom Kettles for your application.
Import and export Kettles can be scheduled to be executed once or on a recurring basis. 
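
For a sense of what executing a Kettle involves, below is a minimal sketch using the open-source Pentaho Data Integration (Kettle) Java API to run a transformation file; the .ktr file name is a placeholder, and within CRM.COM the embedded utility loads and runs Kettles for you.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunKettle {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle runtime (plugins, environment).
        KettleEnvironment.init();

        // Load a transformation definition (.ktr); the file name is a placeholder.
        TransMeta transMeta = new TransMeta("custom-import.ktr");

        // Execute the transformation and wait for it to finish.
        Trans trans = new Trans(transMeta);
        trans.execute(null); // no extra command-line arguments
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            throw new IllegalStateException("Kettle run finished with errors");
        }
    }
}
```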

 

Note

CRM.COM imports and exports are handled by Data Imports & Exports. Pentaho is only used for custom-made imports and exports.

Imports

Large data sets can be imported in a single run by using Kettles (Pentaho Data Integration), a third-party, open-source ETL (Extract, Transform, Load) tool. The data must be supplied in a file of a predefined type.

  1. Navigate to Integrate Using Pentaho.

  2. Choose the module you are interested in.
  3. Locate and click on the import Kettle to run. 
    1. Click on the Template link to download the template of the import file.
    2. Create and save the file.
    3. Click on the file.
      1. The import details will open in a new tab.
      2. Click on Choose File to browse and select the created file.
      3. Once the file is uploaded click on Next.
        1. Provide the required parameters (if any); each import Kettle requires different parameters (a parameterized run is sketched after these steps).
        2. Click on Start Import to execute the import.
        3. Click Close to return to the Main page.
  4. Once the Kettle is executed, results are available on the Main page.
    1. Click on View Results (next to the executed import) to display the instances on which the import was executed, the run's successful records and those with errors.
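
The per-Kettle parameters mentioned in the steps above are named values declared by the transformation itself. A sketch of supplying them through the Pentaho Data Integration Java API follows; the file and parameter names are assumptions, and in CRM.COM you enter the values on the import screen instead.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunImportWithParams {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Hypothetical transformation file for an import.
        TransMeta meta = new TransMeta("import-rewards-participants.ktr");
        Trans trans = new Trans(meta);

        // Named parameters declared by the transformation; the names here are assumptions.
        trans.setParameterValue("IMPORT_FILE", "participants.csv");
        trans.setParameterValue("BATCH_SIZE", "500");
        trans.activateParameters(); // make the values visible to the steps

        trans.execute(null);
        trans.waitUntilFinished();
        System.out.println("Errors: " + trans.getErrors());
    }
}
```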

 

Exports

Large data sets can be exported in a single run by using Kettles (Pentaho Data Integration), a third-party, open-source ETL (Extract, Transform, Load) tool. The data is exported to a file of a predefined type that is generated automatically by the system.

  1. Navigate to Integrate Using Pentaho.

  2. Choose the module you are interested in.
  3. Locate and click on the export Kettle to run.
  4. The export details will open in a new tab.
    1. Provide the required parameters (if any).  Each export Kettle requires different parameters. 
      1. Click on Start to execute the export immediately.
      2. Click Close to return to the Main page.
  5. Once the Kettle is executed, results are available on the Main page.
    1. Click on View Results (next to the executed export) to display the instances on which the export was executed, the run's successful records and those with errors.
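
The run outcome shown under View Results corresponds to what Pentaho reports when a transformation finishes. A sketch of reading that outcome through the Pentaho Data Integration Java API, with a placeholder .ktr file name:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class ExportResults {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Hypothetical transformation file for an export.
        TransMeta meta = new TransMeta("export-vouchers.ktr");
        Trans trans = new Trans(meta);
        trans.execute(null);
        trans.waitUntilFinished();

        // Aggregate outcome of the run: rows written and errors encountered.
        Result result = trans.getResult();
        System.out.println("Lines written: " + result.getNrLinesWritten());
        System.out.println("Errors: " + result.getNrErrors());
    }
}
```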

Note

Only records defined as 'To be logged' when creating the Kettle are displayed. If none were defined as 'To be logged', then there will be no information available under 'Successful' or 'Error' drill-downs.

 

Scheduling import and export Kettles

Kettle import and export runs can be configured to start automatically. The Integrate Using Pentaho screen is used to define the necessary information and trigger the process. 

Once a schedule is created and submitted, use Scheduling Settings to plan when and how often the Kettle should be executed.

  • Navigate to Integrate Using Pentaho and explore imports and exports.
  • Select the import or export you are interested in and go to the Data Entry page to enter the required information.
  • VIEW SCHEDULED Kettle runs or SCHEDULE a new one.
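
The recurrence behind a scheduled run amounts to executing the same Kettle at a fixed interval. A self-contained sketch of that pattern using standard Java scheduling follows; CRM.COM's scheduler manages this internally, so this is purely illustrative.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class KettleScheduler {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Recurring schedule: run immediately, then once every 24 hours.
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("Executing scheduled Kettle run..."), // stands in for the actual run
                0, 24, TimeUnit.HOURS);

        // A one-off schedule would use scheduler.schedule(task, delay, unit) instead.
    }
}
```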


 

...