Utilities

Overview

Utilities are the tools used to implement, maintain, or support business processes within the system. These include a scheduling engine for planning and executing CRM.COM processes in one flow, a data integration mechanism used for imports and exports, and an archiving utility that frees storage space occupied by unused data.

Major features

  • Execute and manage batch processes.
  • Roll back single entities if a batch process is interrupted.
  • Archive data to control database size.
  • Mark data associated with archived data for reference.
  • Use import files to import hardware directly into a warehouse and to create rewards participants.
  • Export vouchers.
  • Customize imports and exports with Pentaho Kettles.
  • Schedule imports and exports to run once or on a recurring basis.

Related configuration areas

Mandatory modules must be configured for the utilities module to work.

Manual Link: Platform
Area / Module: Scheduler Settings
Description: Used to define the server(s) that will handle batch processes sent to the scheduler. A default server and specific servers for handling particular processes can be defined.
Configuration: Mandatory

Using the Archiving Utility

Foundation Application > Utilities > Archiving Utility > Perform Data Archiving Runs

Foundation Application > Utilities > Archiving Utility > Archived Data

Data used and generated by businesses increases exponentially. Insufficient storage space on database servers is directly related to slower performance. Consequently, the need to archive unused data (data created up to a certain date) keeps growing.

The archiving utility:

  • Compresses data that no longer has financial or marketing impact.
  • Saves the data in an archived structure in the database.
  • Updates the existing entities still available in the system that are associated with the archived data. 

A record is created for every 'data archiving entity'.  The record is available for download and can be permanently deleted if no longer needed.  

Designated system users can execute the archiving utility and delete archived data; both processes are password protected.

Non-archived data that refers to archived data (such as awarded events, spend requests, customer events and spend reward transactions) can be accessed without restrictions but cannot be modified. Actions (other than deleting, where allowed) cannot be applied to data that refers to archived data.

The following data can be archived:   

  • Purchase customer events
  • Spend request customer events
  • Referral customer events
  • Social media customer events
  • Web customer events
  • Achievement customer events
  • Financial achievement customer events
  • Financial statement customer events
  • Provisioning requests
  • Process run logs
  • Rated billing items

As well as explicitly defined entities, associated records can also be archived, such as:

  • Awarded events
  • Spend reward transactions
  • Provisioning request parameters
  • Process run log entries
  • Applied additive discounts

Archiving data run definitions

To archive data, an archiving data run definition must be set up.

  1. Navigate to Foundation > Utilities > Archiving Utility > Perform Data Archiving Runs and click on NEW from the Actions menu.
  2. Provide the mandatory information and criteria.
    1. Data Archiving Entity: Select an entity to archive from the drop-down menu.
    2. Archive Data Created X Months Ago: Identify the data to be archived using the date on which it was entered into the system. The minimum (and default) value for new definitions is 12 months.
  3. SAVE the configured definition and SUBMIT it to the scheduler to be executed.
  4. A list of previously executed runs is available under Process Runs.
    A run is created each time the definition is submitted.
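The date criterion in step 2 can be thought of as a simple cutoff computation: any record created before "today minus X months" qualifies for archiving. The sketch below is illustrative only (the class and method names are not part of CRM.COM) and shows how the 'X months ago' value resolves to a date, including the 12-month minimum for new definitions:

```java
import java.time.LocalDate;

public class ArchiveCutoff {
    // Illustrative: the run archives data created before a cutoff of
    // "X months ago". New definitions enforce a minimum of 12 months.
    static LocalDate cutoff(LocalDate today, int monthsAgo) {
        if (monthsAgo < 12) {
            throw new IllegalArgumentException("minimum is 12 months");
        }
        return today.minusMonths(monthsAgo);
    }

    public static void main(String[] args) {
        // Data created on or before this date would be archived.
        System.out.println(cutoff(LocalDate.of(2020, 6, 15), 12)); // prints 2019-06-15
    }
}
```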

Viewing archived data

After the utility is executed, the archived data can be accessed, downloaded or deleted from a dedicated screen.

 

Downloading archived data

  1. Navigate to the archived data Summary page and search for the data you are interested in.
  2. Click on DOWNLOAD.
  3. Provide a password.
    A compressed file will be downloaded.

Deleting archived data

  1. Navigate to the archived data summary page and search for the data to delete.
  2. Check the select box on the left of the item to be deleted.
  3. Click on Delete from the Actions menu.
  4. Provide a password.
  5. Click on Delete.

 

Back to top

Using the Scheduler


Foundation > Utilities > Manage Scheduled Processes

 

The scheduler is used to program and execute system tasks that are processed in one flow.  Each task consists of a series of steps involving the simultaneous handling of multiple entries, without the need for manual intervention.

Each time a run definition is submitted, a task is created and sent to the scheduler for processing.  The scheduler settings defined in the run definition determine whether the task is executed once and removed from the scheduler or whether it is recurring (executed and next one scheduled).

The scheduler can be in one of two states:

  • 'Running': All 'Effective' tasks whose CRON expression matches the current time are executed.
  • 'Standby Mode': Scheduled tasks are not executed.

Each scheduled process is addressed by authorized application servers registered with the organization. If an application server is not defined for a specific process, then the organization's default application server is used.

Refer to Scheduler Settings for more information.
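The server-selection fallback described above amounts to a simple lookup with a default. The sketch below is illustrative only (the names are hypothetical, not CRM.COM APIs): a process-specific server is used when one is registered, otherwise the organization's default server handles the task.

```java
import java.util.Map;

public class ServerResolver {
    // Illustrative only: picks the application server registered for a
    // specific process, falling back to the organization's default server.
    static String resolve(Map<String, String> perProcessServers,
                          String process, String defaultServer) {
        return perProcessServers.getOrDefault(process, defaultServer);
    }

    public static void main(String[] args) {
        Map<String, String> servers = Map.of("archiving", "app-server-2");
        System.out.println(resolve(servers, "archiving", "app-server-1")); // prints app-server-2
        System.out.println(resolve(servers, "billing", "app-server-1"));   // prints app-server-1
    }
}
```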

 

  1. Navigate to the scheduler utility and explore scheduled tasks through the Summary page. 
  2. Click on an entry to access the task Data Entry page and DELETE a scheduled task or STOP a running one.
  3. Select one of the following options from the Actions menu:
    • VIEW SCHEDULED TASKS 
    • VIEW RUNNING TASKS 
    • VIEW COMPLETED TASKS
    • VIEW LOG: Click a day of the month to display tasks executed during specific days of the current month and their results.
    • START SCHEDULER: This action can be applied only if the current application server is enabled to run scheduled processes for the specific organization. More information is available at Scheduler Settings.
    • STOP SCHEDULER.

 

Task fields

The following information is available for each task on the Summary page. 

Scheduler Task

Task Number

Organisation: The database for which the task is scheduled (the scheduler of one application server can handle multiple organizations).

Class, Method: The name of the java class and method used to execute the task.   

Parameters: The parameters that will be processed by the scheduler. Multiple parameters can be specified using the '|' character as a separator. The default parameter type is 'String'; other types can be defined using the notation type@value.

Example of an Integer plus a Double parameter: java.lang.Integer@1|java.lang.Double@10.2
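A minimal sketch of how such a parameter string could be parsed (this is illustrative only, not CRM.COM's actual implementation): split on '|', treat tokens without a '@' as Strings, and convert typed tokens via the wrapper type's static valueOf(String) method.

```java
import java.util.ArrayList;
import java.util.List;

public class ParamParser {
    // Parse a scheduler parameter string such as
    // "java.lang.Integer@1|java.lang.Double@10.2".
    static List<Object> parse(String params) {
        List<Object> result = new ArrayList<>();
        for (String token : params.split("\\|")) {
            int at = token.indexOf('@');
            if (at < 0) {
                result.add(token); // no type prefix: default to String
            } else {
                String type = token.substring(0, at);
                String value = token.substring(at + 1);
                try {
                    // each java.lang wrapper type exposes a static valueOf(String)
                    result.add(Class.forName(type)
                            .getMethod("valueOf", String.class)
                            .invoke(null, value));
                } catch (ReflectiveOperationException e) {
                    throw new IllegalArgumentException("bad parameter: " + token, e);
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(parse("java.lang.Integer@1|java.lang.Double@10.2")); // prints [1, 10.2]
    }
}
```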

Application Server: The name of the application server whose scheduler is being used.  

CRON Expression: Defines when the task should be performed.

State: Can be 'Draft' or 'Effective'.
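As an illustration, CRON expressions in the Quartz style commonly used by Java schedulers (an assumption here; consult Scheduler Settings for the exact syntax supported) consist of six fields, seconds through day-of-week:

```
# seconds minutes hours day-of-month month day-of-week
0 0 3 * * ?       run every day at 03:00
0 */15 * * * ?    run every 15 minutes
0 0 0 1 * ?       run at midnight on the first day of each month
```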

 

Deleting a scheduled task

A task can be deleted so that it is not executed. 

  1. Navigate to the scheduler utility. 
  2. Click on the task to go to its Data Entry page.
  3. Click on DELETE from the Top Menu.
  4. The scheduled task is deleted permanently and removed from the list.

Stopping a running task

A task that is being processed can be stopped once the current step is completed. 

  1. Navigate to the scheduler utility.
  2. Click on the task to go to its Data Entry page.
  3. Click on STOP.
    1. Provide a reason for stopping the task.
  4. The following information is kept on the associated log entry of the scheduler task:
    1. The user that stopped the task.
    2. The date and time that the task was stopped.
    3. The reason for stopping the task. 
  • When a task is stopped, changes on records whose processing was incomplete are reverted, but changes on processed records are not.
  • A log is available on the server and can be provided by a system administrator.

Back to top 


Using Data Imports & Exports


 

 

Foundation > Utilities > Import Data

Foundation > Utilities > Export Data

CRM.COM supports the rapid import and export of data. 

Imports use files (for which templates are available) to import data from third-party systems into CRM.COM, such as hardware imported directly into a warehouse or newly created rewards participants. Refer to the respective import processes for details.

Exports format CRM.COM data as a structured file for use in third-party systems. 

Refer to Export Vouchers for details.

Back to top 

Using Pentaho Imports & Exports


 

Foundation > Utilities > Integrate Using Pentaho

A Kettle is a data integration tool implemented by Pentaho and embedded in CRM.COM.
The Pentaho utility can be used to create custom Kettles for your application.
Import and export Kettles can be scheduled to be executed once or on a recurring basis. 

 

CRM.COM imports and exports are handled by Data Imports & Exports. Pentaho is only used for custom-made imports and exports.

Imports

Large data sets can be imported simultaneously by using Kettles (Pentaho Data Integration), a third-party open-source ETL (Extract, Transform, Load) tool. The data is imported in a predefined file type.

  1. Navigate to Integrate Using Pentaho.

  2. Choose the module you are interested in.
  3. Locate and click on the import Kettle to run. 
    1. Click on the Template link to download the template of the import file.
    2. Create and save the file.
    3. Click on the file.
      1. The import details will open in a new tab.
      2. Click on Choose File to browse and select the created file.
      3. Once the file is uploaded click on Next.
        1. Provide the required parameters (if any).  Each import Kettle requires different parameters. 
        2. Click on Start Import to execute the import.
        3. Click to Close the page and go back to the Main page.
  4. Once the Kettle is executed, results are available on the Main page.
    1. Click on View Results (next to the executed import) to display the instances on which the import was executed, the run's successful records and those with errors.

 

Exports

Large data sets can be exported simultaneously by using Kettles (Pentaho Data Integration), a third-party open-source ETL (Extract, Transform, Load) tool. The data is exported in a predefined file type that is automatically generated by the system.

  1. Navigate to Integrate Using Pentaho

  2. Choose the module you are interested in.
  3. Locate and click on the export Kettle to run.
  4. The export details will open in a new tab.
    1. Provide the required parameters (if any).  Each export Kettle requires different parameters. 
      1. Click on Start to execute the export immediately.
      2. Click to Close the page and go back to the Main page.
  5. Once the Kettle is executed, results are available on the Main page.
    1. Click on View Results (next to the executed export) to display the instances on which the export was executed, the run's successful records and those with errors.

Only records defined as 'To be logged' when creating the Kettle are displayed. If none were defined as 'To be logged', then there will be no information available under 'Successful' or 'Error' drill-downs.

 

Scheduling import and export Kettles

Kettle import and export runs can be configured to start automatically. The Integrate Using Pentaho screen is used to define the necessary information and trigger the process. 

Once a schedule is created and submitted, use Scheduler Settings to plan when and how often the Kettle should be executed.

  • Navigate to Integrate Using Pentaho and explore imports and exports.
  • Select the import or export you are interested in and go to the Data Entry page to enter the required information.
  • VIEW SCHEDULED Kettle runs or SCHEDULE a new one.

Back to top 

 

Utilities Business Examples

Archiving purchases

Archiving old data

Scenario 1

Company ZX wants to save space in its database through an annual cleanup process. Purchase data logged in the system that is more than 6 months old should be compressed and removed from the database.


Solution

Configuration

An archiving definition run must be set as follows:

Data Archiving Entity: Purchase customer events
Archive Data Created X Months Ago: 6

User Process

Authorized personnel (a super user) should access the run definition and submit the process to the scheduler each time the cleaning process must be executed. The archiving procedure will be password protected to ensure only authorized users have access to it.


Back to top 
