
Anchor
top
top

Excerpt
hiddentrue

Learn to work with Utilities

Panel
nameblue

On this page

Table of Contents
maxLevel2

Overview

Utilities are the tools that are used to implement, maintain or support business processes within the system.  These tools include a scheduling engine responsible for planning and executing CRM.COM processes in one flow, a data integration mechanism used for data imports and exports, and an archiving utility for saving storage space occupied by unused data.

Major features

  • Execute and manage batch process tasks, such as stopping or deleting them.
  • Roll back single entities if a batch process is interrupted. For example, if billing is interrupted after 1,000 accounts have been billed, the completed accounts are not rolled back; only changes to an account whose billing was still incomplete are reverted (see the sketch after this list).
  • Archive data to control database size.
  • Mark data associated with archived data for reference.
  • Use import files to import hardware directly into a warehouse and to create rewards participants.
  • Export vouchers.
  • Customize imports and exports with Pentaho kettles.
  • Schedule imports and exports to run once in the future or on a recurring basis.
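
The single-entity rollback behaviour can be pictured as one transaction per processed record: an interruption only reverts the record that was still in progress. The sketch below is a minimal illustration of that idea; the names (BatchRunSketch, TransactionManager, BillingRecord) are assumptions for the example and not CRM.COM code.

Code Block
languagejava

import java.util.List;

public class BatchRunSketch {

    // Hypothetical stand-ins for the real entities and transaction API.
    interface TransactionManager { void begin(); void commit(); void rollback(); }
    interface BillingRecord { void bill(); }

    static void run(List<BillingRecord> records, TransactionManager tx) {
        for (BillingRecord record : records) {
            tx.begin();
            try {
                record.bill();                  // all changes for this single record
                tx.commit();                    // completed records stay committed
            } catch (RuntimeException interrupted) {
                tx.rollback();                  // only the incomplete record is reverted
                throw interrupted;
            }
        }
    }
}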

...

Related configuration areas

Mandatory modules must be configured for the utilities module to work.

Manual link: Scheduler Settings
Area: Platform
Description: Used to define the server(s) that will handle batch processes sent to the scheduler. A default server and specific servers for handling particular processes can be defined.
Configuration: Mandatory

...

Using the Archiving Utility

Info

Foundation Application > Utilities > Archiving Utility > Perform Data Archiving Runs

Foundation Application > Utilities > Archiving Utility > Archived Data

Data used and generated by businesses increases exponentially.  Insufficient storage space on database servers is directly related to slower performance.  Consequently, the need to archive unused data (data created up to a certain date) is expanding.

The archiving utility:

  • Compresses data that no longer has financial or marketing impact.
  • Saves the data in an archived structure in the database.
  • Updates the existing entities still available in the system that are associated with the archived data.

A record is created for every 'data archiving entity'.  The record is available for download and can be permanently deleted if no longer needed.

Designated system users can execute the archiving utility and delete archived data; the processes are password protected.

Note

Non-archived data that refers to archived data (such as awarded events, spend requests, customer events and spend reward transactions) can still be accessed without restrictions but cannot be modified. Actions (other than deleting, where allowed) cannot be applied to data that refers to archived data.

The following data can be archived:   

  • Purchase customer events
  • Spend request customer events
  • Referral customer events
  • Social media customer events
  • Web customer events
  • Achievement customer events
  • Financial achievement customer events
  • Financial statement customer events
  • Provisioning requests
  • Process run logs
  • Rated billing items

Associated entity records can also be archived even if not explicitly defined, when they are associated with an archived record, such as:

  • Awarded events
  • Spend reward transactions
  • Provisioning request parameters
  • Process run log entity
  • Applied additive discounts

Archiving data run definitions

To archive data, an archiving data run definition must be set up.

Image Added

  1. Navigate to Foundation > Utilities > Archiving Utility > Perform Data Archiving Runs and click on NEW from the Actions menu.
  2. Provide all the mandatory information and criteria.
    1. Data Archiving Entity: Select an entity to archive from the drop-down menu.
    2. Archive Data Created X Months Ago: Identify the data to be archived using the date on which it was entered into the system. The minimum (and default) value for new definitions is 12 months (see the sketch after this list for how the cutoff date could be derived).
  3. SAVE the configured definition and SUBMIT it to the scheduler to be executed.
  4. A list of the previously executed runs is available under Process Runs.
    A run is created each time the definition is submitted.
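
As an illustration of the 'Archive Data Created X Months Ago' criterion, the configured number of months can be subtracted from the current date, and records created on or before the resulting cutoff date qualify for archiving. The class and method names below are illustrative assumptions, not the utility's actual code.

Code Block
languagejava

import java.time.LocalDate;

public class ArchiveCutoffSketch {

    // Records created on or before the returned date qualify for archiving.
    static LocalDate cutoff(LocalDate today, int archiveDataCreatedMonthsAgo) {
        return today.minusMonths(archiveDataCreatedMonthsAgo);
    }

    public static void main(String[] args) {
        // With the minimum/default value of 12 months, anything created on or
        // before the printed date would be picked up by the archiving run.
        System.out.println(cutoff(LocalDate.now(), 12));
    }
}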

Anchor
viewing_not_arch_data
viewing_not_arch_data
Viewing archived data

After the utility is executed, the archived data can be accessed from a dedicated screen from which it can be downloaded or deleted.

Image Added

Anchor
downloading_arch_data
downloading_arch_data
Downloading archived data

  1. Navigate to the archived data Summary page and search for the data you are interested in.
  2. Click on DOWNLOAD.
  3. Provide a password.
    A compressed archived file will be downloaded.

Anchor
deleting_arch_data
deleting_arch_data
Deleting archived data

  1. Navigate to the archived data summary page and search for the data to delete.
  2. Check the select box on the left of the item to be deleted.
  3. Click on Delete from the Actions menu.
  4. Provide a password.
    The archived data will be deleted.

 

Back to top

Using the Scheduler

...

Info

Foundation > Utilities > Manage Scheduled Processes

 

The scheduler is used to program and execute system tasks that are processed in one flow.  Each task consists of a series of steps involving the simultaneous handling of multiple entries, without the need for manual intervention.

Each time a run definition is submitted, a task is created and sent to the scheduler for processing. The scheduler settings defined in the run definition determine whether the task is executed once and removed from the scheduler (one-off), or whether it is recurring (executed and the next one scheduled).

The scheduler can be in one of two states:

  • 'Running': All 'Effective' tasks that have a CRON expression equal to the current time of the scheduler are executed. 
  • 'Standby Mode': Scheduled tasks are not executed.
Note

Each scheduled process is addressed by authorized application servers registered with the organization. If an application server is not defined for a specific process, the organization's default application server is used.

Refer to Scheduler Settings for more information.

 

  1. Navigate to the scheduler utility and explore scheduled tasks through the Summary page. 
  2. Click on an entry to access the task Data Entry page and DELETE a scheduled task or STOP a running one.
  3. Select one of the following options from the Actions menu:
    • VIEW SCHEDULED TASKS: Displays tasks which are scheduled for the organisation. 
    • VIEW RUNNING TASKS: Displays tasks which are already running for the organisation. 
    • VIEW COMPLETED TASKS: Displays completed tasks. 
    • VIEW LOG: Displays the scheduler log.  Click a day of the current month to display the tasks executed during that day and their results.
    • START SCHEDULER: Starts the scheduler.  This action can be applied only if the current application server is enabled to run scheduled processes for the specific organization. More information is available at Scheduler Settings.
    • STOP SCHEDULER: Stops the scheduler.  

Image Added

 

Task fields

The following information is available for each task on the Summary page. 

Scheduler Task

Task Number

Organisation: The database for which the task is scheduled (the scheduler of one application server can handle multiple organizations).

Class & , Method: the The name of the java class and specific method used to execute the task 

Parameters: The parameters that will be processed by the scheduler. Multiple parameters can be specified by using the character '|' as separator. The default parameter type is 'String'. Parameter types can be defined by using the notation type@value.

Example of Integer plus Double parameters: java.lang.Integer@1|java.lang.Double@10.2
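
The sketch below shows how such a parameter string could be interpreted, using only the '|' separator and type@value notation described above. The parser itself (class name, supported types) is an assumption made for the example and not the scheduler's actual code.

Code Block
languagejava

import java.util.ArrayList;
import java.util.List;

public class SchedulerParamsSketch {

    // Parses e.g. "java.lang.Integer@1|java.lang.Double@10.2" into typed values.
    // Entries without a type default to String, as described above.
    static List<Object> parse(String params) {
        List<Object> values = new ArrayList<>();
        for (String part : params.split("\\|")) {
            int at = part.indexOf('@');
            if (at < 0) {
                values.add(part);                     // default type: String
            } else {
                String type = part.substring(0, at);
                String raw = part.substring(at + 1);
                if ("java.lang.Integer".equals(type)) {
                    values.add(Integer.valueOf(raw));
                } else if ("java.lang.Double".equals(type)) {
                    values.add(Double.valueOf(raw));
                } else {
                    values.add(raw);                  // treat anything else as String in this sketch
                }
            }
        }
        return values;
    }

    public static void main(String[] args) {
        System.out.println(parse("java.lang.Integer@1|java.lang.Double@10.2"));  // [1, 10.2]
    }
}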

Application Server: The name of the application server whose scheduler is being used.  

CRON Expression: Defines when the task should be performed.
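
For illustration, and assuming a Quartz-style expression format (seconds, minutes, hours, day of month, month, day of week) — an assumption, since the exact syntax is determined by the scheduler itself:

Code Block

0 0 2 * * ?      executes every day at 02:00
0 0/30 * * * ?   executes every 30 minutes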

State: Can be 'Draft' or 'Effective'.

 

Anchor
delete
delete
Deleting a scheduled task

A task that has not yet been executed can be deleted so that it is not executed.

  1. Navigate to the scheduler utility. 
  2. Click on the task to go to its Data Entry page.
  3. Click on DELETE from the Top Menu.
  4. The scheduled task is permanently deleted and removed from the list.

Anchor
stop
stop
Stopping a running task

A task that is being processed can be stopped once the current step is completed. 

  1. Navigate to the scheduler utility.
  2. Click on the task to go to its Data Entry page.
  3. Click on STOP.
    1. Provide a reason for stopping the task.
  4. The following information is kept on the associated log entry of the scheduler task:
    1. The user that stopped the task.
    2. The date and time that the task was stopped.
    3. The reason for stopping the running task. 
Note
  • When a task is stopped, changes on records whose processing was incomplete are reverted, but changes on processed records are not.
  • A log is available on the server and can be provided by a system administrator.

Back to top 


Anchor
batch
batch
Using Data Imports & Exports

...

 

Info

Foundation > Utilities > Import Data

Foundation > Utilities > Export Data

CRM.COM supports the rapid import and export of data. 

Imports use files (for which templates are available) to import data from third-party systems into CRM.COM.

Exports format CRM.COM data as a structured file and export it for use in third-party systems.

Refer to Export Vouchers for details.

Back to top 

Using Pentaho Imports & Exports


 

Info

Foundation > Utilities > Integrate Using Pentaho

A Kettle is a data integration tool implemented by Pentaho and embedded in CRM.COM. The Pentaho utility can be used to create custom Kettles for your application. Import and export Kettles can be scheduled to be executed once or on a recurring basis. 

 

Note

CRM.COM imports and exports are handled by Data Imports &amp; Exports. Pentaho is only used for custom-made imports and exports.

Imports

Large data sets can be imported simultaneously by using Kettles (Pentaho Data Integration), a third-party open source ETL (Extract, Transform, Load) tool.  The data is imported in a predefined file type.

  1. Navigate to Integrate Using Pentaho.

  2. Choose the module you are interested in.
  3. Locate and click on the import Kettle to run. 
    1. Click on the Template link to download the template of the import file.
    2. Create and save the file.
    3. Click on the file.
      1. The import details will open in a new tab.
      2. Click on Choose File to browse and select the created file.
      3. Once the file is uploaded click on Next.
        1. Provide the required parameters (if any).  Each import Kettle requires different parameters. 
        2. Click on Start Import to execute the import.
        3. Click to Close the page and go back to the Main page.
  4. Once the Kettle is executed, results are made available on the Main page.
    1. Click on View Results (next to the executed import) to display the instances on which the import was executed, the run's successful records and those with errors.

 

Exports

Large data sets can be exported simultaneously by using Kettles (Pentaho Data Integration), a third-party open source ETL (Extract, Transform, Load) tool.  The data is exported in a predefined file type that is automatically generated by the system.

  1. Navigate to Integrate Using Pentaho

  2. Choose the module you are interested in.
  3. Locate and click on the export Kettle to run.
  4. The export details will open in a new tab.
    1. Provide the required parameters (if any).  Each export Kettle requires different parameters. 
      1. Click on Start to execute the export immediately.
      2. Click to Close the page and go back to the Main page.
  5. Once the Kettle is executed, results are made available on the Main page.
    1. Click on View Results (next to the executed export) to display the instances on which the export was executed, the run's successful records and those with errors.
Note

Only records defined as 'To be logged' when creating the Kettle are displayed. If none were defined as 'To be logged', no information will be available under the 'Successful' or 'Error' drill-downs.

 

Anchor
schedule
schedule
Scheduling import and export Kettles

Import and export runs can be configured to start automatically.  The Integrate Using Pentaho screen is used to define the necessary information and trigger the process.

Once a schedule is created and submitted, use Scheduling Settings to plan when and how often the Kettle should be executed.

  • Navigate to the Integrate Using Pentaho screen and explore imports and exports.
  • Select the import or export you are interested in and go to the Data Entry page to enter the required information.
  • VIEW SCHEDULED Kettle runs or SCHEDULE a new one.

Image Added

Back to top 

 

Utilities Business Examples

Archiving of purchases

Panel
nameblue
titleArchiving old data

Scenario 1

Company ZX wants to save space in its database through an annual cleanup process.  Data logged in the system regarding purchases that is more than 6 months old should be compressed and removed from the database. 


Solution

Configuration

An archiving data run definition must be set as follows:

Data Archiving Entity: Purchase customer events
Archive Data Created X Months Ago: 6

User Process

Authorized personnel (a super user) should access the run definition and submit the process to the scheduler each time that the cleaning process must be executed. The archiving procedure will be password protected to ensure that only authorized users have access to it.

Note
titleNotes

 

...