Learn to work with Utilities
Overview
Utilities - The Basics
...
Utilities - The Admin Stuff
...
Utilities are the tools used to implement, maintain or support business processes within the system. These include a scheduling engine for planning and executing CRM.COM processes in one flow, a data integration mechanism used for imports and exports, and an archiving utility for saving storage space occupied by unused data.
Major features
- Execute and manage batch processes.
- Roll back single entities if a batch process is interrupted.
- Archive data to control database size.
- Mark data associated with archived data for reference.
- Use import files to import hardware directly into a warehouse and to create rewards participants.
- Export vouchers.
- Customize imports and exports with Pentaho Kettles.
- Schedule imports and exports to run once or on a recurring basis.
Related configuration areas
The modules listed below must be configured for the Utilities module to work.
Manual link | Area | Module Description | Configuration |
---|---|---|---|
Platform | Scheduler Settings | Used to define the server(s) that will handle batch processes sent to the scheduler. A default server and specific servers for handling particular processes can be defined. | Mandatory |
Using the Archiving Utility
Info |
---|
Foundation Application > Utilities > Archiving Utility > Perform Data Archiving Runs
Foundation Application > Utilities > Archiving Utility > Archived Data |
The data used and generated by businesses grows exponentially, and insufficient storage space on database servers translates directly into slower performance. Consequently, the need to archive unused data (data created up to a certain date) keeps growing.
The archiving utility:
- Compresses data that no longer has financial or marketing impact.
- Saves the data in an archived structure in the database.
- Updates the existing entities still available in the system that are associated with the archived data.
A record is created for every 'data archiving entity'. The record is available for download and can be permanently deleted if no longer needed.
Designated system users can execute the archive utility and delete archived data; the processes are password protected.
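The archive format itself is internal to CRM.COM, but the flow described above (compress records that are past their financial or marketing relevance, store the compressed payload in a dedicated archive structure, and flag the live rows that refer to it) can be pictured with a minimal sketch like the one below. All class, field and value names here are hypothetical illustrations, not CRM.COM APIs.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.zip.GZIPOutputStream;

// Hypothetical illustration of the archiving flow described above;
// CRM.COM's real archive structure and entity names are internal.
public class ArchivingSketch {

    // Compress a batch of serialised records into a single archive payload.
    static byte[] compress(List<String> serialisedRecords) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            for (String record : serialisedRecords) {
                gzip.write(record.getBytes(StandardCharsets.UTF_8));
                gzip.write('\n');
            }
        }
        return buffer.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        List<String> purchases = List.of(
                "{\"event\":\"PURCHASE\",\"number\":\"PC000001\"}",
                "{\"event\":\"PURCHASE\",\"number\":\"PC000002\"}");

        byte[] archive = compress(purchases);

        // In the real utility this payload would be saved in the archive
        // structure, and the associated live entities (awarded events,
        // spend reward transactions, ...) would be marked as referring to
        // archived data so that they can no longer be modified.
        System.out.printf("Archived %d records into %d compressed bytes%n",
                purchases.size(), archive.length);
    }
}
```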
Note |
---|
Non-archived data that refers to archived data (such as awarded events, spend requests, customer events and spend reward transactions) can be accessed without restrictions but cannot be modified. Actions (other than deleting, where allowed) cannot be applied to data that refers to archived data. |
The following data can be archived:
- Purchase customer events
- Spend request customer events
- Referral customer events
- Social media customer events
- Web customer events
- Achievement customer events
- Financial achievement customer events
- Financial statement customer events
- Provisioning requests
- Process run logs
- Rated billing items
In addition to the explicitly defined entities, associated records can also be archived, such as:
- Awarded events
- Spend reward transactions
- Provisioning request parameters
- Process run log entity
- Applied additive discounts
Archiving data run definitions
To archive data, an archiving data run definition must be set up.
- Navigate to Foundation > Utilities > Archiving Utility > Perform Data Archiving Runs and click on NEW from the Actions menu.
- Provide the mandatory information and criteria.
- Data Archiving Entity: Select an entity to archive from the drop-down menu.
- Archive Data Created X Months Ago: Identify the data to be archived using the date on which it was entered into the system. The minimum (and default) value for new definitions is 12 months.
- SAVE the configured definition and SUBMIT it to the scheduler to be executed.
- A list of previously executed runs is available under Process Runs.
A run is created each time the definition is submitted.
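The only numeric criterion in the definition is 'Archive Data Created X Months Ago', so the set of eligible records is simply everything created before a cutoff date derived from that value. Below is a minimal sketch of that eligibility check, assuming the 12-month minimum described above; the class and method names are illustrative, not part of CRM.COM.

```java
import java.time.LocalDate;

public class ArchiveCutoff {

    // "Archive Data Created X Months Ago": records created before the cutoff
    // date are eligible for archiving. The minimum (and default) value for
    // new definitions is 12 months, as described in the steps above.
    static LocalDate cutoffDate(int monthsAgo) {
        if (monthsAgo < 12) {
            throw new IllegalArgumentException("Minimum value is 12 months");
        }
        return LocalDate.now().minusMonths(monthsAgo);
    }

    static boolean eligibleForArchiving(LocalDate createdDate, int monthsAgo) {
        return createdDate.isBefore(cutoffDate(monthsAgo));
    }

    public static void main(String[] args) {
        LocalDate created = LocalDate.now().minusMonths(18);
        System.out.println(eligibleForArchiving(created, 12)); // prints: true
    }
}
```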
After the utility is executed, the archived data can be accessed, downloaded or deleted from a dedicated screen.
Downloading archived data
- Navigate to the archived data Summary page and search for the data you are interested in.
- Click on DOWNLOAD.
- Provide a password.
A compressed file will be downloaded.
Deleting archived data
- Navigate to the archived data summary page and search for the data to delete.
- Check the select box on the left of the item to be deleted.
- Click on Delete from the Actions menu.
- Provide a password.
- Click on Delete.
Using the Scheduler
...
Info |
---|
Foundation > Utilities > Manage Scheduled Processes |
The scheduler is used to program and execute system tasks that are processed in one flow. Each task consists of a series of steps involving the simultaneous handling of multiple entries, without the need for manual intervention.
Each time a run definition is submitted, a task is created and sent to the scheduler for processing. The scheduler settings defined in the run definition determine whether the task is executed once and removed from the scheduler, or whether it is recurring (executed and the next run scheduled).
The scheduler can be in one of two states:
- 'Running': All 'Effective' tasks whose CRON expression matches the current time are executed.
- 'Standby Mode': Scheduled tasks are not executed.
Note |
---|
Each scheduled process is handled by an authorized application server registered with the organization. If an application server is not defined for a specific process, then the organization's default application server is used. Refer to Scheduler Settings for more information. |
- Navigate to the scheduler utility and explore scheduled tasks through the Summary page.
- Click on an entry to access the task Data Entry page and DELETE a scheduled task or STOP a running one.
- Select one of the following options from the Actions menu:
- VIEW SCHEDULED TASKS
- VIEW RUNNING TASKS
- VIEW COMPLETED TASKS
- VIEW LOG: Click on a day of the month to display the tasks executed on that day of the current month and their results.
- START SCHEDULER: This action can be applied only if the current application server is enabled to run scheduled processes for the specific organization. More information is available at Scheduler Settings.
- STOP SCHEDULER.
Task fields
The following information is available for each task on the Summary page.
Scheduler Task
- Task Number
- Organisation: The database for which the task is scheduled (the scheduler of one application server can handle multiple organizations).
- Class, Method: The name of the Java class and method used to execute the task.
- Parameters: The parameters that will be processed by the scheduler. Multiple parameters can be specified using the '|' character as a separator. The default parameter type is 'String'; other parameter types can be defined using the notation type@value. Example of an Integer plus a Double parameter: java.lang.Integer@1|java.lang.Double@10.2
- Application Server: The name of the application server whose scheduler is being used.
- CRON Expression: Defines when the task should be performed.
- State: Can be 'Draft' or 'Effective'.
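The parameter notation above (type@value pairs separated by '|', defaulting to String when no type is given) can be unpacked with a few lines of Java. The sketch below only illustrates how such a string can be interpreted; the class and method names are mine, not CRM.COM code.

```java
import java.util.ArrayList;
import java.util.List;

public class SchedulerParams {

    // Parse "java.lang.Integer@1|java.lang.Double@10.2" style parameters:
    // '|' separates parameters, '@' separates the (optional) type from the
    // value, and the type defaults to java.lang.String when omitted.
    static List<Object> parse(String raw) throws ReflectiveOperationException {
        List<Object> values = new ArrayList<>();
        for (String token : raw.split("\\|")) {
            String type = "java.lang.String";
            String value = token;
            int at = token.indexOf('@');
            if (at > 0) {
                type = token.substring(0, at);
                value = token.substring(at + 1);
            }
            // Wrapper types such as Integer and Double expose a
            // String-argument constructor, which is enough for a sketch.
            values.add(Class.forName(type)
                    .getConstructor(String.class)
                    .newInstance(value));
        }
        return values;
    }

    public static void main(String[] args) throws ReflectiveOperationException {
        System.out.println(parse("java.lang.Integer@1|java.lang.Double@10.2"));
        // Prints: [1, 10.2]
    }
}
```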
Deleting a scheduled task
A task can be deleted so that it is not executed.
- Navigate to the scheduler utility.
- Click on the task to go to its Data Entry page.
- Click on DELETE from the Top Menu.
- The scheduled task is deleted permanently and removed from the list.
Stopping a running task
A task that is being processed can be stopped once the current step is completed.
- Navigate to the scheduler utility.
- Click on the task to go to its Data Entry page.
- Click on STOP.
- Provide a reason for stopping the task.
- The following information is kept in the associated log entry of the scheduler task:
- The user that stopped the task.
- The date and time that the task was stopped.
- The reason for stopping the task.
Using Data Imports & Exports
...
Info |
---|
Foundation > Utilities > Import Data
Foundation > Utilities > Export Data |
CRM.COM supports the rapid import and export of data.
Imports use files (for which templates are available) to import data from third-party systems into CRM.COM. Refer to the following processes for details:
Exports format CRM.COM data as a structured file for use in third-party systems.
Refer to Export Vouchers for details.
Using Pentaho Imports & Exports
Info |
---|
Foundation > Utilities > Integrate Using Pentaho |
Note |
---|
CRM.COM imports and exports are handled by Data Imports & Exports. Pentaho is only used for custom-made imports and exports. |
Imports
Large data sets can be imported in bulk by using Kettles (Pentaho Data Integration), a third-party open-source ETL (Extract, Transform, Load) tool. The data is imported using a predefined file type.
Navigate to Integrate Using Pentaho.
- Choose the module you are interested in.
- Locate and click on the import Kettle to run.
- Click on the Template link to download the template of the import file.
- Create and save the file.
- Click on the file.
- The import details will open in a new tab.
- Click on Choose File to browse and select the created file.
- Once the file is uploaded click on Next.
- Provide the required parameters (if any). Each import Kettle requires different parameters.
- Click on Start Import to execute the import.
- Click on Close to go back to the Main page.
- Once the Kettle is executed, results are available on the Main page.
- Click on View Results (next to the executed import) to display the instances on which the import was executed, the run's successful records and those with errors.
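Inside CRM.COM the Kettle is executed server-side through the screen described above, so no code is needed there. When building or testing a custom import Kettle, however, it can help to run it locally through the Pentaho Data Integration (Kettle) Java API. The sketch below assumes a PDI installation on the classpath; the transformation file name and the parameter name are hypothetical examples that depend on how the custom Kettle was designed.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunImportKettle {
    public static void main(String[] args) throws Exception {
        // Initialise the Kettle engine (plugin registry, environment).
        KettleEnvironment.init();

        // Load a custom import transformation; the .ktr file name and the
        // parameter below are illustrative only.
        TransMeta meta = new TransMeta("warehouse_hardware_import.ktr");
        meta.setParameterValue("IMPORT_FILE", "/tmp/hardware.csv");

        Trans trans = new Trans(meta);
        trans.execute(null);          // start the transformation
        trans.waitUntilFinished();    // block until all steps complete

        if (trans.getErrors() > 0) {
            System.err.println("Import Kettle finished with errors");
        }
    }
}
```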
Exports
Large data sets can be exported in bulk by using Kettles (Pentaho Data Integration), a third-party open-source ETL (Extract, Transform, Load) tool. The data is exported in a predefined file type that is automatically generated by the system.
Navigate to Integrate Using Pentaho.
- Choose the module you are interested in.
- Locate and click on the export Kettle to run.
- The export details will open in a new tab.
- Provide the required parameters (if any). Each export Kettle requires different parameters.
- Click on Start to execute the export immediately.
- Click on Close to go back to the Main page.
- Once the Kettle is executed, results are available on the Main page.
- Click on View Results (next to the executed export) to display the instances on which the export was executed, the run's successful records and those with errors.
Note |
---|
Only records defined as 'To be logged' when creating the Kettle are displayed. If none were defined as 'To be logged', then there will be no information available under 'Successful' or 'Error' drill-downs. |
Scheduling Kettle runs
Kettle import and export runs can be configured to start automatically. The Integrate Using Pentaho screen is used to define the necessary information and trigger the process.
Once a schedule is created and submitted, use Scheduling Settings to plan when and how often the Kettle should be executed.
- Navigate to Integrate Using Pentaho and explore imports and exports.
- Select the import or export you are interested in and go to the Data Entry page to enter the required information.
- VIEW SCHEDULED Kettle runs or SCHEDULE a new one.
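The documentation does not state which CRON dialect the scheduler expects for recurring runs. Assuming it follows the common Quartz syntax (seconds, minutes, hours, day of month, month, day of week), a recurring Kettle schedule such as "every night at 02:00" can be previewed with the Quartz CronExpression class; the expression below is only an example.

```java
import java.text.ParseException;
import java.util.Date;
import org.quartz.CronExpression;

public class KettleSchedulePreview {
    public static void main(String[] args) throws ParseException {
        // "Every day at 02:00" in Quartz syntax:
        // seconds minutes hours day-of-month month day-of-week
        CronExpression nightly = new CronExpression("0 0 2 * * ?");

        // Preview the next few execution times of the recurring schedule.
        Date next = new Date();
        for (int i = 0; i < 3; i++) {
            next = nightly.getNextValidTimeAfter(next);
            System.out.println(next);
        }
    }
}
```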
Utilities Business Examples
Archiving purchases
Scenario 1
Company ZX wants to save space in its database through an annual cleanup process. Data logged in the system regarding purchases that is more than 6 months old should be compressed and removed from the database.
Solution
Configuration
An archiving run definition must be set up as follows:
User Process
Authorized personnel (a super user) should access the run definition and submit the process to the scheduler each time the cleanup process must be executed. The archiving procedure is password protected to ensure that only authorized users have access to it.