
Consolidation Job


The Consolidation job is designed to collect data from multiple agents onto storage servers. This job is useful, for example, for collecting periodic reports from workstations. It is reusable, so new files can later be added on the source agents. To re-use the job, launch it manually with the "Start" button or restart any of its previous Job Runs.

At this point the source agents rescan the folder and re-hash the files. These files are compared against the files on the destination agents and merged into their folder tree. If a file's hash matches, it won't be re-synced. If a destination agent has a file that is missing from the source, it remains intact unless the option "Delete files absent on Read-Write peer" is enabled in the Agent Profile.
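To illustrate the skip-if-unchanged behavior described above, here is a minimal sketch (not the product's actual code) of how a file can be compared by hash before deciding whether to transfer it. The function names and the use of SHA-256 are assumptions for illustration only.

import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file in chunks so large files don't have to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_transfer(source_file: Path, destination_file: Path) -> bool:
    """A destination copy with a matching hash is left alone; anything else is (re)synced."""
    if not destination_file.exists():
        return True
    return file_digest(source_file) != file_digest(destination_file)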

The destination server gets Read-Only access to the share, which means it will not be able to upload updated files back to the source agents.

It's a one-time file transfer job with a specific start and finish (the job finishes when all files are delivered and post-download scripts have executed). Android devices can participate in Consolidation jobs starting with Console version 2.5.

This is how a Consolidation job is created and configured:

1) Go to JOBS -> Configure jobs -> Create new job -> Consolidation.

2) Give it a name and description. Both are optional and the defaults can be used. Check the option to use SHA2 hashing if preferred, but note that Agents running Connect version 2.0 and older will not be able to participate in the job.

3) Choose the source groups. Agents belonging to these groups will be uploading their data. You can create a new group or use one(s) you already have. You can also select individual agents for the job, provided the agents do not produce conflicts. When creating a group, you can specify a schedule by which the agents will upload data.
Specify the path from which the agents will upload files. These directories must exist, and the files should already be there before you start the job.

See here for more details on creating and managing groups.

4) Choose the destination groups. Agents from these groups will collect data from the source agents. You can create a new group, pick an existing one, or select agents individually, provided the agents do not create conflicts.
Specify the path where the agents will store the files. Using Agent tags is also possible.

 

5) The Consolidation job supports post commands, which make it possible to run a command once the transfer is complete. Triggers define the moment when the script will be executed (see the sample script after this list). This step can be skipped.
Before file-indexing begins: right after the job is created, the agent starts indexing files in the specified directory. A script triggered at this moment can "cook the files before serving", for example, rearrange them, add or remove files, and the like, so that the folder is indexed and distributed the way you need.
After an agent completes downloading: the script runs on each destination agent after it finishes downloading. Other agents may still be downloading the files, so it's recommended not to remove or update the distributed files with this trigger; otherwise the files will be re-downloaded from the source again. It's possible to schedule the script for the agent's local time.
After all agents complete downloading: as opposed to the trigger above, in this case the script runs only after all destination agents have finished downloading all the files. This script is also executed on the source agent.
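For illustration, here is a minimal sketch of the kind of post-download script that could be attached to the "After an agent completes downloading" trigger. The paths, file pattern, and archive layout are hypothetical; the Console simply runs whatever command or script you point it to.

#!/usr/bin/env python3
# Hypothetical post-download script: archive the collected reports into a
# dated folder without touching the distributed files themselves.
import shutil
from datetime import date
from pathlib import Path

COLLECTED = Path("/data/consolidated")          # assumed destination path of the job
ARCHIVE = Path("/data/archive") / date.today().isoformat()

ARCHIVE.mkdir(parents=True, exist_ok=True)
for report in COLLECTED.glob("*.csv"):
    # Copy rather than move: removing or updating synced files with this
    # trigger would cause them to be re-downloaded from the source.
    shutil.copy2(report, ARCHIVE / report.name)
print(f"Archived {len(list(ARCHIVE.iterdir()))} files to {ARCHIVE}")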

6) Job scheduler defines when the job will be launched:

Run at - the job runs at the preferred date/time (local agents' time);
Repeat manually - the job won't start until manually launched with the "Start" button;
Repeat hourly - the job runs every N hours. Scale is 1 hour, integer.
Repeat daily - the job runs daily at the selected time. Scale is 1 day, integer.
Repeat weekly - the job runs on the selected days of the week; additionally, you can set the exact time of day.

For all periodic schedules (hourly, daily, and weekly), it's possible to select the starting and ending points for the job.
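To make the periodic-schedule semantics concrete, here is a small illustrative sketch (an assumption-based example, not product code) of how an "every N hours" schedule bounded by starting and ending points could decide its next run.

from datetime import datetime, timedelta
from typing import Optional

def next_run(last_run: datetime, every_n_hours: int,
             start: datetime, end: datetime) -> Optional[datetime]:
    """Next time an 'every N hours' schedule fires, or None once past the end point."""
    candidate = max(last_run + timedelta(hours=every_n_hours), start)
    return candidate if candidate <= end else None

# Example: every 6 hours, between Jan 1 and Jan 3 (local agents' time).
print(next_run(datetime(2024, 1, 1, 0, 0), 6,
               datetime(2024, 1, 1), datetime(2024, 1, 3)))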

7) Settings. Select the Job's priority and Job's profile.

8) Review the job details and save.
Right after that, the source agents will index the specified source share and upload data to the destination agents. The specified scripts/commands will be executed at the selected triggers.
