Before you can start creating jobs that send data to or receive data from the cloud, you need to configure a cloud storage connection in Settings -> Storage Connectors:
Resilio Connect Agents can work with several types of cloud storage: Amazon S3, Azure Blob, Azure Files, Google Cloud, SharePoint Online, and generic S3-compatible storages. Each type is configured separately. To be able to add cloud storage, the MC license must include the "Storage connector" feature in at least one package.
No connection probed
The Management Console itself won't probe the connection to your cloud storage. Connection tests are performed by one of the chosen Agents, or by the configured Agent in the job.
Amazon S3
- Name of the storage (any name that will help you later identify this storage)
- Description, optional
- Access key ID. Should be copied from your Amazon account
- Secret access key. Should be copied from your Amazon account
- Region name. Specify it as dash-delimited text, as shown in the "Region" column of the officially supported AWS regions.
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
- Starting with MC v3.3, access by Amazon IAM Roles is supported. The bucket checkbox is optional: if it's checked, the Agents' connections are limited to that bucket only; if not, all buckets that the IAM Roles allow will be listed by the Agent in the path picker when configuring the job (permission to list the buckets does not guarantee write access inside them).
Note: after a role is revoked, the Agent will not report access errors until the access token expires or the Agent is restarted.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
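To make the required permissions concrete, here is a minimal sketch, assuming the boto3 Python SDK and placeholder credentials, of the kind of check an Agent performs when testing the connection: authenticate with the access keys and list the buckets.

```python
# Minimal sketch, assuming the boto3 SDK; all credentials are placeholders.
import boto3
from botocore.exceptions import ClientError

session = boto3.session.Session(
    aws_access_key_id="AKIA...",    # Access key ID from your Amazon account
    aws_secret_access_key="...",    # Secret access key
    region_name="eu-central-1",     # dash-delimited region name
)
s3 = session.client("s3")

try:
    # With no bucket pre-set, the keys must allow listing all buckets,
    # otherwise the connection test and bucket browsing fail.
    for bucket in s3.list_buckets()["Buckets"]:
        print(bucket["Name"])
except ClientError as err:
    print("Cannot list buckets:", err.response["Error"]["Code"])
```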
Azure Blob Storage
- Name of the storage (any name that will help you later identify this storage)
- Description, optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Leave the endpoint at its default unless you are using a non-default connection endpoint
- Use container. Optional*. Can be left empty if the access keys have permission to list the containers; otherwise, it won't be possible to test the connection to the cloud or browse through containers.
(*) Empty container name
If you leave it empty, the Agent will be able to access all containers available to the provided access keys. Thus you MUST specify a container name as part of the path during job configuration, and that container MUST already exist in the storage.
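As an illustration of how the account name and key fit together, a minimal sketch assuming the azure-storage-blob Python SDK (all values are placeholders):

```python
# Minimal sketch, assuming the azure-storage-blob SDK; values are placeholders.
from azure.storage.blob import BlobServiceClient

ACCOUNT_NAME = "mystorageaccount"   # Storage account name field
CREDENTIAL = "..."                  # Access key or SAS token field

# The default endpoint; change it only if you use a non-default one.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=CREDENTIAL,
)

# With no container pre-set, the credential must allow listing containers.
for container in service.list_containers():
    print(container.name)
```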
Azure Files
- Name of the storage (any name that will help you later identify this storage)
- Description, optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Leave the endpoint at its default unless you are using a non-default connection endpoint
- Use share. Optional*. Can be left empty if the access keys have permission to list the shares; otherwise, it won't be possible to test the connection to the cloud or browse through shares.
(*) Empty share name
If you leave it empty, the Agent will be able to access all shares available to the provided access keys. Thus you MUST specify a share name as part of the path during job configuration, and that share MUST already exist in the storage.
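The same check for Azure Files, sketched with the azure-storage-file-share Python SDK (placeholders only):

```python
# Minimal sketch, assuming the azure-storage-file-share SDK; placeholders only.
from azure.storage.fileshare import ShareServiceClient

ACCOUNT_NAME = "mystorageaccount"   # Storage account name field
CREDENTIAL = "..."                  # Access key or SAS token field

service = ShareServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.file.core.windows.net",
    credential=CREDENTIAL,
)

# With no share pre-set, the credential must allow listing shares.
for share in service.list_shares():
    print(share.name)
```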
S3-Compatible Storage
- Name of the storage (any name that will help you later identify this storage)
- Description, optional
- Endpoint that the Agent will send S3 API requests to*
- Access Key ID. Should be provided to you by your S3-compatible storage vendor
- Secret access key. Should be provided to you by your S3-compatible storage vendor
- Region. Specify the bucket's region.
- Checkbox 'Use SSL'. If checked, the TLS protocol will be used for communication with the storage.
- Use bucket. Optional**. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
(*) S3-compatible storages' peculiarities
Each object storage, and S3-compatible ones in particular, has its own specifics that may require additional settings. See the "Limitations and peculiarities" block here for more details. Oracle cloud: see here for details on how to configure the cloud connector. Storages known to be NOT supported: Amazon Glacier, Minio.
(**) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
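A minimal sketch, again assuming boto3, showing how the endpoint, region, and 'Use SSL' options map onto an S3-compatible client; the endpoint and keys are placeholders for what your vendor provides:

```python
# Minimal sketch, assuming the boto3 SDK; endpoint and keys are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-vendor.com",  # the endpoint field
    aws_access_key_id="...",                       # vendor-provided access key ID
    aws_secret_access_key="...",                   # vendor-provided secret
    region_name="us-east-1",                       # the bucket's region
    use_ssl=True,  # mirrors the 'Use SSL' checkbox (https above already implies TLS)
)

# With no bucket pre-set, the keys must allow listing buckets.
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```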
Google Cloud Storage
- Name of the storage, as it will appear everywhere in the MC
- Description, optional
- Access key. Get it (or create one) in your project settings
- Secret. Can also be obtained in Storage Settings
- Project ID. Can be found in the Project info.
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
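The Access key and Secret fields correspond to Google Cloud Storage interoperability (HMAC) keys. As a rough illustration only, such keys can be exercised through GCS's S3-compatible XML API, e.g. with boto3 (this is not implied to be the exact mechanism the Agent uses internally):

```python
# Rough illustration, assuming boto3 and GCS interoperability (HMAC) keys;
# all values are placeholders.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="GOOG1E...",   # Access key from your project settings
    aws_secret_access_key="...",     # Secret from Storage Settings
)

# With no bucket pre-set, the keys must allow listing the project's buckets.
print([b["Name"] for b in gcs.list_buckets()["Buckets"]])
```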
SharePoint Online
Create a new application. Be sure to save the Client secret value. The Application ID and Client secret value will be needed later when configuring the connector in the Management Console:
https://portal.azure.com/#view/Microsoft_AAD_RegisteredApps/ApplicationsListBlade
Microsoft Graph 'Application'-level permissions Files.ReadWrite.All, Sites.Read.All, and User.Read should be granted.
Go to the 'API permissions' section -> Add a permission; in the 'Microsoft APIs' tab choose 'Microsoft Graph', then 'Application permissions', and search for each specific permission. Be sure that "Admin consent" is granted for the added permissions.
In the Management Console, go to the Storage Connectors menu and add a new SharePoint storage.
Fill in Tenant ID and Client ID with the information from the Application. Use the saved Client secret value.
Root (site name) is required.
Drive - document library name. Optional; if not provided, the Agent will enumerate the drives its permissions allow. In that case, be sure to add the drive to the path inside the job, or use the folder picker to browse through the drive.
Be sure to apply "Preset for Sharepoint" in the Agent Profile before configuring the job.
Read here for more details about synchronising with SharePoint Online.
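To sanity-check the application registration outside the Management Console, here is a minimal sketch assuming the msal and requests Python packages; the tenant, client, and site values are placeholders:

```python
# Minimal sketch, assuming the msal and requests packages; IDs are placeholders.
import msal
import requests

TENANT_ID = "..."       # Tenant ID field
CLIENT_ID = "..."       # Application (client) ID field
CLIENT_SECRET = "..."   # the saved Client secret value
SITE = "contoso.sharepoint.com:/sites/MySite"   # hypothetical Root (site name)

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
# App-only token; requires the granted Application permissions above.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

site = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{SITE}", headers=headers
).json()
drives = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{site['id']}/drives", headers=headers
).json()
for drive in drives["value"]:
    print(drive["name"])   # document library names (the 'Drive' field)
```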
Once you have a storage configured, at least 2 Agents to transfer data, and 1 Agent ready to take the role of cloud agent, you can create a job with cloud storage.
- Once you decide that one of the Agents should deliver data to the cloud (or get it from the cloud), apply the cloud storage Profile to it. Be sure to use "Preset for Sharepoint" for SharePoint Online synchronization.
- Create and configure the job normally. Click this Agent's path and select "Storage Connectors" from the path macro dropdown, then select the preconfigured storage connector.
- Once you've picked the storage, enter the path for that storage or browse through the buckets; see below for completing the "Path" field correctly.
(*) Bucket/container/drive name not configured?
- If you left it empty in the cloud storage configuration, you MUST now enter the bucket/container/drive name as the first component of the path. It CAN be the only component, or you may add more subfolders (e.g. mybucket or mybucket/folder1/folder2).
- If you pre-defined the name in the cloud storage configuration, the Agent will create the path you enter inside that bucket/container/drive. The bucket/container/drive must already exist on the storage.
The storage connector path macro is only available for Agents of version 2.9 and newer that have this feature in their license package.
Starting with Resilio Active Everywhere, it's also available for High Availability groups.
Selective Sync is not supported for cloud storage. If the Selective Sync option is checked, the path configuration will report an error.