Before you can start creating jobs that send data to or receive data from the cloud, you need to configure a cloud storage connection in Settings -> Cloud Storage:
Resilio Connect Agents can work with several types of cloud storage: Amazon S3, Azure Blob Storage, Azure Files, Google Cloud Storage, and generic S3-compatible storages. Each type is configured separately. To be able to add a cloud storage, the MC license must include the "Cloud" feature in at least one package.
No connection probed
The Management Console itself won't probe the connection to your cloud storage. The actual connection attempt occurs when you configure a job that uses the cloud storage and run it on an Agent.

Amazon S3
- Name of the storage (best to use a name that identifies the type of storage it is)
- Description, optional
- Access key ID. Should be copied from your Amazon account
- Secret access key. Should be copied from your Amazon account
- Region name. Specify it as dash-delimited text, as shown in the "Region" column of the officially supported AWS regions (for example, us-east-1).
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud and browse through the buckets.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. In that case, you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
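Before entering these values in the MC, you may want to sanity-check the key pair outside of Resilio Connect. Below is a minimal sketch using Python and boto3 (not part of the product; the key pair and region are placeholders to replace with your own):

```python
# Quick sanity check of S3 credentials outside of Resilio Connect.
# Requires: pip install boto3. All values below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",        # Access key ID from your Amazon account
    aws_secret_access_key="wJal...",    # Secret access key from your Amazon account
    region_name="us-east-1",            # Dash-delimited region name
)

# If this call succeeds, the keys can list buckets and the
# "Use bucket" field may be left empty.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```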
Azure Blob Storage
- Name of the storage as it will appear everywhere in the MC
- Description, optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Endpoint. Leave the default unless you are using a non-default connection endpoint
- Use container. Optional*. Can be left empty if the access keys have permission to list the containers; otherwise, it won't be possible to test the connection to the cloud and browse through the containers.
(*) Empty container name
If you leave it empty, the Agent will be able to access all containers available to the provided access keys. In that case, you MUST specify a container name as part of the path during job configuration, and that container MUST already exist in the storage.
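As with S3, the Azure Blob credentials can be sanity-checked outside of Resilio Connect. A minimal sketch using Python and the azure-storage-blob package, with placeholder account name and key:

```python
# Quick sanity check of Azure Blob credentials outside of Resilio Connect.
# Requires: pip install azure-storage-blob. All values below are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # default endpoint
    credential="<access-key-or-sas-token>",
)

# If this call succeeds, the key can list containers and the
# "Use container" field may be left empty.
for container in service.list_containers():
    print(container.name)
```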
Azure Files
- Name of the storage as it will appear everywhere in the MC
- Description, optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Endpoint. Leave the default unless you are using a non-default connection endpoint
- Use share. Optional*. Can be left empty if the access keys have permission to list the shares; otherwise, it won't be possible to test the connection to the cloud and browse through the shares.
(*) Empty share name
If you leave it empty, the Agent will be able to access all shares available to the provided access keys. In that case, you MUST specify a share name as part of the path during job configuration, and that share MUST already exist in the storage.
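The same kind of check works for Azure Files, this time with the azure-storage-file-share package (placeholder account name and key):

```python
# Quick sanity check of Azure Files credentials outside of Resilio Connect.
# Requires: pip install azure-storage-file-share. All values are placeholders.
from azure.storage.fileshare import ShareServiceClient

service = ShareServiceClient(
    account_url="https://<storage-account>.file.core.windows.net",  # default endpoint
    credential="<access-key>",
)

# If this call succeeds, the key can list shares and the
# "Use share" field may be left empty.
for share in service.list_shares():
    print(share.name)
```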
S3-compatible storage
- Name of the storage as it will appear everywhere in the MC
- Description, optional
- Endpoint that the Agent will send S3 API requests to
- Access key ID. Should be provided to you by your S3-compatible storage vendor
- Secret access key. Should be provided to you by your S3-compatible storage vendor
- Region. Specify the bucket's region.
- Checkbox 'Use SSL'. If checked, the TLS protocol will be used for communication with the storage.
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud and browse through the buckets.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. In that case, you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
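For an S3-compatible storage, the same boto3 sketch applies; the only difference is the custom endpoint. The endpoint, keys, and region below are placeholders from a hypothetical vendor:

```python
# Quick sanity check of S3-compatible storage credentials outside of Resilio Connect.
# Requires: pip install boto3. All values below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-vendor.com",  # endpoint the Agent will use
    aws_access_key_id="<access-key-id>",
    aws_secret_access_key="<secret-access-key>",
    region_name="us-east-1",                       # the bucket's region
    use_ssl=True,                                  # mirrors the 'Use SSL' checkbox
)

for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```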
Google Cloud Storage
- Name of the storage as it will appear everywhere in the MC
- Description, optional
- Access key. Get it (or create one) in your project settings
- Secret access key. Can also be obtained in your project settings
- Project ID. Can be found in the Project info.
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud and browse through the buckets.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. In that case, you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
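Access key/secret pairs of this kind are typically HMAC keys for GCS's XML (interoperability) API, so a boto3 sketch can check them too. This is an assumption-laden example, not the product's own mechanism; replace the placeholders with your own values:

```python
# Quick sanity check of a GCS HMAC key via the S3-compatible XML endpoint.
# Requires: pip install boto3. Assumes the access key/secret pair is a GCS
# HMAC key (interoperability API); all values below are placeholders.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="GOOG...",       # GCS access key
    aws_secret_access_key="<secret>",  # GCS secret
)

for bucket in gcs.list_buckets()["Buckets"]:
    print(bucket["Name"])
```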
Once you have a storage configured, at least 2 Agents to transfer data, and 1 Agent ready to take the role of the cloud agent, you can create a job with cloud storage.
- Create and configure the job normally. Once you decide that one of the Agents should deliver data to a cloud (or get data from it), click this Agent's path and unwrap the path macro dropdown:
- Select "Cloud storage" and either pick pre-configured cloud storage or configure one. Do not use "Direct path" option here!
- Once you picked the storage, enter the path for that storage or browse through the buckets, see below for completing the "Path" field correctly.
(*) Bucket/container name not configured?
- If you left the bucket/container name empty in your cloud storage configuration, you now MUST enter it as the first component of the path. It CAN be the only component, or you may add more subfolders.
- If you pre-defined the bucket/container name in the cloud storage configuration, the Agent will create the path you enter inside that bucket/container.
In either case, the bucket itself must already exist on the storage; see the path examples below.
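Path examples, with a hypothetical bucket named my-bucket: if no bucket is pre-defined in the storage settings, a valid Path value is my-bucket or my-bucket/backups/daily (the bucket first, then optional subfolders); if my-bucket is pre-defined in the storage settings, the same location is addressed simply as backups/daily.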
The Cloud Storage path macro is only available for Agents of version 2.9 and newer that have this feature in their license package.
It's not possible to assign a cloud storage path to a group; that's not supported.
Selective Sync is not supported for a cloud storage. If the Selective Sync option is checked, the path configuration will report an error.