You can permanently move the data residing in an Amazon S3 storage tier to another Amazon S3 storage tier.
The process involves the following steps:
-
Contact the storage vendor and create the new storage class in cloud storage.
-
Recall the existing data from the current storage tier to the new tier. For example, from an S3 Deep Archive storage tier to the S3 Standard tier.
-
Run a command to permanently move the data to the appropriate new class.
-
For new data, change the storage class in the cloud storage library as described in Migrating Data Between the Storage Classes in a Cloud Storage Library.
Procedure
-
Identify the bucket name and mount path base folder for the data that must be moved.
Use the Cloud Storage Explorer tool to view the bucket name and the base folder. For more information, see Starting the Cloud Storage Explorer Tool.
-
Create a text file with the path to the data residing in the cloud, depending on the number of mount paths that must be moved:
-
If you have multiple mount paths on the same bucket, then add the bucket and the mount path base folder as follows:
<BucketName>\<MountPathBaseFolder>\
Example:
MyBucket\E3TLWT_11.01.2017_03.09\
-
If you have just one mount path, then add only the bucket name as follows:
<BucketName>\
Example:
MyBucket\
Note
Make sure to include the backslash (\) at the end.
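The input file format described above is easy to get wrong (the trailing backslash in particular), so it can help to generate the file programmatically. The following Python sketch is illustrative only; the bucket and folder names are the placeholders from the examples above, and the helper function name is hypothetical:

```python
from pathlib import Path

def write_input_file(path, bucket, base_folders=None):
    """Write a CloudTestTool input file.

    With mount path base folders, each line is <Bucket>\\<BaseFolder>\\;
    with a single mount path, the only line is <Bucket>\\. The trailing
    backslash on every line is required by the tool.
    """
    if base_folders:
        lines = [f"{bucket}\\{folder}\\" for folder in base_folders]
    else:
        lines = [f"{bucket}\\"]
    Path(path).write_text("\n".join(lines) + "\n")
    return lines

# Example values from the documentation
entries = write_input_file("MyList.txt", "MyBucket", ["E3TLWT_11.01.2017_03.09"])
```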
-
Navigate to the following folder:
<software installation directory>\Base
Example:
c:\Program Files\Commvault\ContentStore\Base
-
From the command prompt, run the CloudTestTool with the following parameters to restore the data from the current storage class to the desired storage class:
CloudTestTool.exe -h <endpoint_region_name> -u <AccessKeyID> -p <SecretAccessKey> -b <Bucket> -f "<InputFileName>" -m <#> -c <#> -o changeTier -T <#>
Options
Description
-h
A valid endpoint name for the Amazon S3 region.
Default: s3.[region].amazonaws.com
Example: s3.us-west-1.amazonaws.com
To find the region, see https://docs.aws.amazon.com/general/latest/gr/rande.html.
-u
Access key.
-p
Secret access key.
-b
Bucket name.
-f
Full path and name of the input file that you created in step 2.
-T
Storage Class to which the data must be recalled. Valid values are:
0 – Standard
2 – Standard - Infrequent Access
3 – One Zone - Infrequent Access
4 – Intelligent - Tiering
8 – Glacier
16 – Deep Archive
-m
Recall metadata options. Valid values are:
1 – Chunk Meta Data only
2 – SFILE only
3 – non SFILE
0 – All
-c
Recall mode options. Valid values are:
1 – Expedited
2 – Bulk
0 – Standard
Example:
CloudTestTool.exe -h s3.us-west-1.amazonaws.com -u MyAccessKeyID -p AbCdEf@aBcDeF -b MyBucket -f "C:\Program Files\Commvault\ContentStore\Base\Temp\MyList.txt" -m 0 -c 0 -o changeTier -T 0
This will start the process of recalling the data from the current storage tier to the target tier specified in the command.
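The numeric option values in the table above are easy to transpose. As a reference aid, the following Python sketch captures them in lookup tables and assembles the changeTier argument list (the tool name and flags come from the command above; the helper function and its parameter names are hypothetical):

```python
# Valid values from the -T, -m, and -c option tables in this document
STORAGE_CLASSES = {"standard": 0, "standard_ia": 2, "one_zone_ia": 3,
                   "intelligent_tiering": 4, "glacier": 8, "deep_archive": 16}
METADATA_OPTIONS = {"all": 0, "chunk_metadata": 1, "sfile": 2, "non_sfile": 3}
RECALL_MODES = {"standard": 0, "expedited": 1, "bulk": 2}

def build_change_tier_args(endpoint, access_key, secret_key, bucket, input_file,
                           target="standard", metadata="all", mode="standard"):
    """Assemble the CloudTestTool changeTier command line as an argument list."""
    return ["CloudTestTool.exe",
            "-h", endpoint, "-u", access_key, "-p", secret_key,
            "-b", bucket, "-f", input_file,
            "-m", str(METADATA_OPTIONS[metadata]),
            "-c", str(RECALL_MODES[mode]),
            "-o", "changeTier",
            "-T", str(STORAGE_CLASSES[target])]
```

Building the command as a list (rather than a single string) avoids quoting problems when the input file path contains spaces.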
Note
Depending on the amount of data and the tier, this process may take some time to complete. For example, if you are moving the data from an archive tier, the duration depends on the recall mode.
Once the data is recalled, the list.txt.ChangeList.txt file is created with a complete list of the files that were recalled. This file is saved in the same folder that was specified for the input file.
-
Permanently convert the storage class using the following command:
CloudTestTool.exe -h <endpoint_region_name> -u <AccessKeyID> -p <SecretAccessKey> -b <Bucket> -o changeTierFullList -f "<full path>\list.txt.ChangeList.txt"
Example:
CloudTestTool.exe -h s3.us-west-1.amazonaws.com -u MyAccessKeyID -p AbCdEf@aBcDeF -b MyBucket -o changeTierFullList -f "C:\Program Files\Commvault\ContentStore\Base\Temp\list.txt.ChangeList.txt"
Once the command is completed successfully, you can view the storage tier for the data using the Cloud Storage Explorer tool. For more information about this tool, see Starting the Cloud Storage Explorer Tool.
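Before running the changeTierFullList command, it is worth confirming that the recall actually produced a non-empty ChangeList file; running the conversion against a missing or empty list accomplishes nothing. A minimal, hypothetical check:

```python
import os

def changelist_ready(path):
    """Return True if the ChangeList file exists and lists at least one object."""
    if not os.path.isfile(path):
        return False
    with open(path) as f:
        return any(line.strip() for line in f)
```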
Moving Data from Standard Storage Class to Glacier Flexible Retrieval Storage Class
Run the CloudTestTool with required parameters to move:
-
All the data to Glacier Flexible Retrieval storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 0 -o changeTier -T 8
-
Only chunk metadata to Glacier Flexible Retrieval storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 1 -o changeTier -T 8
-
Only SFILE to Glacier Flexible Retrieval storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 2 -o changeTier -T 8
-
Only non-SFILE to Glacier Flexible Retrieval storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 3 -o changeTier -T 8
Moving Data from Standard Storage Class to Deep Archive Storage Class
Run the CloudTestTool with the required parameters to move:
-
All the data to Deep Archive storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 0 -o changeTier -T 16
-
Only chunk metadata to Deep Archive storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 1 -o changeTier -T 16
-
Only SFILE to Deep Archive storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 2 -o changeTier -T 16
-
Only non-SFILE to Deep Archive storage class
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 3 -o changeTier -T 16
Permanently Moving Data from Glacier Flexible Retrieval Storage Class to Standard Storage Class
-
Recall the data from the Glacier Flexible Retrieval storage class to the Standard tier using the following command:
CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket_name -f "c:\cloudchunk.txt" -m 0 -o changeTier -T 0
Note
Depending on the amount of data and the tier, this process may take some time to complete. Add -d with the number of days at the end of the command to keep the data in the Standard storage class until the data is moved permanently.
Sample output:
Generating the full list of objects to process
The full list of objects is generated to file c:\cloudchunk.txt.CompleteList.txt
You can run the tool again without generating the list by using the command with
-o changeTierFullList -f c:\cloudchunk.txt.CompleteList.txt
Starting the process for the objects
Processed 91 objects
-- 91 objects are rehydrating
Once the objects are accessible, you need to run the tool again and use the command with
-o changeTierFullList -f c:\cloudchunk.txt.RestoringList.txt
File c:\cloudchunk.txt.RestoringList.txt is generated
c:\Program Files\Commvault\ContentStore\Base>
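The sample output above follows a simple, line-oriented pattern, which makes it practical to script the wait-and-retry workflow. The following parser is a sketch matched against the exact lines shown in this document (the function name is hypothetical); it extracts how many objects are still rehydrating and which follow-up list files were generated:

```python
import re

def parse_recall_output(text):
    """Extract the rehydrating-object count and generated list files
    from CloudTestTool recall output (format as shown in the sample)."""
    rehydrating = 0
    m = re.search(r"--\s*(\d+)\s+objects are rehydrating", text)
    if m:
        rehydrating = int(m.group(1))
    files = re.findall(r"File (\S+) is generated", text)
    return rehydrating, files

# Lines taken from the sample output in this document
sample = ("Processed 91 objects\n"
          "-- 91 objects are rehydrating\n"
          "File c:\\cloudchunk.txt.RestoringList.txt is generated\n")
```

A wrapper script could poll by rerunning the tool with -o changeTierFullList until the rehydrating count reaches zero.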
-
Run the command to permanently move the data to the standard storage class.
c:\Program Files\Commvault\ContentStore\Base>CloudTestTool.exe -h s3.amazonaws.com -u ACCESS_KEYS -p SECRET_ACCESS_KEY -b bucket-name -f "c:\cloudchunk.txt.CompleteList.txt" -m 0 -o changeTierFullList -T 0
Sample output:
Starting the process for the objects
Processed 91 objects
-- 91 objects are in storage class STANDARD
File c:\cloudchunk.txt.CompleteList.txt.SkipList.txt is generated
c:\Program Files\Commvault\ContentStore\Base>
What to Do Next
Once you move the existing data, you must also modify the storage class in the library and point it to the new storage tier. This ensures that new data is written to the new storage tier. For more information, see Modifying the Storage Class for an Existing Cloud Library.