
Access Spectus Amazon S3

The following instructions outline the tools available for accessing the Spectus Amazon S3 bucket.

Cyberduck

  • Download Cyberduck here
  • Open the Cyberduck application once downloaded
  • Click on the Bookmarks tab in Cyberduck and hit the Plus sign at the bottom.

  • In the window that appears, select Amazon S3 as the connection type.
  • Enter a nickname, s3.amazonaws.com as the Server, and the Access Key provided.
  • Then, in Path, enter the s3 path provided by your Spectus rep, without the 's3://' at the beginning.
  • Make sure to include a trailing slash at the end of the s3 path.

  • Double-click on the new bookmark and, when prompted to connect, enter the Secret Key provided.

AWS CLI

AWS CLI full documentation

Installation documentation for AWS CLI 

  1. To confirm that the AWS CLI was installed correctly, navigate to the command line and run the below command:
aws --version
  2. Once the AWS CLI is confirmed to be successfully installed, configure a profile with the below command:
aws configure --profile cuebiq_data
  3. You will then be prompted to enter the below fields:
**AWS Access Key ID [None]:** EHUA5XJNP4G3GKVHCPEC  
**AWS Secret Access Key [None]:** G9WzzuD2I5AW+4V671IBvf2uCJ0VOTjUMHtFHYvA  
**Default region name [None]:**  
**Default output format [None]:** 

The Access Key and Secret Key can be filled in with the credentials provided by your Spectus rep. Default region name and Default output format can remain blank.

  4. Once a profile has been configured, the below command can be used to test that access to the Spectus s3 bucket is working as expected:
aws s3 ls s3://<spectus_path_provided_goes_here>/ --profile cuebiq_data

The s3 path should match exactly what Spectus has provided. If configured correctly, you should now be able to see the folders within the Spectus s3 bucket.

Python via Boto3

Boto3 s3 documentation 

  1. To test the connection to the Spectus s3 bucket, a script like the below can be used:

import boto3

source_aws_key = '<access_key_goes_here>'
source_aws_secret = '<secret_key_goes_here>'
source_bucket_name = '<bucket_name_goes_here>'
# The path name starts after the s3 bucket name,
# e.g. if the full path is s3://pathname/1/ce-an-842/, the path name is 1/ce-an-842/
prefix = '<path_name_goes_here>'

# Create an s3 client using the credentials provided by Spectus
source_conn_s3 = boto3.client('s3',
                              aws_access_key_id=source_aws_key,
                              aws_secret_access_key=source_aws_secret)

# List the objects directly under the prefix and print their file names
result = source_conn_s3.list_objects(Bucket=source_bucket_name, Prefix=prefix, Delimiter='/')
for obj in result['Contents']:
    print(obj['Key'].split('/')[-1])
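
Note that list_objects returns at most 1,000 keys per request. If the prefix provided contains more objects than that, a paginated listing can be used instead; the sketch below is one possible approach, using the same placeholder credentials, bucket, and path as the script above:

import boto3

source_aws_key = '<access_key_goes_here>'
source_aws_secret = '<secret_key_goes_here>'
source_bucket_name = '<bucket_name_goes_here>'
prefix = '<path_name_goes_here>'

client = boto3.client('s3',
                      aws_access_key_id=source_aws_key,
                      aws_secret_access_key=source_aws_secret)

# Walk through every page of results under the prefix and print the file names
paginator = client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket=source_bucket_name, Prefix=prefix, Delimiter='/'):
    for obj in page.get('Contents', []):
        print(obj['Key'].split('/')[-1])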

  2. Use a script like the below to download data locally from the Spectus s3 bucket:

import boto3

ACCESS_KEY = '<access_key_goes_here>'
SECRET_KEY = '<secret_key_goes_here>'
BUCKET_NAME = '<bucket_name_goes_here>'
PREFIX = '<path_name_goes_here>'

def download_from_aws():
    # The client handles the downloads; the resource is used to iterate over the bucket contents
    s3 = boto3.client('s3', aws_access_key_id=ACCESS_KEY,
                      aws_secret_access_key=SECRET_KEY)
    s3_resource = boto3.resource('s3', aws_access_key_id=ACCESS_KEY,
                                 aws_secret_access_key=SECRET_KEY)
    s3_bucket = s3_resource.Bucket(BUCKET_NAME)

    # Download every file under the prefix into the local Downloads folder
    for file in s3_bucket.objects.filter(Prefix=PREFIX):
        # Skip zero-byte "folder" placeholder keys
        if file.key.endswith('/'):
            continue
        print(file.key)
        with open('/Users/username/Downloads/{}'.format(file.key.split('/')[-1]), 'wb') as data:
            s3.download_fileobj(BUCKET_NAME, file.key, data)

download_from_aws()
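
If the cuebiq_data profile from the AWS CLI section above has already been configured, the keys do not need to be hard-coded in the script at all. A minimal sketch along these lines (the bucket and path placeholders are assumptions) lets Boto3 read the credentials from the shared AWS configuration:

import boto3

# Reuse the named profile created with 'aws configure --profile cuebiq_data';
# boto3 reads the keys from ~/.aws/credentials, so none appear in the code.
session = boto3.Session(profile_name='cuebiq_data')
s3_bucket = session.resource('s3').Bucket('<bucket_name_goes_here>')

# List the objects under the prefix without embedding any credentials
for obj in s3_bucket.objects.filter(Prefix='<path_name_goes_here>'):
    print(obj.key)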

MSP360 (Formerly Cloudberry)

Download Cloudberry here 

Once Cloudberry is downloaded, follow the below steps to connect to an s3 bucket:

  1. Open the Cloudberry application and select Amazon S3 as the Cloud Storage type.

  2. Enter a custom display name, as well as the AWS Access Key and Secret Key provided by Spectus.

  3. Once the Access Key and Secret Key have been configured, double-click on the newly registered account and hit ‘Ok’ to allow the connection.

  4. In the resulting right-side window, connect to the s3 bucket by entering the entire path name, including the trailing slash at the end of the path, e.g. s3://pathname/2/fd32e824-7qsr-4e96-a3a3-a218e6e4ce9/