
AWS S3 integration

Biodock supports integration with existing S3 cloud datastores



Generating a limited S3 read user and keys

You can skip this step and go down to Connecting your S3 Bucket with Biodock if you already have an AWS Access Key ID and Secret Access Key for the S3 buckets you wish to integrate.

Log into the AWS console and navigate to the IAM Dashboard → Users. Then, click Add User.

From the Add User screen, pick any username, like biodock-s3-access, and check Programmatic access.

On the next screen, select Attach existing policies directly and search for S3. From the results, choose AmazonS3ReadOnlyAccess.

Note: If you want to restrict access to a single bucket, you will need to create a custom policy and attach it to the user. An example minimum policy is shown at the bottom of this page.

Once you have created the user by clicking Next through the remaining screens, make sure to copy the user's Access Key ID and Secret Access Key for the next step.
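
If you prefer to script this setup instead of clicking through the console, the same three steps can be done with the AWS SDK. The sketch below uses Python's boto3, which is just one option (Biodock does not require it), and assumes it runs under credentials that are allowed to administer IAM; the biodock-s3-access user name is the example from above.

# Minimal sketch, assuming boto3 is installed and admin-level AWS
# credentials are configured locally (e.g. via aws configure).
import boto3

iam = boto3.client("iam")

# 1. Create the user
iam.create_user(UserName="biodock-s3-access")

# 2. Attach the AWS-managed read-only S3 policy
iam.attach_user_policy(
    UserName="biodock-s3-access",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)

# 3. Generate the Access Key ID and Secret Access Key to paste into Biodock
keys = iam.create_access_key(UserName="biodock-s3-access")["AccessKey"]
print(keys["AccessKeyId"], keys["SecretAccessKey"])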

Connecting your S3 Bucket with Biodock

Navigate to the Files page in the left sidebar and select the Amazon S3 option.

Provide your AWS Access Key ID and Secret Access Key. Your keys are stored with AES-256 encryption, and you can disconnect at any time using the Disconnect button at the top right, which will purge your keys from Biodock.

The S3 file explorer supports searching by prefix only (this restriction comes from the S3 API itself). Type the prefix of the files or folders you want to import, click Import to Biodock, and then choose the location in My Files where the images should be placed.
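
This mirrors how the S3 API behaves: objects can only be listed by key prefix, not searched by arbitrary substring. A minimal boto3 sketch of a prefix listing, using the example bucket name from the policy below and a placeholder prefix:

# Sketch only: lists keys under a prefix, the same operation the file
# explorer performs. Bucket and prefix are placeholder examples.
import boto3

s3 = boto3.client("s3")
response = s3.list_objects_v2(
    Bucket="test-examples-biodock",
    Prefix="experiments/2023-01/",
)
for obj in response.get("Contents", []):
    print(obj["Key"])  # only keys starting with the prefix are returned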

You can then navigate to My Files and start using your imported files in Biodock. See Running AI Analysis on Images.

Minimum working policy

Replace test-examples-biodock with the name of your bucket in both Resource entries. IAM policy documents are strict JSON, so leave out any comments when pasting this into the policy editor:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test-examples-biodock", // Replace with your bucket
                "arn:aws:s3:::test-examples-biodock/*" // Replace with your bucket
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        }
    ]
}
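
If you scripted the user creation as sketched earlier, you can attach this minimum policy inline instead of AmazonS3ReadOnlyAccess. Again a sketch, assuming boto3 and the example user and bucket names; swap in your own before running:

# Sketch: attach the policy above as an inline policy on the limited user.
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::test-examples-biodock",   # replace with your bucket
                "arn:aws:s3:::test-examples-biodock/*"  # replace with your bucket
            ],
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*",
        },
    ],
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="biodock-s3-access",      # the user created earlier
    PolicyName="biodock-s3-minimum",   # any inline policy name
    PolicyDocument=json.dumps(policy),
)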