Examples

Upload and analyze files from your local machine

In this scenario, you want to run an analysis job consisting solely of files that exist on your local machine. One approach is to upload each file, then submit a single analysis job with all of the returned file IDs.

import requests
import os

LOCAL_FILES = [""]  # Replace with your file paths; adjust the list length as needed
API_KEY = ""  # Replace with your API key
DESIRED_FOLDER = ""  # Replace with your destination folder name
PIPELINE_ID = ""  # Replace with your pipeline ID

UPLOAD_URL = "https://app.biodock.ai/api/external/filesystem-items/upload-file"
ANALYSIS_URL = "https://app.biodock.ai/api/external/analysis-jobs"

# Upload each local file and collect the Biodock file ID from each response
biodock_file_ids = []
for my_file in LOCAL_FILES:
    with open(my_file, "rb") as file_to_upload:
        data = {
            "fileName": os.path.basename(my_file),
            "destinationFolder": DESIRED_FOLDER
        }
        headers = {"X-API-KEY": API_KEY}
        files = {"upload": file_to_upload}
        response = requests.post(UPLOAD_URL, data=data, headers=headers, files=files)
        response.raise_for_status()  # fail fast if an upload is rejected
        print(response.text)
        biodock_file_ids.append(response.json()["id"])
 
# Submit analysis job
submit_headers = {"X-API-KEY": API_KEY, "Content-Type": "application/json"}
data = {"filesystemIds": biodock_file_ids, "pipelineId": PIPELINE_ID}
response = requests.post(ANALYSIS_URL, json=data, headers=submit_headers)
print(response.text)
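The form fields built inside the upload loop can be factored into a small pure helper, which keeps the loop short and can be unit-tested without touching the network. A minimal sketch; `build_upload_fields` is a hypothetical name, not part of the Biodock API:

```python
import os


def build_upload_fields(file_path, destination_folder):
    """Build the non-file form fields for one upload-file request.

    Mirrors the `data` dict in the loop above; the caller still opens
    the file and passes the handle via files={"upload": ...}.
    """
    return {
        "fileName": os.path.basename(file_path),
        "destinationFolder": destination_folder,
    }
```

For example, `build_upload_fields("/data/run1/slide_01.tif", "My experiment")` returns `{"fileName": "slide_01.tif", "destinationFolder": "My experiment"}`.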

Upload and analyze files from your local machine and the Biodock Filesystem

In this scenario, you have an existing folder with files on the Biodock Filesystem, plus additional files on your local machine. You would like to add the local files to the remote folder and run an analysis on all of the files together.

import requests
import os

LOCAL_FILES = [""]  # Replace with your file paths; adjust the list length as needed
API_KEY = ""  # Replace with your API key
BIODOCK_FOLDER_NAME = ""  # Replace with your existing folder name
BIODOCK_FOLDER_ID = ""  # Replace with your existing folder ID
PIPELINE_ID = ""  # Replace with your pipeline ID

UPLOAD_URL = "https://app.biodock.ai/api/external/filesystem-items/upload-file"
ANALYSIS_URL = "https://app.biodock.ai/api/external/analysis-jobs"

# Upload files
for my_file in LOCAL_FILES:
    with open(my_file, "rb") as file_to_upload:
        data = {
            "fileName": os.path.basename(my_file), 
            "destinationFolder": BIODOCK_FOLDER_NAME
        }
        headers = {"X-API-KEY": API_KEY}
        files = {"upload": file_to_upload}
        response = requests.post(UPLOAD_URL, data=data, headers=headers, files=files)
        response.raise_for_status()  # fail fast if an upload is rejected
        print(response.text)

# Submit analysis job  
submit_headers = {"X-API-KEY": API_KEY, "Content-Type": "application/json"}
data = {"filesystemIds": [BIODOCK_FOLDER_ID], "pipelineId": PIPELINE_ID}
response = requests.post(ANALYSIS_URL, json=data, headers=submit_headers)
print(response.text)
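Both scenarios end by POSTing the same JSON body shape, with either file IDs or a folder ID in `filesystemIds`. If you submit jobs from several scripts, assembling that body in one validated helper catches empty inputs before a request is sent. A minimal sketch; `build_analysis_job_body` is a hypothetical name, not part of the Biodock API:

```python
def build_analysis_job_body(filesystem_ids, pipeline_id):
    """Assemble the JSON body for an analysis-job submission.

    filesystem_ids may hold file IDs, a folder ID, or a mix,
    matching the two scenarios above. Empty entries are dropped.
    """
    ids = [i for i in filesystem_ids if i]
    if not ids or not pipeline_id:
        raise ValueError("need at least one filesystem ID and a pipeline ID")
    return {"filesystemIds": ids, "pipelineId": pipeline_id}
```

For example, `build_analysis_job_body(["folder-123"], "pipe-456")` returns `{"filesystemIds": ["folder-123"], "pipelineId": "pipe-456"}`, while an empty ID list raises a `ValueError` instead of submitting a job that cannot run.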


Last updated 2 years ago