
Managed Credentials

How to manage your credentials with Activeloop Platform


Managing your Credentials with Activeloop

Connecting Deep Lake Datasets to Activeloop Platform

Datasets in Activeloop storage are automatically connected to Activeloop Platform. Datasets in non-Activeloop storage (S3, GCS) can be connected to Activeloop Platform using the UI below.

It is also necessary to enable CORS in the bucket containing any source data.
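In addition to the UI, newer Deep Lake releases also expose a programmatic connection flow. The sketch below assumes deeplake.connect is available in your installed version; the bucket path, managed-credentials name, and hub path are placeholders.

```python
import deeplake

# Connect a dataset stored in your own cloud bucket to Activeloop Platform,
# using managed credentials previously registered in the UI.
# All names below are placeholders.
ds = deeplake.connect(
    src_path="s3://my_bucket/existing_dataset",
    creds_key="my_managed_credentials",       # name of the managed credentials
    dest_path="hub://my_org/existing_dataset", # Deep Lake path for the connected dataset
)
```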

Once connected, datasets can be loaded in the Python API using their Deep Lake path or their cloud path:

  • Using the Deep Lake path (hub://org_name/dataset_name) will automatically load the managed credentials required to authenticate with the cloud storage provider.
  • Using the cloud path (s3://bucket/...) will require the user to specify credentials using the Python API, as shown in the sketch below.
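A minimal sketch of both loading paths is shown below; the organization, dataset, and bucket names, as well as the credential values, are placeholders.

```python
import deeplake

# Deep Lake path: managed credentials attached to the dataset are applied automatically.
ds = deeplake.load("hub://org_name/dataset_name")

# Cloud path: credentials must be supplied explicitly via the creds argument.
ds = deeplake.load(
    "s3://bucket/dataset_name",
    creds={
        "aws_access_key_id": "...",      # placeholder
        "aws_secret_access_key": "...",  # placeholder
    },
)
```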

Managed Credentials UI

All managed credentials that are used by Deep Lake and Activeloop Platform can be added, renamed, edited, or deleted via the UI below:

