GCP and S3 API Interoperability

Google Cloud Storage offers interoperability with Amazon's S3 API, so you can keep using the same S3 client libraries. To run an application written against the S3 storage API on Google Cloud Platform (GCP) without changing its storage interface, migrate your data with the Storage Transfer Service and then use Google Cloud Storage as your storage backend through its S3-compatible API.

To configure your S3 application to use GCP, assuming the application is currently configured to interact with AWS S3, follow these steps:

  1. Set up a Google Cloud Storage bucket:

    1. Create a Google Cloud account, and set up a Google Cloud project.

    2. Create a bucket in Google Cloud Storage:

      1. Go to the Google Cloud Console.

      2. Navigate to Storage --> Browser and click on Create bucket.

      3. Configure the bucket settings according to your needs, similar to how your AWS S3 bucket is configured.

  2. Configure Google Cloud Storage to handle S3 API requests:

    Google Cloud Storage has built-in support for handling S3 API requests, but you will need to make a few configuration adjustments.

    1. Activate the S3-compatible API in your Google Cloud project:

      1. Go to the Google Cloud Console.

      2. Navigate to Settings under Storage and enable Interoperability.

      3. If you have not set up interoperability before, you might need to create a new interoperability (HMAC) access key.

    2. Use the generated access keys (access key ID and secret access key) in your existing application configuration wherever you previously used AWS credentials.

  3. Update the endpoint configuration:

    Modify your application's S3 API endpoint configuration to point to Google Cloud Storage's S3-compatible endpoint. This typically means changing the endpoint URL from https://s3.amazonaws.com to https://storage.googleapis.com. A client configuration sketch follows this list.

  4. Data migration:

    To migrate your data from AWS S3 to Google Cloud Storage:

    1. Use Google's Storage Transfer Service:

      1. Navigate to the Storage section in the Google Cloud Console.

      2. Click on Transfer and create a transfer job from Amazon S3 to Google Cloud Storage.

      3. Provide the necessary credentials and configure the data transfer settings according to your requirements.

    2. Verify the data integrity after the transfer to ensure that all data was copied over accurately (see the verification sketch after this list).

  5. Testing:

    1. Test the application to ensure that it functions correctly with the Google Cloud Storage backend using the S3 API.

    2. Look for any compatibility issues with the API calls and any performance discrepancies that might need addressing.

  6. Monitor and optimize:

    1. After migration, monitor your application for any performance issues or errors.

    2. Optimize configurations (such as storage classes or lifecycle policies) as needed based on application usage patterns.
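
As an illustration of the configuration changes in steps 2, 3, and 5, the following is a minimal sketch of an S3 client pointed at the Google Cloud Storage endpoint, assuming a Python application that uses boto3. The bucket name, HMAC key values, and the object key used for the round-trip check are placeholders, not values from this guide.

```python
import boto3

# S3-compatible endpoint for Google Cloud Storage (replaces https://s3.amazonaws.com).
GCS_S3_ENDPOINT = "https://storage.googleapis.com"

# HMAC interoperability keys created under Storage > Settings > Interoperability.
# The values below are placeholders.
client = boto3.client(
    "s3",
    endpoint_url=GCS_S3_ENDPOINT,
    aws_access_key_id="GOOG1EXAMPLEACCESSKEY",
    aws_secret_access_key="EXAMPLESECRET",
)

# Simple round-trip check against the migrated bucket (placeholder names),
# useful as a first test that the application's S3 calls reach Google Cloud Storage.
bucket = "my-gcs-bucket"
client.put_object(Bucket=bucket, Key="healthcheck.txt", Body=b"ok")
response = client.get_object(Bucket=bucket, Key="healthcheck.txt")
assert response["Body"].read() == b"ok"
print("S3 API calls against Google Cloud Storage succeeded")
```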
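
For the verification in step 4, one approach is to compare object keys and sizes between the source and destination buckets. The sketch below assumes both buckets are reachable with boto3: the AWS bucket with your existing AWS credentials, and the Google Cloud Storage bucket through the S3-compatible endpoint with HMAC keys. Bucket names and key values are placeholders.

```python
import boto3

def list_objects(client, bucket):
    """Return a {key: size} map of every object in the bucket, using pagination."""
    objects = {}
    paginator = client.get_paginator("list_objects")
    for page in paginator.paginate(Bucket=bucket):
        for item in page.get("Contents", []):
            objects[item["Key"]] = item["Size"]
    return objects

# Source: AWS S3, using your existing AWS credentials.
aws_client = boto3.client("s3")

# Destination: Google Cloud Storage via the S3-compatible endpoint and HMAC keys.
gcs_client = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="GOOG1EXAMPLEACCESSKEY",   # placeholder HMAC access key ID
    aws_secret_access_key="EXAMPLESECRET",       # placeholder HMAC secret
)

source = list_objects(aws_client, "my-aws-bucket")   # placeholder bucket names
target = list_objects(gcs_client, "my-gcs-bucket")

missing = sorted(set(source) - set(target))
mismatched = sorted(k for k in source.keys() & target.keys() if source[k] != target[k])

print(f"{len(source)} source objects, {len(target)} destination objects")
print(f"missing from destination: {missing}")
print(f"size mismatches: {mismatched}")
```

Comparing sizes catches most transfer problems; for stricter verification you could also compare checksums you compute yourself, since ETag formats are not guaranteed to match across the two services.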

By following these steps, you can run your application on GCP using Google Cloud Storage with minimal changes to the way it interfaces with S3. This approach leverages Google Cloud Storage's S3-compatible API, providing a smoother transition without extensive code modifications.
