Glossary

Term
Definition

AI
Artificial Intelligence (AI), the simulation of human cognitive processes by machines or computers, such as learning from data, recognizing patterns, and making decisions.

Basis
Basis data or basis features are the inputs to an ML model that are supplied by an external application in the prediction REST request (see the example following this glossary).

Batch data
Data that is processed in large groups at scheduled times.

Contextual
Contextual data or contextual features are the inputs to an ML model that are fetched from data sources at prediction time rather than supplied in the prediction REST request (see the example following this glossary).

CPU
A Central Processing Unit (CPU) is the primary processor in a computing system.

Data sink
A table in Aizen storage that corresponds to a data source. There are two types of data sinks: events and static.

Data source
The original location of the raw data that is used to train ML models. Data sources are external to the Aizen platform and are typically database tables (connected as JDBC endpoints), CSV files, or streaming sources (connected as Kafka endpoints).

DataOps
Data Operations (DataOps), a set of practices and technologies for improving data analytics.

Dataset
A collection of related data that is used to train ML models.

DL
Deep Learning (DL), a type of machine learning that uses artificial neural networks, loosely modeled on the human brain, to train computers to process data and make decisions based on examples.

Entity
An object or concept that can be modeled and that has features associated with it. In database terms, it is a key column. Examples are customer and product.

Feature
An individual measurable property. In database terms, it is a column. Examples are user rating (product data) and humidity (weather data).

GPU
A Graphics Processing Unit (GPU) is an electronic circuit that performs large numbers of calculations in parallel, making it useful for accelerating the training of ML models.

InfraOps
Infrastructure Operations (InfraOps), the management and maintenance of a company's IT infrastructure.

IoT
Internet of Things (IoT), the network of physical devices that collect and exchange data over the internet.

IPYNB
Interactive Python Notebook (IPYNB), the text-based (JSON) file format used by Jupyter notebooks.

JSON
JavaScript Object Notation (JSON), a lightweight text-based data-interchange format.

Label
The actual output value that an ML model is trained to predict.

LLM
A Large Language Model (LLM) is a type of machine learning model that processes and generates natural language.

ML
Machine Learning (ML), an area of artificial intelligence where computers use algorithms and statistical models to analyze input data and predict output data, steadily learning and improving performance over time.

ML model
A machine learning (ML) model is an algorithm that has been trained on a dataset to identify patterns in the dataset and make predictions based on those patterns.

MLOps
Machine Learning Operations (MLOps), a set of practices and technologies for managing the machine learning (ML) lifecycle.

RAG
Retrieval-Augmented Generation (RAG), a technique that supplements an LLM's responses with relevant information retrieved from external sources, such as a vector store.

SQL
Structured Query Language (SQL), a programming language for managing and querying relational databases.

Streaming data
Data that is processed continuously in real time as it arrives.

UDF
User-Defined Function (UDF), a custom function written by a user to extend a system's built-in functionality.

YAML
YAML Ain't Markup Language (YAML, originally Yet Another Markup Language), a human-readable data serialization language commonly used for configuration files.
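To illustrate the basis and contextual feature terms above, the sketch below shows a hypothetical prediction REST request. The endpoint URL, payload fields, and feature names are illustrative assumptions only, not the actual Aizen API; see the Prediction Commands reference for the real interface.

```python
import requests

# Hypothetical prediction request: the endpoint and payload shape are
# illustrative only and do not reflect the actual Aizen REST API.
# Basis features (e.g. transaction_amount) are supplied by the calling
# application in the request body. Contextual features (e.g. a
# customer's recent purchase count) are NOT sent here; the platform
# fetches them from its data sources at prediction time and combines
# them with the basis features before invoking the ML model.
response = requests.post(
    "https://aizen.example.com/predict/churn-model",  # hypothetical endpoint
    json={
        "customer_id": "C-1042",          # entity key
        "transaction_amount": 129.99,     # basis feature
        "payment_method": "credit_card",  # basis feature
    },
    timeout=10,
)
print(response.json())  # e.g. {"churn_probability": 0.12} (illustrative)
```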
