EyePop On-Premise AI Runtime
Deployable, Flexible, API-First Inference Engine
Key Features
What's Included
System Requirements
Quick Start Guide
1. Validate Docker Installation
2. Get Your Deployment Credentials
3. Authenticate with EyePop Docker Registry
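Authentication is a standard `docker login` against the EyePop registry, using the credentials from the previous step. The registry host and environment variable names below are placeholders, not EyePop's actual values; substitute the ones from your deployment dashboard:

```shell
# Log in to the EyePop registry (hypothetical host and variable names).
# Reading the token from stdin keeps it out of your shell history.
echo "$EYEPOP_TOKEN" | docker login registry.example.com \
  --username "$EYEPOP_USER" --password-stdin
```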
4. Pull the Runtime Image
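Once authenticated, pulling the image is a single command. The image reference below is a hypothetical placeholder; use the exact reference provided with your deployment credentials:

```shell
# Pull the runtime image (hypothetical name and tag)
docker pull registry.example.com/eyepop/runtime:latest
```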
5. Create Your Provisioning File
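The provisioning file ties the runtime instance to your account. The exact schema comes with your deployment credentials; the keys below are illustrative placeholders only, not the official format:

```yaml
# eyepop-instance.yml -- illustrative placeholder keys, not the official schema
account-id: "<your-account-id>"
secret-key: "<your-secret-key>"
```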
6. Start the AI Runtime
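A typical launch mounts the provisioning file into the container and publishes the API port. The image name, port, and mount path below are hypothetical; substitute the values from your deployment instructions:

```shell
# Start the runtime in the background (hypothetical image, port, and path)
docker run -d --name eyepop-runtime \
  -p 8080:8080 \
  -v "$(pwd)/eyepop-instance.yml:/etc/eyepop/eyepop-instance.yml:ro" \
  registry.example.com/eyepop/runtime:latest

# Tail the logs to confirm a clean start
docker logs -f eyepop-runtime
```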
Using the Runtime Locally (Python Example)
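A minimal sketch of calling a locally running runtime over HTTP from Python, using only the standard library. The base URL, route name, and response shape are assumptions for illustration; consult the runtime's API reference for the real contract:

```python
import json
import urllib.request

# Hypothetical local endpoint exposed by the runtime container
RUNTIME_URL = "http://localhost:8080"

def infer(image_path: str) -> dict:
    """POST an image to the (assumed) inference route and return parsed JSON."""
    with open(image_path, "rb") as f:
        payload = f.read()
    req = urllib.request.Request(
        f"{RUNTIME_URL}/infer",  # assumed route name
        data=payload,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```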
Hardware Acceleration (CUDA)
Confirm Setup
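Before starting a GPU runtime, confirm the NVIDIA driver works on the host and that Docker can pass GPUs into containers (this requires the NVIDIA Container Toolkit):

```shell
# Confirm the NVIDIA driver is installed and sees your GPU(s)
nvidia-smi

# Confirm Docker can expose GPUs to containers by running nvidia-smi
# inside an official CUDA base image
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```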
Run GPU Runtime
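Running with GPU acceleration is the same launch command with GPU passthrough added. The `--gpus all` flag is standard Docker; the image name, port, and mount path remain the hypothetical placeholders used above:

```shell
# Start the runtime with access to all host GPUs (hypothetical image)
docker run -d --name eyepop-runtime-gpu \
  --gpus all \
  -p 8080:8080 \
  -v "$(pwd)/eyepop-instance.yml:/etc/eyepop/eyepop-instance.yml:ro" \
  registry.example.com/eyepop/runtime:latest
```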
Configuration Options (eyepop-instance.yml)
Key
Required
Description
Uninstall
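Removal is the usual Docker cleanup: stop and delete the container, remove the image, and log out of the registry. The container, image, and registry names are the hypothetical placeholders used in the earlier steps:

```shell
# Stop and remove the runtime container
docker rm -f eyepop-runtime

# Delete the runtime image (hypothetical reference)
docker rmi registry.example.com/eyepop/runtime:latest

# Log out of the registry (hypothetical host)
docker logout registry.example.com
```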
Support