We host the application Aqueduct LLM so that it can be run in our online workstations, either with Wine or directly.


A quick description of Aqueduct LLM:

Aqueduct is an open-source MLOps framework that lets you define and deploy machine learning and LLM workloads on any cloud infrastructure. You write code in vanilla Python, run it on whatever cloud infrastructure you'd like to use, and gain visibility into the execution and performance of your models and predictions. Aqueduct's Python-native API lets you define ML tasks in regular Python code; once you connect Aqueduct to your existing cloud infrastructure (docs), it seamlessly moves your code from your laptop to the cloud or between different cloud infrastructure layers. The result is a single interface for running machine learning tasks on the infrastructure you already have (Kubernetes, Spark, Lambda, etc.): from the same Python API, you can run code across any or all of these systems and see how it is performing.
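To give a feel for the "define pipelines in regular Python" style described above, here is a minimal, self-contained sketch of the decorator-based pipeline pattern that Python-native MLOps frameworks like Aqueduct use. The names below (`op`, `run_pipeline`) are hypothetical illustrations of the pattern, not Aqueduct's actual API; in the real framework, decorated operators are scheduled onto your connected cloud infrastructure rather than run locally.

```python
def op(func):
    """Mark a plain Python function as a pipeline operator (hypothetical)."""
    func._is_op = True
    return func

@op
def clean(rows):
    # Drop missing values before downstream steps run.
    return [r for r in rows if r is not None]

@op
def featurize(rows):
    # A stand-in for real feature engineering.
    return [r * 2 for r in rows]

def run_pipeline(data, *steps):
    # An orchestrator would dispatch each step to Kubernetes, Spark,
    # Lambda, etc.; here every step simply runs locally in order.
    for step in steps:
        data = step(data)
    return data

result = run_pipeline([1, None, 3], clean, featurize)
print(result)  # [2, 6]
```

The appeal of this pattern is that each operator stays an ordinary, individually testable Python function; the framework only adds scheduling and visibility around it.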

Features:
  • Python-native pipeline API
  • Integrated with your infrastructure
  • Centralized visibility into code, data, & metadata
  • Runs securely in your cloud
  • Aqueduct is fully open-source and runs in any Unix environment
  • It runs entirely in your cloud and on your infrastructure, so you can be confident that your data and code are secure


Programming Language: Go.
Categories:
Large Language Models (LLM)

©2024. Winfy. All Rights Reserved.

By OD Group OU – Registry code: 1609791 – VAT number: EE102345621.