
model-inference-service

Here are 6 public repositories matching this topic...

BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!

  • Updated Nov 28, 2025
  • Python
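Several entries on this page describe model inference APIs that accept a request, run a model, and return a prediction. As a generic sketch of that request/response pattern (not BentoML's actual API — the model class and handler below are hypothetical names for illustration), a minimal JSON-in/JSON-out inference handler might look like:

```python
import json

# Hypothetical stand-in for a real model: anything exposing predict(inputs).
class SentimentModel:
    def predict(self, inputs):
        # Toy rule: positive score -> "positive", else "negative".
        return {"labels": ["positive" if x > 0 else "negative" for x in inputs]}

def handle_request(model, body: str) -> str:
    """Parse a JSON request body, run inference, and serialize the response.

    An HTTP framework (BentoML, FastAPI, etc.) would normally wrap this:
    the framework handles routing and transport, while this function is
    the core inference step.
    """
    payload = json.loads(body)
    result = model.predict(payload["inputs"])
    return json.dumps({"result": result})

# Example call: scores in, labels out.
response = handle_request(SentimentModel(), json.dumps({"inputs": [3.2, -0.7]}))
```

Real serving frameworks add batching, schema validation, and observability on top of this core loop, but the contract — deserialize, predict, serialize — is the same.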

CLIP as a service - Embed images and sentences, object recognition, visual reasoning, image classification, and reverse image search

  • Updated Jul 29, 2025
  • Jupyter Notebook

Online inference API for NLP Transformer models - summarization, text classification, sentiment analysis, and more

  • Updated Mar 16, 2024
  • Python

Learn the ins and outs of efficiently serving Large Language Models (LLMs). Dive into optimization techniques, including KV caching and Low Rank Adapters (LoRA), and gain hands-on experience with Predibase’s LoRAX framework inference server.

  • Updated Apr 12, 2024
  • Jupyter Notebook
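The entry above mentions KV caching, a core optimization in LLM serving: at each decoding step, the attention keys and values for all previous tokens are reused from a cache instead of being recomputed. A minimal pure-Python sketch of the idea (toy projections, not a real transformer) is:

```python
class KVCache:
    """Grows by one key/value pair per generated token."""

    def __init__(self):
        self.keys = []
        self.values = []

    def append(self, k, v):
        self.keys.append(k)
        self.values.append(v)

    def __len__(self):
        return len(self.keys)

def decode_step(cache, token_embedding):
    """One autoregressive step: project only the newest token's k/v.

    Without a cache, every step would recompute k/v for the whole prefix
    (quadratic total work); with the cache, each step does O(1) new
    projections and attends over the cached history.
    """
    k = [x * 0.5 for x in token_embedding]  # toy "key projection"
    v = [x * 2.0 for x in token_embedding]  # toy "value projection"
    cache.append(k, v)
    return len(cache)  # number of positions attention now covers

cache = KVCache()
history = [decode_step(cache, tok) for tok in ([1.0], [0.5], [0.25])]
```

LoRA-based servers such as the LoRAX framework mentioned above build on the same loop, additionally swapping small adapter weights per request while sharing the base model and its KV cache machinery.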

gRPC server for Machine Learning (ML) Model Inference in Rust.

  • Updated Oct 27, 2025
  • Rust

SPIRA Serving Predictor v1 by @daitamae and @vitorguidi

  • Updated Jul 8, 2023
  • Python

Improve this page

Add a description, image, and links to the model-inference-service topic page so that developers can more easily learn about it.


Add this topic to your repo

To associate your repository with the model-inference-service topic, visit your repo's landing page and select "manage topics."

