Commit 9b8ca64

Update openai switch kit blog and guide (#1556)

1 parent debd9ae · commit 9b8ca64
File tree

2 files changed: +38 -40 lines

pgml-cms/blog/introducing-the-openai-switch-kit-move-from-closed-to-open-source-ai-in-minutes.md

Lines changed: 16 additions & 18 deletions
@@ -41,8 +41,8 @@ The Switch Kit is an open-source AI SDK that provides a drop in replacement for
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const results = client.chat_completions_create(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -62,8 +62,8 @@ console.log(results);
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = client.chat_completions_create(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -117,17 +117,15 @@ The above is an example using our open-source AI SDK with Meta-Llama-3-8B-Instru
 
 Notice there is near one to one relation between the parameters and return type of OpenAI’s `chat.completions.create` and our `chat_completion_create`.
 
-The best part of using open-source AI is the flexibility with models. Unlike OpenAI, we are not restricted to using a few censored models, but have access to almost any model out there.
-
-Here is an example of streaming with the popular Mythalion model, an uncensored MythoMax variant designed for chatting.
+Here is an example of streaming:
 
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const it = client.chat_completions_create_stream(
-  "PygmalionAI/mythalion-13b",
+  "meta-llama/Meta-Llama-3-8B-Instruct",
   [
     {
       role: "system",
@@ -149,10 +147,10 @@ while (!result.done) {
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = client.chat_completions_create_stream(
-  "PygmalionAI/mythalion-13b",
+  "meta-llama/Meta-Llama-3-8B-Instruct",
   [
     {
       "role": "system",
@@ -184,7 +182,7 @@ for c in results:
   ],
   "created": 1701296792,
   "id": "62a817f5-549b-43e0-8f0c-a7cb204ab897",
-  "model": "PygmalionAI/mythalion-13b",
+  "model": "meta-llama/Meta-Llama-3-8B-Instruct",
   "object": "chat.completion.chunk",
   "system_fingerprint": "f366d657-75f9-9c33-8e57-1e6be2cf62f3"
 }
@@ -200,7 +198,7 @@ for c in results:
   ],
   "created": 1701296792,
   "id": "62a817f5-549b-43e0-8f0c-a7cb204ab897",
-  "model": "PygmalionAI/mythalion-13b",
+  "model": "meta-llama/Meta-Llama-3-8B-Instruct",
   "object": "chat.completion.chunk",
   "system_fingerprint": "f366d657-75f9-9c33-8e57-1e6be2cf62f3"
 }
@@ -212,15 +210,15 @@ We have truncated the output to two items
 
 !!!
 
-We also have asynchronous versions of the create and `create_stream` functions relatively named `create_async` and `create_stream_async`. Checkout [our documentation](https://postgresml.org/docs/introduction/machine-learning/sdks/opensourceai) for a complete guide of the open-source AI SDK including guides on how to specify custom models.
+We also have asynchronous versions of the create and `create_stream` functions relatively named `create_async` and `create_stream_async`. Checkout [our documentation](https://postgresml.org/docs/guides/opensourceai) for a complete guide of the open-source AI SDK including guides on how to specify custom models.
 
-PostgresML is free and open source. To run the above examples yourself [create an account](https://postgresml.org/signup), install pgml, and get running!
+PostgresML is free and open source. To run the above examples yourself [create an account](https://postgresml.org/signup), install korvus, and get running!
 
 ### Why use open-source models on PostgresML?
 
 PostgresML is a complete MLOps platform in a simple PostgreSQL extension. It’s the tool our team wished they’d had scaling MLOps at Instacart during its peak years of growth. You can host your database with us or locally. However you want to engage, we know from experience that it’s better to bring your ML workload to the database rather than bringing the data to the codebase.
 
-Fundamentally, PostgresML enables PostgreSQL to act as a GPU-powered AI application database — where you can both save models and index data. That eliminates the need for the myriad of separate services you have to tie together for your ML workflow. Pgml + pgvector create a complete ML platform (vector DB, model store, inference service, open-source LLMs) all within open-source extensions for PostgreSQL. That takes a lot of the complexity out of your infra, and it's ultimately faster for your users.
+Fundamentally, PostgresML enables PostgreSQL to act as a GPU-powered AI application database — where you can both save models and index data. That eliminates the need for the myriad of separate services you have to tie together for your ML workflow. pgml + pgvector create a complete ML platform (vector DB, model store, inference service, open-source LLMs) all within open-source extensions for PostgreSQL. That takes a lot of the complexity out of your infra, and it's ultimately faster for your users.
 
 We're bullish on the power of in-database and open-source ML/AI, and we’re excited for you to see the power of this approach yourself. You can try it out in our serverless database for $0, with usage based billing starting at just five cents an hour per GB GPU cache. You can even mess with it for free on our homepage.
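The renamed client calls in the diff above all share the OpenAI-style chat-completions shape: a model name plus a list of role/content messages. A minimal Python sketch of that request payload; the message contents are hypothetical placeholders, since the diff truncates them, and the commented-out `korvus` call assumes the package is installed with `KORVUS_DATABASE_URL` set:

```python
# Payload shape for chat_completions_create, mirroring OpenAI's
# chat.completions.create. Message contents are illustrative only.
model = "meta-llama/Meta-Llama-3-8B-Instruct"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},   # hypothetical
    {"role": "user", "content": "Is Postgres a good AI database?"},  # hypothetical
]

# With korvus installed and KORVUS_DATABASE_URL set, the call would be:
#   import korvus
#   client = korvus.OpenSourceAI()
#   results = client.chat_completions_create(model, messages)
print(model, len(messages))
```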

pgml-cms/docs/guides/opensourceai.md

Lines changed: 22 additions & 22 deletions
@@ -6,26 +6,26 @@ OpenSourceAI is a drop in replacement for OpenAI's chat completion endpoint.
 
 Follow the instillation section in [getting-started.md](../api/client-sdk/getting-started.md "mention")
 
-When done, set the environment variable `DATABASE_URL` to your PostgresML database url.
+When done, set the environment variable `KORVUS_DATABASE_URL` to your PostgresML database url.
 
 ```bash
-export DATABASE_URL=postgres://user:pass@.db.cloud.postgresml.org:6432/pgml
+export KORVUS_DATABASE_URL=postgres://user:pass@.db.cloud.postgresml.org:6432/pgml
 ```
 
 Note that an alternative to setting the environment variable is passing the url to the constructor of `OpenSourceAI`
 
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI(YOUR_DATABASE_URL);
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI(YOUR_DATABASE_URL);
 ```
 {% endtab %}
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI(YOUR_DATABASE_URL)
+import korvus
+client = korvus.OpenSourceAI(YOUR_DATABASE_URL)
 ```
 {% endtab %}
 {% endtabs %}
@@ -59,8 +59,8 @@ Here is a simple example using zephyr-7b-beta, one of the best 7 billion paramet
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const results = client.chat_completions_create(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -80,8 +80,8 @@ console.log(results);
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = client.chat_completions_create(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -138,8 +138,8 @@ Here is an example of streaming with the popular `meta-llama/Meta-Llama-3-8B-Ins
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const it = client.chat_completions_create_stream(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -163,8 +163,8 @@ while (!result.done) {
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = client.chat_completions_create_stream(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -231,8 +231,8 @@ We also have asynchronous versions of the `chat_completions_create` and `chat_co
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const results = await client.chat_completions_create_async(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -252,8 +252,8 @@ console.log(results);
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = await client.chat_completions_create_async(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -300,8 +300,8 @@ Notice the return types for the sync and async variations are the same.
 {% tabs %}
 {% tab title="JavaScript" %}
 ```javascript
-const pgml = require("pgml");
-const client = pgml.newOpenSourceAI();
+const korvus = require("korvus");
+const client = korvus.newOpenSourceAI();
 const it = await client.chat_completions_create_stream_async(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
@@ -325,8 +325,8 @@ while (!result.done) {
 
 {% tab title="Python" %}
 ```python
-import pgml
-client = pgml.OpenSourceAI()
+import korvus
+client = korvus.OpenSourceAI()
 results = await client.chat_completions_create_stream_async(
   "meta-llama/Meta-Llama-3-8B-Instruct",
   [
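The streaming variants renamed in this file yield chunks with `"object": "chat.completion.chunk"`, as shown in the blog's sample output. A sketch of consuming such a stream in Python, using stubbed chunks and an assumed OpenAI-style `choices[0].delta.content` field (the diff truncates the `choices` contents, so that field is an assumption, not confirmed by the source):

```python
# Concatenate the text deltas carried by chat.completion.chunk items.
# Chunks are stubbed here for illustration; a real stream comes from
# client.chat_completions_create_stream(model, messages).
def collect_deltas(chunks):
    text = []
    for c in chunks:
        assert c["object"] == "chat.completion.chunk"
        # OpenAI-style delta field; assumed, not shown in the diff.
        text.append(c["choices"][0].get("delta", {}).get("content", ""))
    return "".join(text)

stub_stream = [
    {"object": "chat.completion.chunk",
     "model": "meta-llama/Meta-Llama-3-8B-Instruct",
     "choices": [{"delta": {"content": "Hel"}}]},
    {"object": "chat.completion.chunk",
     "model": "meta-llama/Meta-Llama-3-8B-Instruct",
     "choices": [{"delta": {"content": "lo"}}]},
]
print(collect_deltas(stub_stream))  # Hello
```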

0 commit comments
