*Moving from one abstraction layer to another.*

PostgresML is Moving to Rust for our 2.0 Release
================================================

<p class="author">
  September 19, 2022
</p>

PostgresML is a fairly young project. We recently released v1.0 and now we're considering what we want to accomplish for v2.0. In addition to simplifying the workflow for building models, we'd like to address runtime speed, memory consumption and the overall reliability we've seen is needed for machine learning deployments running at scale.

Python is generally touted as fast enough for machine learning, and is the de facto industry standard with tons of popular libraries implementing all the latest and greatest algorithms. Many of these libraries (Torch, TensorFlow, XGBoost, NumPy) have been optimized in C, but not all of them. For example, most of the [linear algorithms](https://github.com/scikit-learn/scikit-learn/tree/main/sklearn/linear_model) in scikit-learn are written in pure Python, although they do use NumPy, which is a convenient optimization. It also uses Cython in a few performance-critical places. This ecosystem has allowed PostgresML to offer a ton of functionality with minimal duplication of effort.

## Ambition Starts With a Simple Benchmark

<figure>
  <img alt="Ferris the crab" src="/blog/images/rust_programming_crab_sea.webp" />
  <figcaption>Rust mascot image by opensource.com</figcaption>
</figure>

```sql
...
FROM generate_series(1, 1280000) i
GROUP BY i % 10000;
```

Spoiler alert: idiomatic Rust is about 10x faster than native SQL, embedded PL/pgSQL, and pure Python in this benchmark. Rust comes close to the hand-optimized assembly version of the Basic Linear Algebra Subroutines (BLAS) implementation of the dot product. NumPy is supposed to provide optimizations in cases like this, but it's actually the worst performer. Data movement from Postgres to PL/Python is pretty good; it's even faster than the pure SQL equivalent, but adding the extra conversion from Python list to NumPy array takes almost as much time as everything else. Machine learning systems that move relatively large quantities of data around can become dominated by these extraneous operations, rather than the ML algorithms that actually generate value.
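
For a sense of what the pure SQL contender looks like, a dot product over two arrays can be written with `unnest` and an aggregate. The function below is a minimal sketch rather than the exact definition used in the benchmark; the name and details are illustrative:

```sql
-- Illustrative sketch: a pure SQL dot product over two FLOAT4 arrays.
-- Assumes both arrays have the same length.
CREATE OR REPLACE FUNCTION dot_product_sql(a FLOAT4[], b FLOAT4[])
RETURNS FLOAT4
LANGUAGE sql IMMUTABLE STRICT AS $$
    SELECT SUM(x * y)::FLOAT4
    FROM unnest(a, b) AS t(x, y);
$$;
```

Even this version pays to unnest both arrays into rows on every call, which hints at why the compiled implementations pull ahead as the vectors get larger.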

```sql
...
ORDER BY 1
LIMIT 1;
```

=== "NumPy"

```sql linenums="1" title="define_numpy.sql"
CREATE OR REPLACE FUNCTION dot_product_numpy(a FLOAT4[], b FLOAT4[])
RETURNS FLOAT4
...
```
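
The body of `dot_product_numpy` is not shown above. A minimal PL/Python sketch with the same signature might look like the following, assuming the `plpython3u` language is available (the actual definition may differ):

```sql
-- Hypothetical sketch: defer the dot product to numpy.dot inside PL/Python.
CREATE OR REPLACE FUNCTION dot_product_numpy(a FLOAT4[], b FLOAT4[])
RETURNS FLOAT4
LANGUAGE plpython3u IMMUTABLE STRICT
AS $$
    import numpy
    # PL/Python passes the FLOAT4[] arguments in as plain Python lists,
    # so the list-to-ndarray conversion below is the extra cost noted earlier.
    return float(numpy.dot(numpy.array(a, dtype="float32"),
                           numpy.array(b, dtype="float32")))
$$;
```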

ML isn't just about basic math and a little bit of business logic.

<figure>
  <figcaption>Layers of abstraction must remain a good value.</figcaption>
</figure>

The results are somewhat staggering. We didn't spend any time intentionally optimizing Rust over Python. Most of the time spent was just trying to get things to compile. 😅 It's hard to believe the difference is this big, but those fringe operations outside of the core machine learning algorithms really do dominate, requiring up to 35x more time in Python during inference. The difference between classification and regression speeds here is related to the dataset size: the scikit-learn handwritten image classification dataset effectively has 64 features (pixels) vs. the diabetes regression dataset, which has only 10 features.

**The more data we're dealing with, the bigger the improvement we see in Rust**. We're even giving Python some leeway by warming up the runtime on the connection before the test, which typically takes a second or two to interpret all of PostgresML's dependencies. Since Rust is a compiled language, there is no longer a need to warm up the connection.

> _This language comparison uses in-process data access. Python-based machine learning microservices that communicate with other services over HTTP with JSON or gRPC interfaces will look even worse in comparison, especially if they are stateless and rely on yet another database to provide their data over yet another wire._

The API is identical between v1.0 and v2.0. We take breaking changes seriously and we're not going to break existing deployments just because we're rewriting the whole project. The only reason we're bumping the major version is that we feel this is a dramatic change, but we intend to preserve a full compatibility layer with models trained on v1.0 in Python. However, this does mean that to get the full performance benefits, you'll need to retrain models after upgrading.
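
Concretely, the SQL you write doesn't change between versions. The calls below are a rough sketch of the training and inference API with placeholder project, table and column names; see the documentation for the exact signatures:

```sql
-- Train a model; the project name identifies it for later inference.
SELECT * FROM pgml.train(
    'My First Project',      -- project name
    'regression',            -- task
    'my_schema.my_table',    -- relation containing the training data
    'label'                  -- target column
);

-- Run inference with the latest deployed model for the project.
SELECT pgml.predict('My First Project', ARRAY[1.0, 2.0, 3.0]);
```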

## Ensuring High Quality Rust Implementations

Besides backwards compatibility, we're building a Python compatibility layer to guarantee we can preserve the full Python model training APIs when the Rust APIs are not yet at parity in terms of functionality, quality or performance. We started this journey thinking that the older vanilla Python algorithms in scikit-learn would be the best candidates for replacement in Rust, but that is only partly true. There are high quality efforts in [linfa](https://github.com/rust-ml/linfa) and [smartcore](https://github.com/smartcorelib/smartcore) that also show a 10-30x speedup over scikit-learn, but they still lack some of the deeper functionality like joint regression, some of the more obscure algorithms and hyperparameters, and some of the error handling that has been hardened into scikit-learn with mass adoption.

Interestingly, the training times for some of the simplest algorithms are worse in the Rust implementations. Until we can guarantee each Rust algorithm is an upgrade in every way, we'll continue to use the Python compatibility layer on a case-by-case basis to avoid any unpleasant surprises.

We believe that [machine learning in Rust](https://www.arewelearningyet.com/) is mature enough to add significant value now. We'll be using the same underlying C/C++ libraries, and it's worth contributing to the Rust ML ecosystem to bring it up to full feature parity. Our v2.0 release will include a benchmark suite for the full API we support via all Python libraries, so that we can track our progress toward pure Rust implementations over time.

Many thanks and ❤️ to all those who are supporting this endeavor. We’d love to hear feedback from the broader ML and Engineering community about applications and other real world scenarios to help prioritize our work. You can show your support by [starring us on GitHub](https://github.com/postgresml/postgresml).