microsoft/MLOS


MLOS is a project to enable autotuning for systems.

Overview

MLOS currently focuses on an offline tuning approach, though we intend to add online tuning in the future.

To accomplish this, the general flow involves

  • Running a workload (i.e., benchmark) against a system (e.g., a database, web server, or key-value store).
  • Retrieving the results of that benchmark, and perhaps some other metrics from the system.
  • Feeding that data to an optimizer (e.g., using Bayesian Optimization or other techniques).
  • Obtaining a new suggested config to try from the optimizer.
  • Applying that configuration to the target system.
  • Repeating until either the exploration budget is consumed or the configurations' performance appears to have converged.
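The loop above can be sketched in Python. This is a toy illustration only: the benchmark, the `buffer_size` parameter, and the random-search "optimizer" are all made-up stand-ins, not MLOS APIs.

```python
import random

random.seed(0)  # deterministic for illustration

def run_benchmark(config):
    """Toy stand-in for running a workload: pretend the metric is best near buffer_size=64."""
    return (config["buffer_size"] - 64) ** 2

def suggest(param_ranges):
    """Toy random-search 'optimizer'; a real loop would use e.g. Bayesian optimization."""
    return {name: random.randint(lo, hi) for name, (lo, hi) in param_ranges.items()}

param_ranges = {"buffer_size": (8, 256)}   # the tunable space
budget = 20                                # exploration budget
best_config, best_score = None, float("inf")

for _ in range(budget):                    # repeat until the budget is consumed
    config = suggest(param_ranges)         # obtain a suggested config
    score = run_benchmark(config)          # apply it and run the workload
    if score < best_score:                 # feed the result back (here: just track the best)
        best_config, best_score = config, score

print(best_config, best_score)
```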

(Figure: the MLOS optimization loop. Source: LlamaTune, VLDB 2022)

For a brief overview of some of the features and capabilities of MLOS, please see the following video:

(demo video)

Organization

To do this, the repo provides three Python modules, which can be used independently or in combination:

  • mlos-bench provides a framework to help automate running benchmarks as described above.

  • mlos-viz provides some simple APIs to help automate visualizing the results of benchmark experiments and their trials.

    It provides a simple plot(experiment_data) API, where experiment_data is obtained from the mlos_bench.storage module.

  • mlos-core provides an abstraction around existing optimization frameworks (e.g., FLAML, SMAC, etc.).

    It is intended to provide a simple, easy-to-consume (e.g., via pip), low-dependency abstraction to

    • describe a space of context, parameters, their ranges, constraints, etc., and result objectives;
    • provide an "optimizer" service abstraction (e.g., register() and suggest()) so we can easily swap out different search implementations (e.g., random, BO, LLM, etc.); and
    • provide some helpers for automating optimization experiment runner loops and data collection.
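The optimizer-service abstraction sketched above (a suggest()/register() pair over a declared parameter space) might take a shape like the following. This is a hypothetical minimal sketch, not the actual mlos_core API:

```python
import random

class RandomOptimizer:
    """Minimal optimizer service: suggest() proposes configs, register() records results.

    Swapping in a different search method (random, BO, LLM, ...) only changes suggest().
    """

    def __init__(self, param_ranges):
        self.param_ranges = param_ranges   # {name: (low, high)}
        self.observations = []             # list of (config, score) pairs

    def suggest(self):
        # Random search here; other implementations could use the observation history.
        return {name: random.uniform(lo, hi)
                for name, (lo, hi) in self.param_ranges.items()}

    def register(self, config, score):
        self.observations.append((config, score))

opt = RandomOptimizer({"cache_mb": (16.0, 1024.0)})
cfg = opt.suggest()                 # propose a configuration to try
opt.register(cfg, score=12.3)       # report the measured objective back
```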

For these design requirements we intend to reuse as much from existing OSS libraries as possible, and to layer policies and optimizations specifically geared towards autotuning systems on top.

By providing wrappers, we also aim to make it easier to experiment with replacing underlying optimizer components as new techniques become available or prove a better match for certain systems.

Contributing

See CONTRIBUTING.md for details on the development environment and contributing.

Getting Started

The development environment for MLOS uses conda and devcontainers to ease dependency management, but not all of these libraries are required for deployment.

For instructions on setting up the development environment please try one of the following options:

  • see CONTRIBUTING.md for details on setting up a local development environment,
  • launch this repository (or your fork) in a codespace, or
  • have a look at one of the autotuning example repositories like sqlite-autotuning to kick the tires in a codespace in your browser immediately :)

conda activation

  1. Create themlos Conda environment.

    conda env create -f conda-envs/mlos.yml

See the conda-envs/ directory for additional conda environment files, including those used for Windows (e.g., mlos-windows.yml).

    or

    # This will also ensure the environment is up to date
    # (using "conda env update -f conda-envs/mlos.yml"):
    make conda-env

    Note: the latter expects a *nix environment.

  2. Initialize the shell environment.

    conda activate mlos

Usage Examples

mlos-core

For an example of using the mlos_core optimizer APIs, run the BayesianOptimization.ipynb notebook.

mlos-bench

For an example of using the mlos_bench tool to run an experiment, see the mlos_bench Quickstart README.

Here's a quick summary:

./scripts/generate-azure-credentials-config > global_config_azure.jsonc
# run a simple experiment
mlos_bench --config ./mlos_bench/mlos_bench/config/cli/azure-redis-1shot.jsonc

mlos-viz

For a simple example of using the mlos_viz module to visualize the results of an experiment, see the sqlite-autotuning repository, especially the mlos_demo_sqlite_teachers.ipynb notebook.

Installation

The MLOS modules are published to PyPI when new releases are tagged.

To install the latest release, simply run:

# this will install just the optimizer component with SMAC support:
pip install -U mlos-core[smac]
# this will install just the optimizer component with flaml support:
pip install -U "mlos-core[flaml]"
# this will install just the optimizer component with smac and flaml support:
pip install -U "mlos-core[smac,flaml]"
# this will install both the flaml optimizer and the experiment runner with azure support:
pip install -U "mlos-bench[flaml,azure]"
# this will install both the smac optimizer and the experiment runner with ssh support:
pip install -U "mlos-bench[smac,ssh]"
# this will install the postgres storage backend for mlos-bench
# and mlos-viz for visualizing results:
pip install -U "mlos-bench[postgres]" mlos-viz

Details on using a local version from git are available in CONTRIBUTING.md.

See Also

Examples

These can be used as starting points for new autotuning projects outside of the main MLOS repository if you want to keep your tuning experiment configs separate from the MLOS codebase.

Alternatively, we accept PRs to add new examples to the main MLOS repository! See mlos_bench/config and CONTRIBUTING.md for more details.

Publications
