localLLM: Running Local LLMs with 'llama.cpp' Backend

Provides R bindings to the 'llama.cpp' library for running large language models. The package uses a lightweight architecture where the C++ backend library is downloaded at runtime rather than bundled with the package. Package features include text generation, reproducible generation, and parallel inference.
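A minimal usage sketch of the workflow the description implies (download the 'llama.cpp' backend at runtime, load a model, generate text reproducibly). The function names `install_backend()`, `model_load()`, and `generate()` are illustrative assumptions, not confirmed API; consult the reference manual and vignettes below for the actual interface.

```r
library(localLLM)

# Hypothetical: fetch the pre-built llama.cpp backend library at
# runtime (the package downloads it rather than bundling it).
install_backend()

# Hypothetical: load a local GGUF model file.
model <- model_load("path/to/model.gguf")

# Hypothetical: generate text with a fixed seed, so repeated calls
# produce the same output (the "reproducible generation" feature).
generate(model, "Explain what a tokenizer does.", seed = 42L)
```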

Version: 1.1.0
Depends: R (≥ 3.6.0)
Imports: Rcpp (≥ 1.0.14), tools, utils, jsonlite, digest, curl, R.utils
LinkingTo: Rcpp
Suggests: testthat (≥ 3.0.0), covr, irr, knitr, rmarkdown
Published: 2025-12-17
DOI: 10.32614/CRAN.package.localLLM
Author: Eddie Yang [aut], Yaosheng Xu [aut, cre]
Maintainer: Yaosheng Xu <xu2009 at purdue.edu>
BugReports: https://github.com/EddieYang211/localLLM/issues
License: MIT + file LICENSE
URL: https://github.com/EddieYang211/localLLM
NeedsCompilation: yes
SystemRequirements: C++17, libcurl (optional, for model downloading)
Materials: README
CRAN checks: localLLM results

Documentation:

Reference manual: localLLM.html, localLLM.pdf
Vignettes: Frequently Asked Questions (source, R code)
Get Started with localLLM (source, R code)
Reproducible Output (source, R code)
Basic Text Generation (source, R code)
Model Comparison & Validation (source, R code)
Ollama Integration (source, R code)
Parallel Processing (source, R code)

Downloads:

Package source: localLLM_1.1.0.tar.gz
Windows binaries: r-devel: localLLM_1.0.1.zip, r-release: localLLM_1.0.1.zip, r-oldrel: localLLM_1.0.1.zip
macOS binaries: r-release (arm64): localLLM_1.0.1.tgz, r-oldrel (arm64): localLLM_1.0.1.tgz, r-release (x86_64): localLLM_1.1.0.tgz, r-oldrel (x86_64): localLLM_1.0.1.tgz
Old sources: localLLM archive

Linking:

Please use the canonical form https://CRAN.R-project.org/package=localLLM to link to this page.
