Badges: GitHub Actions | codecov | Crates.io | docs.rs

🚀 Help me to become a full-time open-source developer by sponsoring me on GitHub

The Jieba Chinese Word Segmentation Implemented in Rust

Installation

Add it to your Cargo.toml:

[dependencies]
jieba-rs = "0.7"

Then you are good to go. If you are using Rust 2015 you also have to add extern crate jieba_rs to your crate root.
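For illustration, a crate root on the 2015 edition would start roughly like this (Rust 2018 and later can skip the extern crate line):

// Rust 2015 edition: declare the dependency at the crate root
// (src/lib.rs or src/main.rs) before using it.
extern crate jieba_rs;

use jieba_rs::Jieba;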

Example

use jieba_rs::Jieba;

fn main() {
    let jieba = Jieba::new();
    let words = jieba.cut("我们中出了一个叛徒", false);
    assert_eq!(words, vec!["我们", "中", "出", "了", "一个", "叛徒"]);
}
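Besides the default accurate mode shown above, the crate also exposes a full mode and a search-engine mode. The sketch below assumes the cut_all and cut_for_search methods as listed on docs.rs; check the documentation for your version for the exact signatures.

use jieba_rs::Jieba;

fn main() {
    let jieba = Jieba::new();

    // Full mode: list every dictionary word found in the sentence, overlaps included.
    let all_words = jieba.cut_all("我们中出了一个叛徒");
    println!("{:?}", all_words);

    // Search-engine mode: additionally re-splits long words, which suits indexing.
    let search_words = jieba.cut_for_search("我们中出了一个叛徒", true);
    println!("{:?}", search_words);
}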

Enabling Additional Features

  • default-dict feature enables the embedded dictionary; this feature is enabled by default (a way to opt out is sketched after the snippet below)
  • tfidf feature enables the TF-IDF keyword extractor
  • textrank feature enables the TextRank keyword extractor
[dependencies]
jieba-rs = { version = "0.7", features = ["tfidf", "textrank"] }
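Conversely, if you want to ship your own dictionary instead of the embedded one, the default-dict feature can be disabled in the usual Cargo way. This is a minimal sketch; the Jieba::empty() and Jieba::with_dict() constructors named in the comment come from the crate docs and should be verified against your version:

[dependencies]
# Without default-dict you build the segmenter from your own dictionary
# (e.g. Jieba::empty() plus Jieba::with_dict(...)) rather than Jieba::new().
jieba-rs = { version = "0.7", default-features = false }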

Run benchmark

cargo bench --all-features

Benchmark: Compare with cppjieba

jieba-rs bindings

License

This work is released under the MIT license. A copy of the license is provided in the LICENSE file.
