LabARSS/complexity-aware-fine-tuning


General-purpose Large Language Models (LLMs) are frequently fine-tuned through supervised fine-tuning (SFT) to enhance performance in specific domains. Better results can be achieved by distilling the chain-of-thought of a larger model, at the cost of numerous expensive calls and a much larger amount of data. We propose a novel blueprint for efficient fine-tuning that uses reasoning only for the complex data identified by entropy. Specifically, across two small open models ($\approx 3B$) we split the training data into complexity categories by single-token answer entropy (ROC AUC $0.73$), fine-tune the models via SFT and distillation, and show that our pipeline significantly outperforms the standard SFT approach ($0.55$ vs. $0.43$ average accuracy) and delivers performance comparable to distillation while using $62\%$ less data ($0.55$ average accuracy for both).

Note: This is ongoing research. If you want to reproduce the results from the EMNLP 2025 version, check out this tag.
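The complexity split described above is driven by the entropy of the model's distribution over the single answer token. The following is a minimal sketch of that idea, not the repository's actual implementation: the model name, prompt format, and entropy threshold are placeholder assumptions.

```python
# Sketch: bucket a training example by single-token answer entropy.
# Assumptions (not taken from this repo): a prompt whose answer is a single
# token, a HuggingFace causal LM of ~3B parameters, and an untuned threshold.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-3B-Instruct"  # placeholder ~3B model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)
model.eval()


@torch.no_grad()
def answer_token_entropy(prompt: str) -> float:
    """Shannon entropy (in nats) of the next-token distribution, i.e. the
    distribution over the single answer token that follows the prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    logits = model(**inputs).logits[0, -1]          # logits at the answer position
    log_probs = F.log_softmax(logits, dim=-1)
    return float(-(log_probs.exp() * log_probs).sum())


def complexity_bucket(prompt: str, threshold: float = 1.0) -> str:
    """Label an example as 'complex' or 'simple'. The threshold here is a
    placeholder; in practice it would be chosen on held-out data."""
    return "complex" if answer_token_entropy(prompt) > threshold else "simple"
```

In the blueprint from the abstract, examples labeled "complex" would receive chain-of-thought distillation data, while "simple" examples would go through standard SFT.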

Prerequisites

Data

Other datasets are included in the repository and are also published on Hugging Face.

Running experiments

uv run src/experiments/REPLACE_ME.py
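Replace REPLACE_ME with the name of the experiment script under src/experiments/ that you want to run.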

Cite

@misc{goncharov2025complexityawarefinetuning,
      title={Complexity-aware fine-tuning},
      author={Andrey Goncharov and Daniil Vyazhev and Petr Sychev and Edvard Khalafyan and Alexey Zaytsev},
      year={2025},
      eprint={2506.21220},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2506.21220},
}

