
gpt4all-lora

An autoregressive transformer trained on data curated using Atlas. This model is trained for four full epochs, while the related gpt4all-lora-epoch-3 model is trained for three. Replication instructions and data: https://github.com/nomic-ai/gpt4all

Model Details

Model Description

Developed by: Nomic AI

Model Type: A fine-tuned, auto-regressive language model based on the transformer architecture.

Languages: English

License: GPL-3.0

Finetuned from: LLaMA

Model Sources

Repository: https://github.com/nomic-ai/gpt4all

Base Model Repository: https://github.com/facebookresearch/llama

Technical Report: GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo
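Since the model is a LoRA fine-tune of LLaMA, a common way to use it is to load the base LLaMA weights and apply the adapter on top with the `peft` library. The sketch below is a hypothetical example, not part of the official replication instructions (those live in the repository linked above); the local base-model path is an assumption you must supply yourself.

```python
def load_gpt4all_lora(base_model_path: str,
                      adapter_id: str = "nomic-ai/gpt4all-lora"):
    """Load LLaMA base weights and apply the gpt4all-lora adapter.

    Hypothetical sketch: `base_model_path` is a local path to converted
    LLaMA weights (an assumption -- the base weights are not distributed
    with this model), and `adapter_id` is this model's Hub repository.
    """
    # Imports are deferred so the sketch reads without the libraries installed.
    from transformers import LlamaForCausalLM, LlamaTokenizer
    from peft import PeftModel

    tokenizer = LlamaTokenizer.from_pretrained(base_model_path)
    base = LlamaForCausalLM.from_pretrained(base_model_path)
    # Wrap the base model with the low-rank adapter weights for inference.
    model = PeftModel.from_pretrained(base, adapter_id)
    return tokenizer, model
```

Usage would then follow the standard `transformers` generate loop: tokenize a prompt, call `model.generate(...)`, and decode the result with the tokenizer.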


Dataset used to train nomic-ai/gpt4all-lora

