N8Programs / llamoe-8x1b

Total runs: 10
24-hour runs: 0
7-day runs: 0
30-day runs: 0
Last updated: March 7, 2024
text-generation

Model Details of llamoe-8x1b

Built with Axolotl

See axolotl config

axolotl version: 0.4.0

base_model: N8Programs/llamoe-8x1b
model_type: MixtralForCausalLM
tokenizer_type: LlamaTokenizer

load_in_8bit: false
load_in_4bit: false
strict: false

datasets:
  - path: mhenrichsen/alpaca_2k_test
    type: alpaca
dataset_prepared_path:
val_set_size: 0.05
output_dir: ./out

sequence_len: 2048
sample_packing: true
eval_sample_packing: false
pad_to_sequence_len: true

wandb_project: tinyllamoe
wandb_entity:
wandb_watch:
wandb_name: run-1
wandb_log_model: run-1

gradient_accumulation_steps: 4
micro_batch_size: 2
num_epochs: 4
optimizer: adafactor
lr_scheduler: cosine
learning_rate: 0.0002

train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false

gradient_checkpointing: true
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true

warmup_steps: 10
evals_per_epoch: 4
saves_per_epoch: 1
debug:
deepspeed:
weight_decay: 0.0
fsdp:
fsdp_config:
special_tokens:
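
Since the config declares MixtralForCausalLM with a LlamaTokenizer, the checkpoint loads through the standard transformers auto classes. A minimal usage sketch, assuming a recent transformers install; the bf16 dtype mirrors the bf16: auto setting above, and the Alpaca-style prompt is an illustrative assumption based on the alpaca dataset type in the config:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned MoE checkpoint and its tokenizer.
model = AutoModelForCausalLM.from_pretrained(
    "N8Programs/llamoe-8x1b",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("N8Programs/llamoe-8x1b")

# Alpaca-style prompt, matching the fine-tuning data format (an assumption).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nName three primary colors.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))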

out

This model is a fine-tuned version of N8Programs/llamoe-8x1b on the mhenrichsen/alpaca_2k_test dataset (per the Axolotl config above). It achieves the following results on the evaluation set:

  • Loss: 1.7176

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure
Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8 (see the quick check after this list)
  • optimizer: Adafactor (per the Axolotl config above)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 4
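
The reported total_train_batch_size follows from the other values. A quick arithmetic check (assuming a single-device run, which the card does not state explicitly):

micro_batch_size = 2              # per-device batch size from the config
gradient_accumulation_steps = 4   # from the config
num_devices = 1                   # assumption: single-GPU training

total_train_batch_size = micro_batch_size * gradient_accumulation_steps * num_devices
print(total_train_batch_size)     # 8, matching the value reported above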
Training results

Training Loss    Epoch    Step    Validation Loss
1.2099           0.04     1       1.2991
1.3823           0.27     7       1.4997
10.4722          0.54     14      2.6370
1.6521           0.82     21      1.4303
1.6555           1.07     28      1.7053
1.7864           1.34     35      1.8820
1.2141           1.61     42      1.6614
1.1488           1.88     49      1.5619
0.4733           2.14     56      1.6381
0.444            2.41     63      1.6311
0.4717           2.68     70      1.6398
0.4657           2.95     77      1.5938
0.1066           3.2      84      1.6952
0.1547           3.48     91      1.7209
0.1246           3.75     98      1.7176

Framework versions
  • Transformers 4.39.0.dev0
  • Pytorch 2.2.0+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.0


More information about the llamoe-8x1b model on huggingface.co

llamoe-8x1b license

llamoe-8x1b is released under the Apache 2.0 license; see:

https://choosealicense.com/licenses/apache-2.0

llamoe-8x1b on huggingface.co

llamoe-8x1b is an AI model hosted on huggingface.co, where it can be used instantly. huggingface.co supports a free trial of the model as well as paid usage, and the model can be called through an API from Node.js, Python, or plain HTTP, as sketched below.
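
As one sketch of the API route, the huggingface_hub Python client can call the model over HTTP through the Inference API (assuming the model is actually deployed there; the prompt and token budget are illustrative):

from huggingface_hub import InferenceClient

# Call the hosted model over HTTP via the Inference API.
client = InferenceClient(model="N8Programs/llamoe-8x1b")
reply = client.text_generation(
    "### Instruction:\nExplain mixture-of-experts in one sentence.\n\n### Response:\n",
    max_new_tokens=64,
)
print(reply)

Node.js and raw HTTP clients follow the same pattern against the same endpoint.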

N8Programs llamoe-8x1b online for free

huggingface.co is an online trial and API platform that integrates llamoe-8x1b, including its API services, and provides a free online trial; you can try llamoe-8x1b for free via the link below.

Free online trial URL for N8Programs llamoe-8x1b on huggingface.co:

https://huggingface.co/N8Programs/llamoe-8x1b

llamoe-8x1b install

llamoe-8x1b is an open-source model that any user can download and install for free. huggingface.co also hosts the model directly, so you can debug and trial llamoe-8x1b there before installing it locally; the API is likewise free to use.

llamoe-8x1b install URL on huggingface.co:

https://huggingface.co/N8Programs/llamoe-8x1b
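
For a local install, a minimal sketch using the huggingface_hub library (any Python environment with huggingface_hub installed will do):

from huggingface_hub import snapshot_download

# Download the full model repository (weights, tokenizer, config) for local use.
local_dir = snapshot_download(repo_id="N8Programs/llamoe-8x1b")
print(f"Model files downloaded to {local_dir}")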

Provider of llamoe-8x1b on huggingface.co

N8Programs
