tiny-random / lfm2-moe

huggingface.co
Last updated: October 18, 2025
Task: text-generation

Model Details of lfm2-moe

This tiny model is intended for debugging. It is randomly initialized, using a configuration adapted from LiquidAI/LFM2-8B-A1B.

Example usage:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load model and tokenizer
model_id = "tiny-random/lfm2-moe"
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="cuda",
    dtype="bfloat16",
    trust_remote_code=True,
    attn_implementation="flash_attention_2",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Generate answer
prompt="What is AI?"
input_ids=tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    add_generation_prompt=True,
    return_tensors="pt",
    tokenize=True,
).to(model.device)

output = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.3,
    min_p=0.15,
    repetition_penalty=1.05,
    max_new_tokens=32,
)

print(tokenizer.decode(output[0], skip_special_tokens=False))
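
The example above assumes a CUDA device with FlashAttention 2 installed. For a quick CPU-only smoke test (the weights are random, so output quality is irrelevant), a minimal sketch without those requirements:
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiny-random/lfm2-moe"
# CPU-only load: no device_map or flash-attention, default attention backend
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

inputs = tokenizer("What is AI?", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=8)  # output is random noise by design
print(tokenizer.decode(output[0]))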
Code to create this repo:
import json

import torch
from huggingface_hub import file_exists, hf_hub_download
from transformers import (
    AutoConfig,
    AutoModelForCausalLM,
    AutoProcessor,
    GenerationConfig,
    set_seed,
)

source_model_id = "LiquidAI/LFM2-8B-A1B"
save_folder = "/tmp/tiny-random/lfm2-moe"

processor = AutoProcessor.from_pretrained(source_model_id, trust_remote_code=True)
processor.save_pretrained(save_folder)

with open(hf_hub_download(source_model_id, filename='config.json', repo_type='model'), 'r', encoding='utf-8') as f:
    config_json = json.load(f)
config_json['hidden_size'] = 64
config_json['intermediate_size'] = 128
config_json['layer_types'] = ['conv', 'conv', 'full_attention']
config_json['moe_intermediate_size'] = 128
config_json['num_dense_layers'] = 2
config_json['num_attention_heads'] = 2
config_json['num_hidden_layers'] = 3
config_json['num_key_value_heads'] = 1
config_json['use_cache'] = True
# config_json['tie_word_embeddings'] = True
with open(f"{save_folder}/config.json", "w", encoding='utf-8') as f:
    json.dump(config_json, f, indent=2)

config = AutoConfig.from_pretrained(
    save_folder,
    trust_remote_code=True,
)
print(config)
torch.set_default_dtype(torch.bfloat16)
model = AutoModelForCausalLM.from_config(config)
torch.set_default_dtype(torch.float32)
if file_exists(filename="generation_config.json", repo_id=source_model_id, repo_type='model'):
    model.generation_config = GenerationConfig.from_pretrained(
        source_model_id, trust_remote_code=True,
    )
set_seed(42)
model = model.cpu()  # cpu is more stable for random initialization across machines
with torch.no_grad():
    for name, p in sorted(model.named_parameters()):
        torch.nn.init.normal_(p, 0, 0.1)
        print(name, p.shape)
model.save_pretrained(save_folder)
print(model)
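
A quick sanity check (a minimal sketch; save_folder is the path defined above) is to reload the saved repo and run one forward pass:
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

save_folder = "/tmp/tiny-random/lfm2-moe"
model = AutoModelForCausalLM.from_pretrained(save_folder, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(save_folder)
with torch.no_grad():
    out = model(**tokenizer("hello", return_tensors="pt"))
print(out.logits.shape)  # expected: torch.Size([1, seq_len, 65536])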
Printing the model:
Lfm2MoeForCausalLM(
  (model): Lfm2MoeModel(
    (embed_tokens): Embedding(65536, 64, padding_idx=0)
    (layers): ModuleList(
      (0-1): 2 x Lfm2MoeDecoderLayer(
        (conv): Lfm2MoeShortConv(
          (conv): Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), groups=64, bias=False)
          (in_proj): Linear(in_features=64, out_features=192, bias=False)
          (out_proj): Linear(in_features=64, out_features=64, bias=False)
        )
        (feed_forward): Lfm2MoeMLP(
          (w1): Linear(in_features=64, out_features=128, bias=False)
          (w3): Linear(in_features=64, out_features=128, bias=False)
          (w2): Linear(in_features=128, out_features=64, bias=False)
        )
        (operator_norm): Lfm2MoeRMSNorm((64,), eps=1e-05)
        (ffn_norm): Lfm2MoeRMSNorm((64,), eps=1e-05)
      )
      (2): Lfm2MoeDecoderLayer(
        (self_attn): Lfm2MoeAttention(
          (q_proj): Linear(in_features=64, out_features=64, bias=False)
          (k_proj): Linear(in_features=64, out_features=32, bias=False)
          (v_proj): Linear(in_features=64, out_features=32, bias=False)
          (out_proj): Linear(in_features=64, out_features=64, bias=False)
          (q_layernorm): Lfm2MoeRMSNorm((32,), eps=1e-05)
          (k_layernorm): Lfm2MoeRMSNorm((32,), eps=1e-05)
        )
        (feed_forward): Lfm2MoeSparseMoeBlock(
          (gate): Linear(in_features=64, out_features=32, bias=False)
          (experts): Lfm2MoeExperts(
            (0-31): 32 x Lfm2MoeMLP(
              (w1): Linear(in_features=64, out_features=128, bias=False)
              (w3): Linear(in_features=64, out_features=128, bias=False)
              (w2): Linear(in_features=128, out_features=64, bias=False)
            )
          )
        )
        (operator_norm): Lfm2MoeRMSNorm((64,), eps=1e-05)
        (ffn_norm): Lfm2MoeRMSNorm((64,), eps=1e-05)
      )
    )
    (pos_emb): Lfm2MoeRotaryEmbedding()
    (embedding_norm): Lfm2MoeRMSNorm((64,), eps=1e-05)
  )
  (lm_head): Linear(in_features=64, out_features=65536, bias=False)
)
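
The printout confirms the shrunken shapes (hidden size 64, 3 layers, 32 experts). To verify how small the checkpoint really is, a quick sketch that counts parameters (the two 65536 x 64 vocabulary matrices dominate):
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("tiny-random/lfm2-moe", trust_remote_code=True)
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")  # mostly embed_tokens and lm_head (65536 x 64 each)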

Runs of tiny-random/lfm2-moe on huggingface.co

Total runs: 61
24-hour runs: 0
3-day runs: 0
7-day runs: 8
30-day runs: 13

More Information About the lfm2-moe Model on huggingface.co

lfm2-moe on huggingface.co

lfm2-moe is an AI model hosted on huggingface.co that can be used instantly as the tiny-random/lfm2-moe model. huggingface.co supports a free trial of lfm2-moe and also offers paid use. The model can be called through an API from Node.js, Python, or plain HTTP.
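
As a sketch of the Python API route (assuming tiny-random/lfm2-moe is actually served by the Hugging Face Inference API, which is not guaranteed for a tiny debug model), huggingface_hub's InferenceClient can be used:
from huggingface_hub import InferenceClient

# Assumption: the repo is reachable through the serverless Inference API
client = InferenceClient(model="tiny-random/lfm2-moe")
print(client.text_generation("What is AI?", max_new_tokens=32))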

tiny-random/lfm2-moe online for free

huggingface.co is an online trial and API platform that integrates lfm2-moe, including its API services, and provides a free online trial; you can try lfm2-moe for free by following the link below.

Free online trial URL for tiny-random/lfm2-moe on huggingface.co:

https://huggingface.co/tiny-random/lfm2-moe

lfm2-moe install

lfm2-moe is an open-source model that any user can install free of charge. huggingface.co also hosts lfm2-moe directly, so users can work with the installed model there for debugging and trial, and its API is likewise free to use.
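
In practice, "installing" the model means downloading the repo snapshot; a minimal sketch using huggingface_hub:
from huggingface_hub import snapshot_download

# Downloads config, weights, and tokenizer files into the local HF cache
local_dir = snapshot_download(repo_id="tiny-random/lfm2-moe")
print(local_dir)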

Install URL for lfm2-moe on huggingface.co:

https://huggingface.co/tiny-random/lfm2-moe

Provider of lfm2-moe on huggingface.co

tiny-random (organization)

Other APIs from tiny-random on huggingface.co

Total runs    Run growth    Growth rate    Updated
751           627           92.07%         July 11 2025
687           670           97.53%         February 20 2026
493           474           96.15%         August 06 2025
362           73            20.17%         April 23 2026
281           278           98.93%         August 21 2025
154           80            52.29%         September 06 2025
151           68            45.64%         January 12 2025
145           67            46.21%         April 27 2025
145           -26           -17.93%        November 23 2025
141           -81           -57.45%        February 27 2026
129           -127          -94.07%        June 25 2025
80            -8            -10.00%        July 08 2025
64            -253          -395.31%       December 16 2025
58            58            100.00%        April 03 2026
49            45            91.84%         February 14 2026
41            -30           -73.17%        November 23 2025
40            33            82.50%         October 18 2025
32            31            100.00%        April 12 2026
27            15            55.56%         April 12 2026
20            -13           -65.00%        July 22 2025
14            9             64.29%         July 22 2025
11            2             18.18%         February 13 2026
11            7             63.64%         July 29 2025
5             1             20.00%         August 11 2025
2             1             50.00%         October 05 2025