EQUES / jpharma-bert-base

huggingface.co
Total runs: 3
Last updated: June 15, 2025

Model Card

JpharmaBERT (base) is a continually pre-trained version of the BERT model tohoku-nlp/bert-base-japanese-v3, further trained on pharmaceutical data (the same dataset used for eques/jpharmatron).

Example Usage

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model = AutoModelForMaskedLM.from_pretrained("EQUES/jpharma-bert-base", torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained("EQUES/jpharma-bert-base")
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

results = fill_mask("水は化学式で[MASK]2Oです。")

for result in results:
    print(result)
# {'score': 0.49609375, 'token': 55, 'token_str': 'H', 'sequence': '水は化学式でH2Oです。'}
# {'score': 0.11767578125, 'token': 29257, 'token_str': 'Na', 'sequence': '水は化学式でNa2Oです。'}
# {'score': 0.047607421875, 'token': 61, 'token_str': 'N', 'sequence': '水は化学式でN2Oです。'}
# {'score': 0.038330078125, 'token': 16966, 'token_str': 'CH', 'sequence': '水は化学式でCH2Oです。'}
# {'score': 0.0255126953125, 'token': 66, 'token_str': 'S', 'sequence': '水は化学式でS2Oです。'}
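Under the hood, the fill-mask pipeline takes the logits at the [MASK] position, applies a softmax, and returns the top-k candidate tokens. A minimal sketch of that scoring step on synthetic logits (the vocabulary and values below are illustrative, so no model download is needed):

```python
import torch

# Synthetic logits standing in for the model output at the [MASK] position;
# a real run would take model(**inputs).logits[0, mask_index] instead.
vocab = ["H", "Na", "N", "CH", "S"]
logits = torch.tensor([4.0, 2.5, 1.6, 1.4, 1.0])

# The pipeline softmaxes the logits and keeps the top-k candidates.
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, k=3)
for score, idx in zip(top.values, top.indices):
    print(f"{vocab[idx]}: {score.item():.3f}")
```

The `score` fields in the output above are exactly these softmax probabilities, which is why they sum to at most 1 across the vocabulary.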
Training Details

Training Data

We used the same dataset as eques/jpharmatron for training our JpharmaBERT, which consists of:

  • Japanese text data (2B tokens) collected from pharmaceutical documents such as academic papers and package inserts
  • English data (8B tokens) obtained from PubMed abstracts
  • Pharmaceutical-related data (1.2B tokens) extracted from the multilingual CC100 dataset

After removing duplicate entries across these sources, the final dataset contains approximately 9 billion tokens.
(For details, please refer to our paper about Jpharmatron: link)
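The card notes that duplicates were removed across the three sources but does not describe the method; a common baseline is exact-match deduplication by content hash, sketched below (the function and toy corpus are illustrative, not the authors' actual pipeline):

```python
import hashlib

def dedup(docs):
    """Keep the first occurrence of each exact-duplicate document."""
    seen, unique = set(), []
    for text in docs:
        key = hashlib.sha256(text.strip().encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(text)
    return unique

corpus = [
    "アスピリンは解熱鎮痛薬である。",  # appears in two sources
    "Aspirin is an analgesic and antipyretic.",
    "アスピリンは解熱鎮痛薬である。",
]
print(len(dedup(corpus)))  # → 2
```

Real corpus-scale deduplication often uses near-duplicate methods (e.g. MinHash) rather than exact hashing, which would explain the drop from 11.2B raw tokens to roughly 9B.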

Training Hyperparameters

The model was continually pre-trained with the following settings:

  • Mask probability: 15%
  • Maximum sequence length: 512 tokens
  • Number of training epochs: 6
  • Learning rate: 1e-4
  • Warm-up steps: 10,000
  • Per-device training batch size: 64
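The card does not state which training code was used; as one possible reading, the settings above map directly onto the Hugging Face Trainer API, sketched here (the `output_dir` and the choice of `Trainer` itself are assumptions):

```python
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TrainingArguments)

# Continual pre-training starts from the original Japanese BERT checkpoint.
tokenizer = AutoTokenizer.from_pretrained("tohoku-nlp/bert-base-japanese-v3")
model = AutoModelForMaskedLM.from_pretrained("tohoku-nlp/bert-base-japanese-v3")

# 15% mask probability, as listed above.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="jpharma-bert-base",   # assumed name
    num_train_epochs=6,
    learning_rate=1e-4,
    warmup_steps=10_000,
    per_device_train_batch_size=64,
)
# Input texts would be tokenized with max_length=512, truncation=True
# to match the listed maximum sequence length.
```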

Model Card Authors

Created by Takuro Fujii ([email protected])

jpharma-bert-base huggingface.co Url

https://huggingface.co/EQUES/jpharma-bert-base
