UBC-NLP / MARBERT

Hosted on: huggingface.co
Last updated: August 17, 2022
Task: fill-mask

Model Details of MARBERT

MARBERT is one of three models described in our ACL 2021 paper "ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic". MARBERT is a large-scale pre-trained masked language model focused on both Dialectal Arabic (DA) and Modern Standard Arabic (MSA), reflecting the fact that Arabic has multiple varieties.

To train MARBERT, we randomly sample 1B Arabic tweets from a large in-house dataset of about 6B tweets. We only include tweets with at least 3 Arabic words, based on character-string matching, regardless of whether the tweet also contains non-Arabic strings. That is, we do not remove non-Arabic content so long as the tweet meets the 3-Arabic-word criterion. The resulting dataset makes up 128GB of text (15.6B tokens). We use the same network architecture as ARBERT (BERT-base), but without the next sentence prediction (NSP) objective, since tweets are short. See our repo for how to modify the BERT code to remove NSP. For more information about MARBERT, please visit our GitHub repo.
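As a minimal usage sketch, the checkpoint can be queried for the fill-mask task through the Hugging Face transformers pipeline. This assumes `transformers` and a PyTorch backend are installed; the Arabic example sentence ("I love [MASK]") is illustrative and not taken from the paper, and model weights are downloaded from huggingface.co on first use.

```python
from transformers import pipeline

# MARBERT is a BERT-style masked language model, so it uses the
# standard [MASK] token in fill-mask queries.
fill_mask = pipeline("fill-mask", model="UBC-NLP/MARBERT")

# Each prediction is a dict with the candidate token and its score.
predictions = fill_mask("أنا أحب [MASK]")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidate fillers ranked by probability; because MARBERT was pre-trained on tweets, the completions tend to reflect dialectal and social-media usage as well as MSA.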

BibTex

If you use our models (ARBERT, MARBERT, or MARBERTv2) in your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows:

@inproceedings{abdul-mageed-etal-2021-arbert,
    title = "{ARBERT} {\&} {MARBERT}: Deep Bidirectional Transformers for {A}rabic",
    author = "Abdul-Mageed, Muhammad  and
      Elmadany, AbdelRahim  and
      Nagoudi, El Moatez Billah",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.acl-long.551",
    doi = "10.18653/v1/2021.acl-long.551",
    pages = "7088--7105",
    abstract = "Pre-trained language models (LMs) are currently integral to many natural language processing systems. Although multilingual LMs were also introduced to serve many languages, these have limitations such as being costly at inference time and the size and diversity of non-English data involved in their pre-training. We remedy these issues for a collection of diverse Arabic varieties by introducing two powerful deep bidirectional transformer-based models, ARBERT and MARBERT. To evaluate our models, we also introduce ARLUE, a new benchmark for multi-dialectal Arabic language understanding evaluation. ARLUE is built using 42 datasets targeting six different task clusters, allowing us to offer a series of standardized experiments under rich conditions. When fine-tuned on ARLUE, our models collectively achieve new state-of-the-art results across the majority of tasks (37 out of 48 classification tasks, on the 42 datasets). Our best model acquires the highest ARLUE score (77.40) across all six task clusters, outperforming all other models including XLM-R Large ( 3.4x larger size). Our models are publicly available at https://github.com/UBC-NLP/marbert and ARLUE will be released through the same repository.",
}
Acknowledgments

We gratefully acknowledge support from the Natural Sciences and Engineering Research Council of Canada, the Social Sciences and Humanities Research Council of Canada, the Canadian Foundation for Innovation, Compute Canada, and UBC ARC-Sockeye. We also thank the Google TensorFlow Research Cloud (TFRC) program for providing us with free TPU access.

Runs of UBC-NLP MARBERT on huggingface.co

Total runs: 27.5K
24-hour runs: 0
3-day runs: 315
7-day runs: 6.0K
30-day runs: 21.8K

More Information About the MARBERT huggingface.co Model

MARBERT huggingface.co

MARBERT is an AI model hosted on huggingface.co that can be used instantly as the UBC-NLP MARBERT model. huggingface.co supports a free trial of MARBERT and also provides paid usage. The model can be called through an API from Node.js, Python, or plain HTTP.

UBC-NLP MARBERT online for free

huggingface.co is an online trial and API platform that integrates MARBERT's modeling capabilities, including API services, and provides a free online trial of MARBERT. You can try MARBERT online for free via the link below.

UBC-NLP MARBERT free online URL on huggingface.co:

https://huggingface.co/UBC-NLP/MARBERT

MARBERT install

MARBERT is an open-source model whose code is available on GitHub, where any user can find and install it for free. huggingface.co also hosts the model itself, so users can try and debug MARBERT directly on huggingface.co; API access is likewise free to set up.

MARBERT install url in huggingface.co:

https://huggingface.co/UBC-NLP/MARBERT
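Beyond the hosted widget, the checkpoint can also be loaded locally for fine-tuning or inspection. A minimal sketch, assuming `transformers` and PyTorch are installed (weights are fetched from huggingface.co on first use):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# "UBC-NLP/MARBERT" is the model identifier on huggingface.co.
tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/MARBERT")
model = AutoModelForMaskedLM.from_pretrained("UBC-NLP/MARBERT")

# MARBERT shares ARBERT's BERT-base architecture (NSP removed),
# so it registers as a standard BERT model in transformers.
print(model.config.model_type)
```

From here the model can be fine-tuned like any other BERT-base checkpoint, e.g. by swapping in `AutoModelForSequenceClassification` for classification tasks.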

Url of MARBERT

MARBERT huggingface.co URL: https://huggingface.co/UBC-NLP/MARBERT

Provider of MARBERT on huggingface.co

UBC-NLP (organization)
