This model is a fine-tuned version of tner/bert-large-tweetner-2020 on the tner/tweetner7 dataset (train_2021 split). The model is first fine-tuned on train_2020 and then continuously fine-tuned on train_2021. Fine-tuning is done via T-NER's hyper-parameter search (see the repository for more details). It achieves the following results on the 2021 test set:
- F1 (micro): 0.6319818203564167
- Precision (micro): 0.6544463710676245
- Recall (micro): 0.6110083256244219
- F1 (macro): 0.5766988664971804
- Precision (macro): 0.601237684920777
- Recall (macro): 0.5559244768648601
The per-entity breakdown of the F1 score on the test set is as follows:
- corporation: 0.514024041213509
- creative_work: 0.39736070381231675
- event: 0.42546740778170794
- group: 0.5859649122807017
- location: 0.6335664335664336
- person: 0.8127490039840638
- product: 0.6677595628415302
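These are entity-level F1 values computed from IOB2 tag sequences. As a rough illustration of how per-entity and averaged scores of this kind can be computed (using seqeval here purely for illustration; the reported numbers come from T-NER's own evaluation), a minimal sketch is:

```python
# Sketch: per-entity plus micro/macro-averaged F1 from IOB2 tag sequences.
# seqeval is an illustrative choice; the scores above come from T-NER's
# own evaluation pipeline.
from seqeval.metrics import classification_report

gold = [["B-person", "O", "B-product", "I-product", "O"]]  # toy gold tags
pred = [["B-person", "O", "B-product", "O", "O"]]          # toy predictions

# Prints one row per entity type plus micro/macro/weighted averages.
print(classification_report(gold, pred, digits=4))
```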
For the F1 scores, confidence intervals are estimated by bootstrap resampling over the test set.
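As a sketch of the procedure (not the exact evaluation script), a percentile bootstrap over test sentences could look like the following, again assuming seqeval for the metric:

```python
# Sketch: percentile-bootstrap confidence interval for entity-level micro-F1.
# `gold` and `pred` are lists of per-sentence IOB2 tag sequences.
import numpy as np
from seqeval.metrics import f1_score

def bootstrap_f1_ci(gold, pred, n_resamples=1000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(gold)
    scores = []
    for _ in range(n_resamples):
        idx = rng.integers(0, n, size=n)  # resample sentences with replacement
        scores.append(f1_score([gold[i] for i in idx], [pred[i] for i in idx]))
    # 95% CI by default: 2.5th and 97.5th percentiles of the bootstrap scores
    return np.percentile(scores, [100 * alpha / 2, 100 * (1 - alpha / 2)])
```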
This model can be used through the tner library. Install the library via pip (the example below also uses the urlextract package):

```shell
pip install tner
```
TweetNER7 pre-processed tweets so that account names and URLs are converted into special formats (see the dataset page for more details); we therefore process tweets in the same way before running the model prediction, as below.
```python
import re

from urlextract import URLExtract
from tner import TransformersNER

extractor = URLExtract()

def format_tweet(tweet):
    # mask web urls
    urls = extractor.find_urls(tweet)
    for url in urls:
        tweet = tweet.replace(url, "{{URL}}")
    # format twitter account
    tweet = re.sub(r"\b(\s*)(@[\S]+)\b", r'\1{\2@}', tweet)
    return tweet

text = "Get the all-analog Classic Vinyl Edition of `Takin' Off` Album from @herbiehancock via @bluenoterecords link below: http://bluenote.lnk.to/AlbumOfTheWeek"
text_format = format_tweet(text)

model = TransformersNER("tner/bert-large-tweetner7-continuous")
model.predict([text_format])
```
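For reference, format_tweet masks the URL and wraps the account names, producing the input format the model was trained on; model.predict then returns the recognized entities for each formatted input:

```python
print(text_format)
# Get the all-analog Classic Vinyl Edition of `Takin' Off` Album from
# {@herbiehancock@} via {@bluenoterecords@} link below: {{URL}}
```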
The model can also be used through the transformers library, but this is not recommended since the CRF layer is not supported at the moment.
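If you nevertheless want to load the checkpoint with transformers directly, a minimal sketch (reusing the format_tweet helper and text from above) looks as follows; without the CRF layer, predicted tag sequences may be less consistent than those from tner:

```python
# Sketch: loading the checkpoint with transformers directly.
# This bypasses T-NER's CRF decoding, hence it is not the recommended path.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_name = "tner/bert-large-tweetner7-continuous"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

nlp = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
nlp(format_tweet(text))  # apply the same TweetNER7 preprocessing as above
```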
Training hyperparameters
The hyperparameters used during fine-tuning were selected by T-NER's hyper-parameter search.
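The resulting configuration ships with the model files on the Hugging Face Hub; a sketch for inspecting it is below (the file name trainer_config.json is an assumption based on other T-NER checkpoints):

```python
# Sketch: fetch and print the training configuration from the model repo.
# NOTE: "trainer_config.json" is an assumed file name; adjust it if the
# repository stores the configuration under a different name.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download("tner/bert-large-tweetner7-continuous", "trainer_config.json")
with open(path) as f:
    print(json.dumps(json.load(f), indent=2))
```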
If you use the model, please cite the T-NER paper and the TweetNER7 paper.
T-NER
```bibtex
@inproceedings{ushio-camacho-collados-2021-ner,
    title = "{T}-{NER}: An All-Round Python Library for Transformer-based Named Entity Recognition",
    author = "Ushio, Asahi and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations",
    month = apr,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.eacl-demos.7",
    doi = "10.18653/v1/2021.eacl-demos.7",
    pages = "53--62",
    abstract = "Language model (LM) pretraining has led to consistent improvements in many NLP downstream tasks, including named entity recognition (NER). In this paper, we present T-NER (Transformer-based Named Entity Recognition), a Python library for NER LM finetuning. In addition to its practical utility, T-NER facilitates the study and investigation of the cross-domain and cross-lingual generalization ability of LMs finetuned on NER. Our library also provides a web app where users can get model predictions interactively for arbitrary text, which facilitates qualitative model evaluation for non-expert programmers. We show the potential of the library by compiling nine public NER datasets into a unified format and evaluating the cross-domain and cross-lingual performance across the datasets. The results from our initial experiments show that in-domain performance is generally competitive across datasets. However, cross-domain generalization is challenging even with a large pretrained LM, which has nevertheless capacity to learn domain-specific features if fine-tuned on a combined dataset. To facilitate future research, we also release all our LM checkpoints via the Hugging Face model hub.",
}
```
TweetNER7
```bibtex
@inproceedings{ushio-etal-2022-tweet,
    title = "{N}amed {E}ntity {R}ecognition in {T}witter: {A} {D}ataset and {A}nalysis on {S}hort-{T}erm {T}emporal {S}hifts",
    author = "Ushio, Asahi and
      Neves, Leonardo and
      Silva, Vitor and
      Barbieri, Francesco and
      Camacho-Collados, Jose",
    booktitle = "The 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing",
    month = nov,
    year = "2022",
    address = "Online",
    publisher = "Association for Computational Linguistics",
}
```