huawei-noah / pycodegpt-CodeCLM-full-100m

huggingface.co
Total runs: 3
24-hour runs: 0
7-day runs: 2
30-day runs: -1
Last updated: October 25, 2024

Model Card for pycodegpt-CodeCLM-full-100m

Model Description

This model is a PyCodeGPT model further trained on text-to-code pairs collected from public GitHub repositories. Training used the CodeCLM objective, i.e. causal language modelling in which the loss is computed only over code tokens, with full embedding separation between the natural-language and code modalities.

To use the model, first download it from the Hub, then refer to the evaluation section.
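As a minimal sketch, the checkpoint can be loaded with the `transformers` library like any causal language model. The exact prompt format and decoding settings below are assumptions for illustration, not the authors' evaluation setup; check the evaluation code for the format used in the paper.

```python
# Minimal usage sketch for the pycodegpt-CodeCLM-full-100m checkpoint.
# Assumes the `transformers` library is installed; greedy decoding and the
# raw-string prompt format are illustrative assumptions.
MODEL_ID = "huawei-noah/pycodegpt-CodeCLM-full-100m"

def generate_code(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a code continuation for a natural-language/code prompt."""
    # Imports deferred so the module can be inspected without the
    # (large) transformers dependency loaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,  # greedy decoding; an assumption, not the paper's setting
    )
    # Strip the prompt tokens, keep only the newly generated continuation.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate_code("# return the sum of a list of integers\ndef sum_list(xs):")` would return a candidate function body.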

Citation

BibTeX:

@inproceedings{christopoulou-etal-2024-text,
    title = "Text-to-Code Generation with Modality-relative Pre-training",
    author = "Christopoulou, Fenia  and
      Zhang, Guchun  and
      Lampouras, Gerasimos",
    editor = "Graham, Yvette  and
      Purver, Matthew",
    booktitle = "Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = mar,
    year = "2024",
    address = "St. Julian{'}s, Malta",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.eacl-long.72",
    pages = "1194--1208",
    abstract = "Large pre-trained language models have recently been expanded and applied to programming language tasks with great success, often through further pre-training of a strictly-natural language model{--}where training sequences typically contain both natural and (linearised) programming language. Such approaches effectively map both modalities of the sequence into the same embedding space. However, programming language keywords (e.g. {``}while{''}) often have very strictly defined semantics. As such, transfer learning from their natural language usage may not necessarily be beneficial to their code application and vise versa. Assuming an already pre-trained language model, in this work we investigate how sequence tokens can be adapted and represented differently, depending on which modality they belong to, and to the ultimate benefit of the downstream task. We experiment with separating embedding spaces between modalities during further model pre-training with modality-relative training objectives. We focus on text-to-code generation and observe consistent improvements across two backbone models and two test sets, measuring pass@$k$ and a novel incremental variation.",
}
Model Card Authors

Fenia Christopoulou


More Information About pycodegpt-CodeCLM-full-100m

The model is hosted on huggingface.co, where it can be tried online for free and called through an API (Node.js, Python, HTTP):

https://huggingface.co/huawei-noah/pycodegpt-CodeCLM-full-100m

Provider: huawei-noah
