CarperAI / diff-codegen-350m-v1

huggingface.co
Last updated: November 26, 2022
text-generation

Model Details

Model Description

Diff-Codegen-350M is the first in a series of diff models released by CarperAI. A diff model is an autoregressive language model trained on edits to a piece of text, formatted in Unified Diff Format. These diff models can suggest, given a section of text and a description of the desired change, an intelligent change to the text that fits the description, marking the lines added, changed, and deleted in diff format. The primary use case for these models is for suggesting changes to code—as such, most models we release will be fine-tuned versions of models trained on code datasets.
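To illustrate the Unified Diff Format mentioned above, the following minimal sketch uses Python's standard-library `difflib` to produce the kind of diff such a model is trained to emit (the file name and code snippet are invented for the example):

```python
import difflib

# A small "before" and "after" version of a hypothetical file, each
# line terminated with a newline as difflib expects.
before = ["def greet():\n", '    print("hello")\n']
after = ["def greet(name):\n", '    print(f"hello {name}")\n']

# unified_diff yields the header lines, a @@ hunk marker, and the
# removed (-) and added (+) lines in Unified Diff Format.
diff_text = "".join(
    difflib.unified_diff(before, after, fromfile="greet.py", tofile="greet.py")
)
print(diff_text)
```

The `-` and `+` lines mark deletions and additions, and the `@@ -1,2 +1,2 @@` hunk header gives the affected line ranges in each version; a diff model generates text in exactly this shape.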

Diff-Codegen-350M-v1 is a preliminary release of an experimental artifact and should be treated as such. We are releasing this model and our results in the hope that they may be useful to the broader research community, especially those interested in LMs for code.

CarperAI will be releasing larger diff LMs trained on larger code datasets in the near future, building on this initial release.

Training Data

This model is a fine-tune of CodeGen-350M-mono by Salesforce. That language model was first pre-trained on the Pile, an 800GB dataset composed of varied web corpora. The datasheet and paper for the Pile can be found here and here respectively. The model was then fine-tuned on a large corpus of code in multiple languages before finally being fine-tuned on a Python code dataset. The CodeGen paper with full details of these datasets can be found here.

Our diff model was trained on a dataset of commits from BigQuery, a large-scale dataset of many programming languages from GitHub repositories. We filtered the dataset by the number of stars in the repository (>100 stars), license (only open-source, non-copyleft licensed code was included), and file length (files longer than 2048 tokens were excluded).
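The three filters above can be sketched as a single predicate. This is purely illustrative: the field names (`stars`, `license`, `token_count`) and the particular license set are assumptions for the example, not the actual BigQuery schema or license list used in training:

```python
# Hypothetical permissive-license set; the model card does not
# enumerate the licenses actually accepted.
PERMISSIVE_LICENSES = {"mit", "apache-2.0", "bsd-3-clause"}
MIN_STARS = 100
MAX_TOKENS = 2048

def keep_file(record: dict) -> bool:
    """Return True if a commit record passes all three described filters:
    repository stars, permissive license, and file length in tokens."""
    return (
        record["stars"] > MIN_STARS
        and record["license"] in PERMISSIVE_LICENSES
        and record["token_count"] <= MAX_TOKENS
    )

sample = {"stars": 250, "license": "mit", "token_count": 1024}
print(keep_file(sample))  # passes all three filters
```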

The model was trained using the GPT-2 tokenizer.

Training Details

The model was trained for 44574 steps (1 epoch) on 8 A100 GPUs.

Each file was formatted as follows for input to the language model:

<NME> {FILE_NAME}
<BEF> {INPUT_FILE}
<MSG> {COMMIT_MESSAGE}
<DFF> {FILE_DIFF}
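Assembling an input in this format is a simple string template; the model is then expected to continue the text after the `<DFF>` tag with a unified diff. A minimal sketch (the file name, contents, and commit message are invented for the example):

```python
def make_prompt(file_name: str, file_contents: str, commit_message: str) -> str:
    """Format a file and commit message into the <NME>/<BEF>/<MSG>/<DFF>
    prompt layout described above. Generation begins after <DFF>."""
    return (
        f"<NME> {file_name}\n"
        f"<BEF> {file_contents}\n"
        f"<MSG> {commit_message}\n"
        f"<DFF> "
    )

prompt = make_prompt(
    "greet.py",
    'def greet():\n    print("hello")',
    "Make greet take a name argument",
)
print(prompt)
```

Feeding `prompt` to the model and sampling a completion should yield a diff implementing the requested change, which can then be parsed and applied.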

Intended Uses and Limitations

Due to the model's small size and restriction to code, it should not be expected to generalize to domains beyond code or to perform (successful) reasoning over large chunks of code. This model is intended for prototyping ELM-like systems and for experimental purposes only. It is provided without warranty and should not be used in commercial settings, even though the license permits it.

Limitations and Biases

Because of the short context length restriction and because all repositories with under 100 stars were excluded, we expect our diff model to underperform on underrepresented languages, for instance Lean or Coq.

The output of this model should not be trusted as correct or secure code, and the model should not be used in any mission-critical setting where security is important. Similarly, output from this model should only be executed in a sandbox such as gVisor.

Evaluation Results

Since this model was trained for prototyping, no evaluation has been performed. Future releases will include extensive evaluation.

Licensing

This model is licensed under the MIT license. While the license permits commercial use, we do not recommend using this model in commercial settings.

Acknowledgements

We'd like to thank Honglu Fan, Harry Saini, Herbie Bradley, and Joel Lehman.


More Information

Model page: https://huggingface.co/CarperAI/diff-codegen-350m-v1
License details: https://choosealicense.com/licenses/mit
