Sharathhebbar24 / code_gpt2

huggingface.co
Total runs: 72
24-hour runs: 0
7-day runs: 0
30-day runs: 0
Model last updated: March 15, 2024
text-generation

Introduction to code_gpt2

Model Details of code_gpt2

This model is a fine-tuned version of Sharathhebbar24/code_gpt2_mini_model, trained on the Sharathhebbar24/Evol-Instruct-Code-80k-v1 dataset.
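
The exact training recipe is not published on this page. As a rough sketch, a fine-tune like this could be reproduced with the Hugging Face Trainer along the following lines (the "instruction"/"output" column names and all hyperparameters here are assumptions, not the author's actual settings):

import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "Sharathhebbar24/code_gpt2_mini_model"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token
model = AutoModelForCausalLM.from_pretrained(base)

ds = load_dataset("Sharathhebbar24/Evol-Instruct-Code-80k-v1", split="train")

def tokenize(batch):
    # "instruction"/"output" field names are assumptions about the schema.
    text = [i + "\n" + o for i, o in zip(batch["instruction"], batch["output"])]
    return tokenizer(text, truncation=True, max_length=512)

tokenized = ds.map(tokenize, batched=True, remove_columns=ds.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="code_gpt2", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    # mlm=False gives the standard causal-LM (next-token) objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()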

Model description

GPT-2 is a transformer model pre-trained on a very large corpus of English data in a self-supervised fashion. This means it was pre-trained on raw text only, with no human labeling (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from the text. More precisely, it was trained to guess the next word in sentences.

More precisely, inputs are sequences of continuous text of a certain length, and the targets are the same sequences shifted one token (a word or piece of a word) to the right. The model uses a causal masking mechanism to make sure the prediction for token i only uses the inputs from tokens 1 to i, never future tokens.
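
Concretely, the shift looks like this (a generic illustration of the causal language-modeling objective, not this model's actual training code; note that transformers applies this shift internally when you pass labels equal to the input ids):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
ids = tokenizer("def add(a, b): return a + b")["input_ids"]
inputs, targets = ids[:-1], ids[1:]  # target at position i is the token after input i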

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks. That said, the model is best at what it was trained for: generating text from a prompt.
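
For example, the transformer body can serve as a feature extractor by reading out its hidden states (a minimal sketch):

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sharathhebbar24/code_gpt2")
model = AutoModel.from_pretrained("Sharathhebbar24/code_gpt2")
enc = tokenizer("binary search in python", return_tensors="pt")
with torch.no_grad():
    out = model(**enc)
features = out.last_hidden_state  # shape: (1, seq_len, hidden_size)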

To use this model:

>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> model_name = "Sharathhebbar24/code_gpt2"
>>> model = AutoModelForCausalLM.from_pretrained(model_name)
>>> tokenizer = AutoTokenizer.from_pretrained(model_name)
>>> def generate_text(prompt):
...     # Encode the prompt and generate up to 64 tokens (greedy decoding).
...     inputs = tokenizer.encode(prompt, return_tensors="pt")
...     outputs = model.generate(inputs, max_length=64,
...                              pad_token_id=tokenizer.eos_token_id)
...     generated = tokenizer.decode(outputs[0], skip_special_tokens=True)
...     # Trim the output back to the last full stop.
...     return generated[:generated.rfind(".") + 1]
...
>>> prompt = "Can you write a Linear search program in Python"
>>> res = generate_text(prompt)
>>> res
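
The snippet above decodes greedily, which is deterministic but often repetitive for code. A common variation (an assumption, not part of the original model card) is to enable sampling:

>>> outputs = model.generate(
...     inputs,
...     max_length=128,
...     do_sample=True,          # sample instead of greedy decoding
...     temperature=0.7,
...     top_p=0.9,
...     pad_token_id=tokenizer.eos_token_id,
... )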

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

Metric                                 Value
Avg.                                   28.19
AI2 Reasoning Challenge (25-shot)      23.29
HellaSwag (10-shot)                    30.99
MMLU (5-shot)                          25.03
TruthfulQA (0-shot)                    40.60
Winogrande (5-shot)                    49.25
GSM8k (5-shot)                          0.00
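
These scores could in principle be reproduced locally with EleutherAI's lm-evaluation-harness, which the Open LLM Leaderboard builds on (a hedged sketch; the leaderboard's exact harness version, task names, and settings may differ):

import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Sharathhebbar24/code_gpt2",
    tasks=["arc_challenge"],  # 25-shot ARC, as on the leaderboard
    num_fewshot=25,
)
print(results["results"]["arc_challenge"])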

More Information About code_gpt2 huggingface.co Model

code_gpt2 is released under the Apache 2.0 license; for details, visit:

https://choosealicense.com/licenses/apache-2.0

code_gpt2 huggingface.co

code_gpt2 is an AI model hosted on huggingface.co that can be used instantly. huggingface.co supports a free trial of the code_gpt2 model and also provides paid use. The model can be called through an API from Node.js, Python, or plain HTTP.
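
For example, from Python the hosted model could be called through the huggingface_hub client (a sketch; serverless inference availability for this particular model is an assumption):

from huggingface_hub import InferenceClient

client = InferenceClient(model="Sharathhebbar24/code_gpt2")
print(client.text_generation(
    "Can you write a Linear search program in Python",
    max_new_tokens=64,
))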

Sharathhebbar24 code_gpt2 online free

huggingface.co is an online platform for trying models and calling them through an API. It integrates code_gpt2, including its API services, and provides a free online trial; you can try code_gpt2 for free via the link below.

Sharathhebbar24 code_gpt2 online free url in huggingface.co:

https://huggingface.co/Sharathhebbar24/code_gpt2

code_gpt2 install

code_gpt2 is an open-source model that anyone can download and install for free. At the same time, huggingface.co hosts the model so users can try it directly for debugging and experimentation, and it also supports free API access.

code_gpt2 install url in huggingface.co:

https://huggingface.co/Sharathhebbar24/code_gpt2

Provider of code_gpt2 on huggingface.co:

Sharathhebbar24