aisingapore / SPANBert

huggingface.co
Last updated: March 02, 2023
question-answering

Introduction of SPANBert


Causal Span Detection

You can test the model at Causal Span Extraction | SGNLP-Demo.
If you want to find out more information, please contact us at [email protected] .

Model Details

Model Name: Span Extraction

  • Description: This is a causal span extraction model based on SPANBert that recognises the causes of emotions in conversations. Given four inputs (the target utterance, the target utterance's emotion, the evidence utterance, and the conversational history), it returns arrays of start and end logits that can be post-processed to obtain the span that caused the emotion in the target utterance.
  • Paper: Recognizing Emotion Cause in Conversations. arXiv preprint arXiv:2012.11820, Dec 2020.
  • Author(s): Poria, S., Majumder, N., Hazarika, D., Ghosal, D., Bhardwaj, R., Jian, S.Y.B., Hong, P., Ghosh, R., Roy, A., Chhaya, N., Gelbukh, A., and Mihalcea, R. (2020).
  • URL: https://arxiv.org/abs/2012.11820/

How to Get Started With the Model

Install Python package

sgnlp is an initiative by AI Singapore's NLP Hub, which aims to bridge the gap between research and industry, promote translational research, and encourage the adoption of NLP techniques in industry.

Various other NLP models, in addition to causal span extraction, are available in the Python package. You can try them out at SGNLP-Demo | SGNLP-Github.

pip install sgnlp
Examples

For the full code (such as for Causal Span Detection), please refer to the SGNLP-Docs.
Alternatively, you can try out Causal Span Detection at Causal Span Extraction | SGNLP-Demo.

Example of Causal Span Detection (for surprise):

from sgnlp.models.span_extraction import (
    RecconSpanExtractionConfig,
    RecconSpanExtractionModel,
    RecconSpanExtractionTokenizer,
    RecconSpanExtractionPreprocessor,
    RecconSpanExtractionPostprocessor,
)

# Load model
config = RecconSpanExtractionConfig.from_pretrained(
    "https://storage.googleapis.com/sgnlp-models/models/reccon_span_extraction/config.json"
)
tokenizer = RecconSpanExtractionTokenizer.from_pretrained(
    "mrm8488/spanbert-finetuned-squadv2"
)
model = RecconSpanExtractionModel.from_pretrained(
    "https://storage.googleapis.com/sgnlp-models/models/reccon_span_extraction/pytorch_model.bin",
    config=config,
)
preprocessor = RecconSpanExtractionPreprocessor(tokenizer)
postprocessor = RecconSpanExtractionPostprocessor()

# Model predict
input_batch = {
    "emotion": ["surprise", "surprise"],
    "target_utterance": [
        "Hi George ! It's good to see you !",
        "Hi George ! It's good to see you !",
    ],
    "evidence_utterance": [
        "Linda ? Is that you ? I haven't seen you in ages !",
        "Hi George ! It's good to see you !",
    ],
    "conversation_history": [
        "Linda ? Is that you ? I haven't seen you in ages ! Hi George ! It's good to see you !",
        "Linda ? Is that you ? I haven't seen you in ages ! Hi George ! It's good to see you !",
    ],
}

tensor_dict, evidences, examples, features = preprocessor(input_batch)
raw_output = model(**tensor_dict)
context, evidence_span, probability = postprocessor(
    raw_output, evidences, examples, features)
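The postprocessor above returns a probability alongside each extracted span. The exact scoring used inside sgnlp's postprocessor is not shown here, but a common way extractive QA models score a candidate span is to softmax the start and end logits independently and multiply the two probabilities. A minimal sketch, assuming that scheme:

```python
import math

def span_score(start_logits, end_logits, start, end):
    """Score a candidate span by multiplying the softmax probabilities of its
    start and end positions. Illustrative only; sgnlp's actual postprocessor
    may score spans differently."""
    def softmax(xs):
        m = max(xs)  # subtract the max for numerical stability
        exps = [math.exp(x - m) for x in xs]
        total = sum(exps)
        return [e / total for e in exps]
    return softmax(start_logits)[start] * softmax(end_logits)[end]

# With uniform logits over 4 tokens, every position has probability 0.25,
# so any (start, end) pair scores 0.25 * 0.25 = 0.0625.
print(span_score([0.0] * 4, [0.0] * 4, 1, 2))  # 0.0625
```

This kind of joint score lets you rank multiple candidate spans and keep only the most probable one per example.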


Training

The training and evaluation datasets were derived from the RECCON dataset. The full dataset can be downloaded from the authors' GitHub repository.

Training Results
  • Training Time: ~3 hours for 12 epochs on a single V100 GPU.

Model Parameters

  • Model Weights: link
  • Model Config: link
  • Model Inputs: Target utterance, emotion in the target utterance, evidence utterance, and conversational history.
  • Model Outputs: Array of start logits and array of end logits. These two arrays can be post-processed to determine the start and end of the causal span.
  • Model Size: ~411 MB
  • Model Inference Info: ~2 s on an Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz.
  • Usage Scenarios: Recognising emotion causes, e.g. for phone-support satisfaction analysis.
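To make the logits-to-span step concrete: one simple way to recover a span is to take the argmax of the start logits, then the best end position at or after it. This is a minimal sketch of that idea, not sgnlp's actual postprocessing code, which also handles subword tokenisation offsets:

```python
def extract_span(tokens, start_logits, end_logits):
    """Pick the most likely start token, then the most likely end token at or
    after it, and join the tokens in between. Simplified illustration; the
    real postprocessor maps logits back through tokenizer offsets."""
    start = max(range(len(start_logits)), key=lambda i: start_logits[i])
    end = max(range(start, len(end_logits)), key=lambda i: end_logits[i])
    return " ".join(tokens[start:end + 1])

tokens = ["I", "haven't", "seen", "you", "in", "ages", "!"]
start_logits = [0.1, 2.5, 0.3, 0.2, 0.1, 0.4, 0.0]
end_logits = [0.0, 0.1, 0.2, 0.3, 0.2, 3.1, 0.5]
print(extract_span(tokens, start_logits, end_logits))  # haven't seen you in ages
```

Constraining the end index to come at or after the start index guarantees a well-formed (non-empty, left-to-right) span.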

Other Information


License

MIT: https://choosealicense.com/licenses/mit


Model page: https://huggingface.co/aisingapore/SPANBert


Provider: aisingapore
