How to use somosnlp-hackathon-2022/roberta-base-bne-squad2-es with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="somosnlp-hackathon-2022/roberta-base-bne-squad2-es")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("somosnlp-hackathon-2022/roberta-base-bne-squad2-es")
model = AutoModelForQuestionAnswering.from_pretrained("somosnlp-hackathon-2022/roberta-base-bne-squad2-es")
```

This model is a fine-tuned version of PlanTL-GOB-ES/roberta-base-bne on the squad_es (v2) training dataset.
The hyperparameters were chosen based on those used for deepset/roberta-base-squad2, an English model trained for the same task:
```
--num_train_epochs 2
--learning_rate 3e-5
--max_seq_length 386
--doc_stride 128
```
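With those flags, the fine-tuning run would look roughly like the following, assuming the standard `run_qa.py` question-answering example script from the Transformers repository. Only the four hyperparameters above come from this card; the script name, dataset config, and remaining flags are assumptions, not taken from the model card:

```shell
# Hypothetical fine-tuning invocation (sketch only): the four hyperparameters
# are the ones listed above; everything else is an assumed default.
python run_qa.py \
  --model_name_or_path PlanTL-GOB-ES/roberta-base-bne \
  --dataset_name squad_es \
  --version_2_with_negative \
  --do_train \
  --do_eval \
  --num_train_epochs 2 \
  --learning_rate 3e-5 \
  --max_seq_length 386 \
  --doc_stride 128 \
  --output_dir ./roberta-base-bne-squad2-es
```

The `--version_2_with_negative` flag tells the evaluation to account for unanswerable questions, which SQuAD v2-style datasets include.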
Evaluated on the squad_es (v2) dev set:

```json
{
  "eval_exact": 62.13526733007252,
  "eval_f1": 69.38515019522332,
  "eval_HasAns_exact": 53.07017543859649,
  "eval_HasAns_f1": 67.57238714827123,
  "eval_HasAns_total": 5928,
  "eval_NoAns_exact": 71.19730185497471,
  "eval_NoAns_f1": 71.19730185497471,
  "eval_NoAns_total": 5930
}
```
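The overall scores are the support-weighted average of the answerable (HasAns) and unanswerable (NoAns) subset scores, which gives a quick sanity check on the numbers above (a minimal sketch; the figures are copied from the evaluation output):

```python
# Overall exact-match/F1 on SQuAD v2-style dev sets is the support-weighted
# mean of the HasAns and NoAns subset scores.
has_total, no_total = 5928, 5930
has_exact, no_exact = 53.07017543859649, 71.19730185497471
has_f1, no_f1 = 67.57238714827123, 71.19730185497471

total = has_total + no_total
overall_exact = (has_exact * has_total + no_exact * no_total) / total
overall_f1 = (has_f1 * has_total + no_f1 * no_total) / total

print(round(overall_exact, 4))  # ~62.1353, matching eval_exact
print(round(overall_f1, 4))     # ~69.3852, matching eval_f1
```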
Created by Santiago Maximo (smaximo).