How to use RussianNLP/ruRoBERTa-large-rucola with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="RussianNLP/ruRoBERTa-large-rucola")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("RussianNLP/ruRoBERTa-large-rucola")
model = AutoModelForSequenceClassification.from_pretrained("RussianNLP/ruRoBERTa-large-rucola")
```
This is a fine-tuned version of RuRoBERTa-large for linguistic acceptability classification on the RuCoLA benchmark.
The hyperparameters used for finetuning are as follows:
- 5 training epochs (with early stopping based on validation MCC)
- Peak learning rate: 1e-5, linear warmup for 10% of total training time
- Weight decay: 1e-4
- Batch size: 32
- Random seed: 5
- Optimizer: torch.optim.AdamW
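Early stopping above selects the best checkpoint by validation MCC (Matthews correlation coefficient), the standard metric for CoLA-style acceptability tasks. As a minimal sketch of what that metric computes on binary labels (in practice you would typically use `sklearn.metrics.matthews_corrcoef`):

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Matthews correlation coefficient for binary labels (0/1).

    Ranges from -1 (total disagreement) through 0 (chance-level)
    to +1 (perfect prediction).
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Convention: MCC is 0 when any confusion-matrix margin is empty
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

MCC is preferred over accuracy here because acceptability datasets are class-imbalanced: a classifier that labels every sentence "acceptable" can score high accuracy but near-zero MCC.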