setfit
Few-shot learning tool
A framework for efficient few-shot learning with Sentence Transformers (see the usage sketch below)
2k stars
22 watching
223 forks
Language: Jupyter Notebook
Last commit: 6 months ago
Linked from 1 awesome list
few-shot-learning, nlp, sentence-transformers
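
As context for the listing above, here is a minimal sketch of how setfit is typically used: fine-tune a pre-trained Sentence Transformer on a handful of labeled examples, then classify unseen text. Class names follow recent setfit releases (`SetFitModel`, `Trainer`, `TrainingArguments`; older versions exposed `SetFitTrainer` instead), and the tiny inline dataset is purely illustrative:

```python
# Minimal SetFit sketch: few-shot text classification with a handful of
# labeled examples. The dataset contents below are illustrative only.
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

train_dataset = Dataset.from_dict({
    "text": [
        "I loved this movie!",
        "Absolutely fantastic experience",
        "Terrible, would not recommend",
        "The worst film I have seen",
    ],
    "label": [1, 1, 0, 0],  # 1 = positive, 0 = negative
})

# Start from a pre-trained Sentence Transformer checkpoint.
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2"
)

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()

# Predict labels for unseen sentences.
preds = model.predict(["A delightful watch", "Painfully boring"])
print(preds)
```

Under the hood, setfit first fine-tunes the sentence-embedding body with contrastive pairs generated from the few labeled examples, then fits a classification head on the resulting embeddings, which is what makes it data-efficient.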
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | An implementation of the Optimization as a Model for Few-Shot Learning paper in PyTorch | 255 |
| | Code for a few-shot transfer learning approach to personalized and federated image classification | 11 |
| | A pretraining framework for large language models using 3D parallelism and scalable training techniques | 1,332 |
| | An implementation of the Learning to Compare paper in PyTorch | 251 |
| | A collection of pre-trained models and code for training paraphrastic sentence embeddings from large machine translation datasets | 102 |
| | A neural network model for learning semantic representations from multiple natural language processing tasks | 1,191 |
| | An implementation of federated few-shot learning in Python with the PyTorch framework | 18 |
| | A Go package providing an easy interface to pre-trained NLP models from the HuggingFace repository for tasks like text classification and machine translation | 293 |
| | An implementation of OpenAI's transformer language model in PyTorch with pre-trained weights and fine-tuning capabilities | 1,511 |
| | Swift implementations of popular transformer-based models for question answering and text generation | 1,618 |
| | Converts popular transformer models to run on Android devices for efficient inference and generation tasks | 396 |
| | A Python-based few-shot learning framework for medical applications built on a visual language model | 396 |
| | A PyTorch quantization backend for models | 847 |
| | A minimalistic codebase for training and interacting with NLG models from HuggingFace Transformers using PyTorch Lightning | 3 |
| | An implementation of a few-shot learning method that uses cascade models to generate strong learners from text prompts | 41 |