AdapterHub Documentation
AdapterHub is a framework that simplifies the integration, training, and usage of adapters for Transformer-based language models, supporting a wide range of efficient fine-tuning methods. For a complete list of currently implemented methods, see the table in our repository.
The framework consists of two main components:
- An extension of Hugging Face's Transformers library that adds adapters to transformer models
- A central collection of pre-trained adapter modules

Currently, we support the PyTorch versions of all models listed on the Model Overview page.
Getting Started
Adapter Methods
Advanced
Supported Models
Citation
If you use _Adapters_ in your work, please consider citing our library paper: Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning
@inproceedings{poth-etal-2023-adapters,
title = "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning",
author = {Poth, Clifton and
Sterz, Hannah and
Paul, Indraneil and
Purkayastha, Sukannya and
Engl{\"a}nder, Leon and
Imhof, Timo and
Vuli{\'c}, Ivan and
Ruder, Sebastian and
Gurevych, Iryna and
Pfeiffer, Jonas},
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-demo.13",
pages = "149--160",
}
Additionally, for the predecessor adapter-transformers, the Hub infrastructure, and adapters uploaded by the AdapterHub team, please consider citing our initial paper: AdapterHub: A Framework for Adapting Transformers
@inproceedings{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Jonas Pfeiffer and
Andreas R\"uckl\'{e} and
Clifton Poth and
Aishwarya Kamath and
Ivan Vuli\'{c} and
Sebastian Ruder and
Kyunghyun Cho and
Iryna Gurevych},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020): Systems Demonstrations},
year={2020},
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-demos.7",
pages = "46--54",
}