Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate two setups: MAD-X+Domain and MAD-X².
			
	
AI & ML interests: Parameter-Efficient Fine-Tuning
		
		
 
Adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
💻 Website • 📚 Documentation • 📜 Paper • 🧪 Notebook Tutorials
Adapters is an add-on library to HuggingFace's Transformers, integrating various adapter methods into state-of-the-art pre-trained language models with minimal coding overhead for training and inference.
pip install adapters
🤗 Hub integration: https://docs.adapterhub.ml/huggingface_hub.html
Models (505)

- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-wiki
- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-news
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-english
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-chinese
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-wiki
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-news
- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-product-reviews
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-product-reviews
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-creative-writing
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-turkish
Datasets (0)

None public yet