Instructions to use Data-Lab/moderation_layer with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Data-Lab/moderation_layer with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Data-Lab/moderation_layer")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Data-Lab/moderation_layer")
model = AutoModelForSequenceClassification.from_pretrained("Data-Lab/moderation_layer")
```
- Notebooks
- Google Colab
- Kaggle
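The pipeline above returns a label and a confidence score per input. As a minimal sketch of the post-processing it performs, the snippet below applies a softmax to raw logits and picks the highest-scoring class. The `id2label` mapping here is a placeholder; the real one for Data-Lab/moderation_layer comes from `model.config.id2label`, and the logit values are dummy data so no model download is needed.

```python
import math

# Hypothetical label mapping -- the real one is model.config.id2label
id2label = {0: "LABEL_0", 1: "LABEL_1"}

def logits_to_prediction(logits):
    # Softmax: exponentiate, then normalize to probabilities
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Argmax picks the most likely class index
    idx = max(range(len(probs)), key=probs.__getitem__)
    return {"label": id2label[idx], "score": probs[idx]}

# Dummy logits standing in for a real forward pass
result = logits_to_prediction([0.2, 2.3])
```

With a real checkpoint, `pipe("some text")` produces the same shape of output, e.g. `[{"label": ..., "score": ...}]`.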
- Xet hash: `50661c83a5be7cb2e5e8dc3e91d683ca17e183f247ebd80bd0f3723d5c00b03d`
- Size of remote file: 117 MB
- SHA256: `b71b792a81c743b86f262f24dbe77eea184a0c27510709bef79bcf2ab9067ac0`
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which accelerates uploads and downloads.
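The chunk-based storage described above can be illustrated with a toy deduplicating store. This is only a sketch, not Xet's actual algorithm: real Xet uses content-defined (rolling-hash) chunk boundaries, while this example uses fixed-size chunks keyed by their SHA-256 digest, so repeated content is stored only once.

```python
import hashlib

def split_into_chunks(data: bytes, size: int = 4) -> list:
    """Fixed-size chunking -- a simplification of content-defined chunking."""
    return [data[i:i + size] for i in range(0, len(data), size)]

store = {}  # content hash -> chunk bytes (the deduplicated storage)

def add_file(data: bytes) -> list:
    """Store a file as a list of chunk hashes; duplicate chunks are stored once."""
    keys = []
    for chunk in split_into_chunks(data):
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # only stored if not already present
        keys.append(key)
    return keys

def read_file(keys: list) -> bytes:
    """Reassemble a file from its chunk hashes."""
    return b"".join(store[k] for k in keys)

keys = add_file(b"abcdabcdabcd")  # three identical 4-byte chunks
```

Here the 12-byte input is recorded as three references to a single stored chunk, which is the core idea behind the dedup-driven upload and download acceleration.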