How to use OtterDev/otterchat2 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("question-answering", model="OtterDev/otterchat2")

# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("OtterDev/otterchat2")
model = AutoModelForQuestionAnswering.from_pretrained("OtterDev/otterchat2")

OtterChat 2 is a brand-new version of OtterChat with new features!
This model can be used to extract data from text, such as an essay.
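The pipeline above wraps an extractive question-answering head: the model scores every context token as a possible answer start and end, and the returned answer is the span with the highest combined score. A minimal sketch of that span-selection step, using made-up tokens and logits rather than real model output:

```python
# Sketch of extractive QA span selection. The tokens and logits
# below are illustrative placeholders, not real model output.
tokens = ["The", "essay", "describes", "a", "brown", "otter"]
start_logits = [0.1, 0.2, 0.1, 0.3, 2.5, 0.4]
end_logits = [0.0, 0.1, 0.2, 0.1, 0.3, 2.8]

# Pick the (start, end) pair with the highest summed score,
# requiring end >= start so the span is valid.
best = max(
    ((s, e) for s in range(len(tokens)) for e in range(s, len(tokens))),
    key=lambda span: start_logits[span[0]] + end_logits[span[1]],
)
answer = " ".join(tokens[best[0] : best[1] + 1])
print(answer)  # "brown otter"
```

In the real pipeline this selection runs over the model's start/end logits for the tokenized question-plus-context input, and the winning token span is mapped back to the original text.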
[More Information Needed]
The main limitation of this model is that it needs input data (a context passage) in order to work. It is also quite slow when translating languages. More limitations will be posted as they are found.
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
Select "Deploy" and choose "Inference API" to get started.