Instructions to use facebook-llama/custom_code with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use facebook-llama/custom_code with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook-llama/custom_code")
model = AutoModel.from_pretrained("facebook-llama/custom_code")
```
- Notebooks
- Google Colab
- Kaggle
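The Transformers snippet above loads a bare `AutoModel`, which returns hidden states rather than task-specific logits; a common way to turn those into one embedding per input is masked mean pooling. Below is a minimal sketch of that pooling step, using dummy NumPy arrays in place of the model's real `last_hidden_state` and the tokenizer's `attention_mask` (the function name and shapes are illustrative, not part of the Transformers API):

```python
import numpy as np

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padding positions.

    last_hidden_state: (batch, seq_len, hidden)
    attention_mask:    (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid divide-by-zero
    return summed / counts

# Dummy stand-ins for model output and tokenizer mask
hidden = np.ones((1, 3, 4))
mask = np.array([[1, 1, 0]])
embedding = mean_pool(hidden, mask)  # shape (1, 4)
```

With a real model, `last_hidden_state` would come from `model(**tokenizer(text, return_tensors="pt")).last_hidden_state`. Note also that repositories shipping custom modeling code may require passing `trust_remote_code=True` to `from_pretrained`.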
- Xet hash: 433e9584056e5d2ef3de3d6357dfd7ca14cbc4d678686f06b6c4812d0922b62c
- Size of remote file: 1.42 GB
- SHA256: fd413c1d0f38ef80869b4fb2bb09a03e489c8e94636265f3975c032cb0c4913d
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, deduplicating storage and accelerating uploads and downloads.
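Xet's actual chunking scheme is not documented here, but the general idea of content-addressed chunk storage can be illustrated with a toy sketch: split data into chunks, key each chunk by its hash, and store duplicate chunks only once. This uses fixed-size chunks for simplicity, whereas a real system like Xet uses content-defined, variable-size chunks; all names here are illustrative.

```python
import hashlib

CHUNK_SIZE = 4  # toy size; real systems use chunks on the order of KBs-MBs

def chunk_and_store(data, store):
    """Split data into fixed-size chunks and store each under its SHA-256.

    Returns the list of chunk hashes needed to reassemble the file;
    a chunk that already exists in the store is not stored again.
    """
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # dedup: keep only one copy per hash
        refs.append(key)
    return refs

def reassemble(refs, store):
    """Rebuild the original bytes from the ordered list of chunk hashes."""
    return b"".join(store[k] for k in refs)
```

Because repeated chunks map to the same hash, uploading a file that shares content with an earlier version only transfers the chunks that changed.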