Instructions to use Fralet/DDeduPModelv7 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Fralet/DDeduPModelv7 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Fralet/DDeduPModelv7")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Fralet/DDeduPModelv7")
model = AutoModelForCausalLM.from_pretrained("Fralet/DDeduPModelv7")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use Fralet/DDeduPModelv7 with vLLM:
Install with pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Fralet/DDeduPModelv7"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fralet/DDeduPModelv7",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker
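The same completion request can also be issued from Python instead of curl. A minimal sketch using only the standard library, mirroring the payload and endpoint of the curl call above (sending the request requires the vLLM server to be running, so the actual send is left commented out):

```python
import json
from urllib import request

# Same request body as the curl example
payload = {
    "model": "Fralet/DDeduPModelv7",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5,
}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    "http://localhost:8000/v1/completions",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the vLLM server running, uncomment to send the request:
# with request.urlopen(req) as resp:
#     completion = json.load(resp)
#     print(completion["choices"][0]["text"])
```

Because the server exposes an OpenAI-compatible API, any OpenAI-style client pointed at `http://localhost:8000/v1` should work as well.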
```shell
docker model run hf.co/Fralet/DDeduPModelv7
```
- SGLang
How to use Fralet/DDeduPModelv7 with SGLang:
Install with pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Fralet/DDeduPModelv7" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fralet/DDeduPModelv7",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Fralet/DDeduPModelv7" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fralet/DDeduPModelv7",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
- Unsloth Studio
How to use Fralet/DDeduPModelv7 with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)
```shell
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for Fralet/DDeduPModelv7 to start chatting
```
Install Unsloth Studio (Windows)
```shell
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for Fralet/DDeduPModelv7 to start chatting
```
Use Hugging Face Spaces for Unsloth
```shell
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for Fralet/DDeduPModelv7 to start chatting
```
Load model with FastModel
```shell
pip install unsloth
```
```python
from unsloth import FastModel

model, tokenizer = FastModel.from_pretrained(
    model_name="Fralet/DDeduPModelv7",
    max_seq_length=2048,
)
```
- Docker Model Runner
How to use Fralet/DDeduPModelv7 with Docker Model Runner:
```shell
docker model run hf.co/Fralet/DDeduPModelv7
```
Overview
DDeduPModelv7 is a fine-tuned version of Meta-Llama-3-8B, optimized for educational tasks in Kazakhstan and internationally. The model excels at educational text generation, multi-label classification, and mapping courses to relevant learning-outcome codes across academic disciplines such as environmental science, pedagogy, pharmacology, ecology, IT, psychology, geodesy, art, linguistics, agriculture, geology, land management, and mathematics.
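Because the mapping task is multi-label, the model's generated text must be converted back into discrete outcome codes downstream. A minimal post-processing sketch, assuming a hypothetical output format in which the model emits codes like `LO-<digits>` (both the pattern and the example string are illustrative, not taken from the dataset):

```python
import re

def parse_outcome_codes(generated: str) -> list[str]:
    """Extract unique learning-outcome codes from generated text.

    Assumes a hypothetical 'LO-<digits>' code scheme; adjust the
    regex to match whatever scheme your outcome catalogue uses.
    """
    codes = re.findall(r"\bLO-\d+\b", generated)
    seen: set[str] = set()
    ordered: list[str] = []
    for code in codes:
        if code not in seen:  # de-duplicate, keep first-seen order
            seen.add(code)
            ordered.append(code)
    return ordered

# Illustrative model output for a course description
sample = "Relevant outcomes: LO-12, LO-7, LO-12, and possibly LO-3."
print(parse_outcome_codes(sample))  # → ['LO-12', 'LO-7', 'LO-3']
```

Keeping first-seen order preserves any ranking implicit in the model's output while removing duplicate labels.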
Background & Motivation
The development of DDeduPModelv7 stems from the need to automate and streamline curriculum alignment in higher education. Traditional methods of mapping courses to learning outcomes are time-intensive and prone to inconsistencies, especially across multidisciplinary programs. By leveraging a large dataset of pre-mapped educational examples, this model addresses challenges in fields like environmental science and pedagogy, where ensuring that course content contributes to broader program competencies is crucial for accreditation, assessment, and quality assurance. The motivation is to provide educators, curriculum developers, and institutions with an AI-assisted tool that promotes accurate, efficient alignment, ultimately enhancing educational planning and student outcome tracking in diverse academic contexts.
Training & Methodology
- Base Model: unsloth/llama-3-8b-bnb-4bit
- Fine-Tuning Framework: The model was fine-tuned using Hugging Face's TRL library in conjunction with Unsloth, making the training process approximately 2x faster.
- Dataset & Techniques: Trained on Fralet/DDeduPDatasetv2, a comprehensive collection of 137,226 examples in Kazakh, Russian, and English.
Intended Use Cases
- Curriculum Alignment: Automatically map university courses to program learning outcomes, aiding in syllabus development and accreditation processes.
- Tool Integration: Embed in software for educators to quickly assess program gaps.
- Research Support: Assist researchers in studying interdisciplinary education by providing structured data on course-outcome relationships.
- Job Market Alignment: Serve as a foundational tool for aligning educational program development with job-market needs.
Ethical Considerations & Limitations
This model should be used to support, not replace, human expertise in education. Ensure outputs are reviewed for institutional compliance, as automated mappings may overlook nuanced contextual factors. Promote fair access to educational tools, and consider biases in the dataset's focus on specific disciplines or regions (e.g., Kazakhstani contexts).
The model is tailored to English, Russian, and Kazakh and to educational tasks; adapting it to other languages or domains may require further fine-tuning. Potential biases from dataset sources (e.g., over-representation of certain fields) could affect fairness in diverse applications.
Model tree for Fralet/DDeduPModelv7
- Base model: meta-llama/Meta-Llama-3-8B