Omar Kamali
omarkamali
AI & ML interests
NLP & LLMs for low resource languages.
Recent Activity
Replied to their post about 1 hour ago
I just might have cracked tokenizer-free LLMs. No vocab, no softmax.
I'm training a 22M params LLM rn to test this "thing" and it's able to formulate coherent sentences 🤯
Bear in mind, this is a completely new, tokenizer-free LLM architecture with built-in language universality.
Check the explainer video to understand what's happening. Feedback welcome on this approach!
Replied to their post about 2 hours ago
Replied to their post about 4 hours ago