
Kawn Foundation Models

Kawn’s foundation language models focus on achieving deep understanding of Arabic texts across various styles and contexts. At Kawn, we develop a range of models varying in size and use, from lightweight models suitable for resource-constrained devices, to advanced models built with Mixture of Experts (MoE) technology that deliver high performance while maintaining resource efficiency.
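To illustrate why a Mixture of Experts layer can stay resource-efficient at high capacity, here is a minimal, hypothetical sketch of top-k expert routing. The sizes, weights, and routing scheme below are illustrative assumptions, not Kawn's actual configuration: per token, only `top_k` of `n_experts` feed-forward blocks are evaluated, so compute scales with the experts used rather than the total parameter count.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only (not Kawn's real configuration).
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
           for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) / np.sqrt(d_model)

def moe_layer(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ router                   # score every expert for this token
    idx = np.argsort(logits)[-top_k:]     # keep only the top-k experts
    weights = np.exp(logits[idx])
    weights /= weights.sum()              # softmax over the chosen experts
    # Only top_k of n_experts matrices are evaluated, so per-token compute
    # scales with top_k -- the source of MoE's resource efficiency.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, idx))

token = rng.standard_normal(d_model)
out = moe_layer(token)
print(out.shape)  # (16,)
```

The total parameter count grows with `n_experts`, but each token touches only `top_k` experts, which is how an MoE model can offer large capacity at a modest inference cost.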

Our model lineup includes:

Domain-Specific Language Models

A comprehensive AI architecture designed to support Arabic language understanding at multiple levels, from textual and visual comprehension to semantic embeddings.

| Model | Size | Best For | Architecture |
|---|---|---|---|
| Kuwain | Small | Focused tasks, on-device apps | Standard Transformer |
| Kawn Medium | Medium | General-purpose applications | Standard Transformer |
| Kawn–MoE | Large | Knowledge-intensive domains | Mixture of Experts (MoE) |