The latest JAIS large language model (LLM), JAIS 70B, was released today by Inception, a G42 company specialising in the development of advanced AI models and applications, all provided as a service.

A 70-billion-parameter model, JAIS 70B is built for developers of Arabic-based natural-language processing (NLP) solutions and promises to accelerate the integration of generative AI services across industries, enhancing capabilities in customer service, content creation, and data analysis.

The company has also unveiled a comprehensive suite of JAIS foundation and fine-tuned models: 20 models across eight sizes, ranging from 590M to 70B parameters, trained on up to 1.6T tokens of Arabic, English, and code data, with variants fine-tuned specifically for chat applications.

This extensive release spans a breadth of tools, including the first Arabic-centric model small enough to run on a laptop, offering both small, compute-efficient models for targeted applications and larger model sizes for enterprise-grade precision.
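For developers, working with one of the smaller chat-tuned models would typically look like any other Hugging Face Transformers workflow. The sketch below is illustrative only and is not taken from the announcement: the repository name "inceptionai/jais-family-590m-chat" is an assumption for the 590M chat variant, and the exact model IDs should be confirmed on the official model hub listing.

```python
# Minimal sketch: loading an assumed small JAIS chat model with Hugging Face Transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inceptionai/jais-family-590m-chat"  # assumed repository name; verify on the model hub

# JAIS models ship custom modeling code, hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "اكتب رسالة ترحيب قصيرة لعملائنا."  # "Write a short welcome message for our customers."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At the 590M scale, a sketch like this could plausibly run on CPU-only laptop hardware, which is the point of the smaller, compute-efficient end of the family.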

Dr. Andrew Jackson, CEO of Inception, said, "AI is now a proven value-adding force, and large language models have been at the forefront of the AI adoption spike. JAIS was created to preserve Arabic heritage, culture, and language and to democratise access to AI. Releasing JAIS 70B and this new family of models reinforces our commitment to delivering the highest-quality AI foundation model for Arabic-speaking nations."

Neha Sengupta, Principal Applied Scientist at Inception, stated, "For models up to 30 billion parameters, we successfully trained JAIS from scratch, consistently outperforming adapted models in the community. However, for models with 70 billion parameters and above, the computational complexity and environmental impact of training from scratch were significant."