Apple open sources code behind its 200B-parameter LLM

September 11, 2023

Apple has open sourced AXLearn, the framework it uses to train its 200-billion-parameter LLM, Ajax GPT. The code is available on GitHub and can be used by anyone to train their own large language models.

This marks a departure from Apple’s traditionally closed approach to AI development, for which it has been criticized. According to a report by The Information, Ajax GPT is being trained with 200 billion parameters, potentially surpassing GPT-3.5.

The open sourcing of AXLearn could help to foster innovation in the AI research community, as other researchers can now build on Apple’s work. It could also help to attract top talent to Apple, as engineers will be more likely to want to work for a company that is open and collaborative.

AXLearn is designed to facilitate the rapid training of machine-learning models and leverages Google TPUs. The release invites comparisons to other openly released models, such as Meta’s Llama 2.
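AXLearn is built on JAX, whose compiled functions run unchanged on CPUs, GPUs, or Google TPUs. A minimal sketch of the pattern such frameworks build on, assuming a toy linear-regression task; none of these function or variable names come from the AXLearn API:

```python
import jax
import jax.numpy as jnp

# Hypothetical sketch: a jit-compiled training step, the building block
# that JAX-based frameworks like AXLearn scale up to TPU pods.

def loss(params, x, y):
    w, b = params
    pred = x @ w + b
    return jnp.mean((pred - y) ** 2)  # mean squared error

@jax.jit  # XLA-compiles the step for the available backend (TPU, GPU, or CPU)
def train_step(params, x, y, lr=0.1):
    grads = jax.grad(loss)(params, x, y)
    # Apply one step of gradient descent to every parameter leaf
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# Fit y = 2x + 1 on synthetic data
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (64, 1))
y = 2.0 * x[:, 0] + 1.0
params = (jnp.zeros((1,)), jnp.zeros(()))
for _ in range(200):
    params = train_step(params, x, y)
```

The same `train_step` runs on a TPU without code changes, which is the portability AXLearn-style frameworks exploit.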

Apple has not released any details about the model’s potential applications. However, it is possible that it could be used to improve the performance of Apple’s products, such as Siri and the Photos app.

The sources for this piece include an article in AnalyticsIndiaMag.

Jim Love

Jim is an author and podcast host with over 40 years in technology.
