Apple Releases 4 New Open-Source AI Models That Run On-Device

In Short
  • Apple releases a family of open-source large language models that run on-device rather than on cloud servers.
  • There are eight OpenELM models in total: four pre-trained using the CoreNet library and four instruction-tuned variants.
  • The LLMs are ready for developer use and can be downloaded from the Hugging Face Hub.

Apple doesn’t share much about its generative AI plans. Now, with the release of a family of open-source large language models, it seems the Cupertino tech giant wants AI to run locally on Apple devices. Apple’s LLMs, which the company calls OpenELM (Open-source Efficient Language Models), are designed to run on-device rather than on cloud servers. These LLMs are available on the Hugging Face Hub, a central platform for sharing AI code and datasets.

In its testing, Apple observed that OpenELM delivers performance comparable to other open language models while requiring less training data.

Image Courtesy: Hugging Face

As described in the white paper, there are eight OpenELM models in total. Four were pre-trained using the CoreNet library, while the other four are instruction-tuned variants. To improve overall accuracy and efficiency, Apple uses a layer-wise scaling strategy in these open-source LLMs.

“To this end, we release OpenELM, a state-of-the-art open language model. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy. For example, with a parameter budget of approximately one billion parameters, OpenELM exhibits a 2.36% improvement in accuracy compared to OLMo while requiring 2× fewer pre-training tokens.” – Apple
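To get a rough feel for what layer-wise scaling means, here is a toy sketch in Python. This is not Apple’s implementation; the interpolation ranges, model width, and head dimension below are made-up illustrative values. The idea is simply that per-layer capacity (attention-head count, feed-forward width) varies smoothly across the stack instead of being identical in every transformer layer:

```python
import math

def layerwise_config(num_layers, d_model, head_dim,
                     alpha=(0.5, 1.0), beta=(2.0, 4.0)):
    """Toy layer-wise scaling: linearly interpolate the attention-head
    scale (alpha) and the FFN width multiplier (beta) from the first
    layer to the last, instead of reusing one setting everywhere.
    All parameter values here are illustrative assumptions."""
    layers = []
    for i in range(num_layers):
        t = i / (num_layers - 1) if num_layers > 1 else 0.0
        a = alpha[0] + t * (alpha[1] - alpha[0])   # head-count scale
        b = beta[0] + t * (beta[1] - beta[0])      # FFN width multiplier
        n_heads = max(1, math.ceil(a * d_model / head_dim))
        ffn_dim = int(b * d_model)
        layers.append({"layer": i, "n_heads": n_heads, "ffn_dim": ffn_dim})
    return layers

cfg = layerwise_config(num_layers=4, d_model=512, head_dim=64)
for layer in cfg:
    print(layer)
```

With this scheme, early layers stay small while later layers get more heads and wider feed-forward blocks, so a fixed parameter budget is spent where it helps most rather than spread uniformly.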

Apple didn’t just provide the final trained model. It has also released the code, training logs, and multiple checkpoints of the model. The project’s researchers are optimistic that this will accelerate advancements and deliver “trustworthy results” in natural language AI.

“Diverging from prior practices that only provide model weights and inference code, and pre-train on private datasets, our release includes the complete framework for training and evaluation of the language model on publicly available datasets, including training logs, multiple checkpoints, and pre-training configurations. We also release code to convert models to MLX library for inference and fine-tuning on Apple devices. This comprehensive release aims to empower and strengthen the open research community, paving the way for future open research endeavors.” – Apple

Apple further added that the release of the OpenELM models will “empower and enrich the open research community” with state-of-the-art language models. The open-source models allow researchers to explore potential risks, data, and biases inherent in the models. The models are ready for developer use: developers can use these open-source LLMs as-is or modify them as needed.
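As an example of what “ready for developer use” could look like, a checkpoint on the Hugging Face Hub can typically be pulled with the `transformers` library. The sketch below is a hypothetical illustration, not an official Apple recipe: the repo id, the `trust_remote_code` flag, and the tokenizer choice are assumptions to verify against the actual model cards.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id -- check the model cards on the Hugging Face Hub
# for the exact names of the eight OpenELM checkpoints.
REPO_ID = "apple/OpenELM-270M"

def generate(prompt: str, max_new_tokens: int = 32) -> str:
    """Download a checkpoint from the Hub and generate a short completion."""
    # trust_remote_code is an assumption: repos that ship custom modeling
    # code require it, and the model card will say whether it is needed.
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, trust_remote_code=True)
    # The model card may point to a separate tokenizer repo; reusing the
    # model repo here is an assumption.
    tokenizer = AutoTokenizer.from_pretrained(REPO_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Note that the first call to `generate(...)` downloads the model weights (hundreds of megabytes), after which they are cached locally, which is what makes fully on-device inference possible.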

Back in February, Apple’s CEO Tim Cook teased that Generative AI features are coming to Apple devices later this year. Sometime after that, he reiterated that the company is working hard to deliver ground-breaking AI experiences.

Apple has released several other AI models previously. Unfortunately, it hasn’t brought generative AI capabilities to its devices yet. That said, the upcoming iOS 18 is expected to include a suite of new AI features, and the OpenELM release may be the latest piece of Apple’s behind-the-scenes preparation.

Recently, Mark Gurman also reported that iOS 18’s AI features will be largely powered by an on-device large language model, for privacy and speed benefits. We will know more when Apple announces iOS 18 and its other software upgrades at WWDC on June 10.
