Apple Unveils OpenELM AI Models for On-Device Use

Joining the likes of Microsoft and Google, Apple has launched OpenELM, a family of open-source large language models (LLMs) tailored for on-device use (via VentureBeat).


Announced recently on the AI code community platform Hugging Face, OpenELM features compact models optimized for efficient text generation tasks. The collection comprises eight models in total, with four pre-trained and four instruction-tuned variants.

These models range in parameter sizes from 270 million to 3 billion, with more parameters generally indicating greater capabilities and performance.
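For readers who want to experiment, the checkpoints are published on Hugging Face and can, in principle, be loaded with the transformers library. The sketch below is illustrative only: the model ID, the tokenizer pairing, and the use of trust_remote_code are assumptions based on the public listing, not official instructions from Apple.

```python
# Minimal sketch: loading an OpenELM checkpoint from Hugging Face with transformers.
# The model ID and tokenizer pairing below are assumptions, not verified guidance.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"   # assumed naming: parameter size + variant
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed compatible tokenizer

# trust_remote_code lets the repository's own modeling code define the architecture
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)

prompt = "Once upon a time there was"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```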

Pre-training lays the foundation for an LLM to produce coherent text, but because it optimizes for next-token prediction, the resulting model often gives generalized completions. Instruction tuning, by contrast, refines the model so its outputs are more relevant to a user's request.

For instance, a pre-trained model might respond to “teach me how to bake bread” with a generic completion such as “in a home oven,” whereas an instruction-tuned model would produce detailed step-by-step instructions, as IBM explains.
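A rough way to see this difference is to send the same prompt to a base checkpoint and to its instruct counterpart. The sketch below assumes the checkpoints follow the apple/OpenELM-450M and apple/OpenELM-450M-Instruct naming and accept a Llama-2 tokenizer; the actual outputs will vary and need not match the bread-baking example above.

```python
# Illustrative sketch of the base vs. instruction-tuned distinction:
# the same prompt is sent to a pre-trained checkpoint and its instruct variant.
# Model and tokenizer IDs are assumptions based on the Hugging Face listing.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # assumed pairing
prompt = "Teach me how to bake bread"

for model_id in ("apple/OpenELM-450M", "apple/OpenELM-450M-Instruct"):  # assumed IDs
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=80)
    print(f"--- {model_id} ---")
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice, the base model tends to continue the text as a plain prediction task, while the instruct variant is more likely to answer the request directly.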

Apple has made the weights of its OpenELM models available under a “sample code license.” This license permits commercial use and modification but requires retaining specific notices and disclaimers when redistributing the software.

The company, however, cautions that the models come without safety guarantees and could produce inaccurate or objectionable outputs.


Apple’s listing on Hugging Face indicates a focus on on-device applications for OpenELM, aligning with strategies employed by rivals Google, Samsung, and Microsoft. Microsoft recently unveiled its Phi-3 Mini model designed for smartphone-based operations.

Benchmarks for Apple’s OpenELM were conducted on various devices, including an Intel i9-13900KF workstation and an Apple MacBook Pro equipped with an M2 Max system-on-chip.

Results shared by Apple indicate that the OpenELM models, particularly the 450-million-parameter instruct variant, deliver commendable performance on text generation tasks.
