With the release of iOS 26, developers are actively implementing Apple's local AI models through the Foundation Models framework. The framework gives apps access to on-device AI capabilities at no inference cost, with support for features like guided generation and tool calling. While these models are smaller than those from OpenAI and Google, they are well suited to practical features that make everyday apps easier to use.
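As a rough sketch of what the basic API looks like, the snippet below creates a session with the default on-device model and asks for a single text response. The helper function and prompt wording are illustrative, not taken from any shipping app; the core types, LanguageModelSession and respond(to:), are the ones Apple introduced with the framework.

```swift
import FoundationModels

// Minimal sketch: ask the default on-device model for a single response.
// Requires a device with Apple Intelligence enabled.
func suggestEntryTitle(for entry: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You suggest short, descriptive titles for journal entries."
    )
    let response = try await session.respond(to: entry)
    return response.content
}
```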
Apps are already using these capabilities to add new features. Lil Artist creates stories from selected characters and themes. Day One provides AI-generated entry titles and prompts for deeper search. Crouton suggests recipe tags, names timers, and offers step-by-step guides. SignEasy summarises contracts by extracting key points. Tasks suggests tags, identifies recurring patterns, and converts voice notes into to-do lists. MoneyCoach offers spending insights and auto-categorises expenses. Daylish experiments with automatic emoji suggestions for timeline events. Language learning apps generate word examples and quizzes.
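Many of these features map naturally onto guided generation, where the model fills in a Swift type instead of returning free-form text. The sketch below is hypothetical (the RecipeSuggestion type and prompt are invented for illustration, loosely in the spirit of Crouton's tag and timer suggestions), but it follows the @Generable and @Guide pattern the framework uses.

```swift
import FoundationModels

// Hypothetical guided-generation example for recipe tagging.
@Generable
struct RecipeSuggestion {
    @Guide(description: "Three to five short tags, e.g. 'vegetarian', 'weeknight'")
    var tags: [String]

    @Guide(description: "A concise, friendly name for a cooking timer")
    var timerName: String
}

func suggestMetadata(for recipeText: String) async throws -> RecipeSuggestion {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest tags and a timer name for this recipe:\n\(recipeText)",
        generating: RecipeSuggestion.self
    )
    return response.content
}
```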
Because they run entirely on device, these features protect user privacy and keep latency low while adding smart touches to everyday apps. As iOS 26 adoption grows, more apps are expected to integrate Apple's models. The focus is on small, targeted language and vision models optimised for the Neural Engine, enabling tasks like classification, entity extraction, and intent recognition without sending data to the cloud.
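Since everything runs locally, apps typically gate these features on model availability rather than on a network connection. A hedged sketch of that check, assuming the SystemLanguageModel availability API as described in Apple's framework documentation:

```swift
import FoundationModels

// Sketch: only surface AI features when the on-device model is usable.
// Unavailability reasons include unsupported hardware or Apple Intelligence
// being turned off.
func isLocalModelReady() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        return true
    case .unavailable(let reason):
        print("On-device model unavailable: \(reason)")
        return false
    }
}
```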