Recently I decided to build an app that uses Apple’s proprietary AI framework, Foundation Models. I’ve worked with Anthropic and OpenAI models before, but those are accessed through API calls over the network. I wanted to experiment with local, on-device AI that works without a network connection, so I decided to pick up Apple’s AI and build a small application.
You can find my app here in the US: AI Color Assistant
Let’s dig in!
Apple Intelligence and Foundation Models represent a shift in how we can integrate AI into our applications. Unlike traditional cloud-based AI solutions, Apple’s Foundation Models framework brings powerful language models directly onto the device, offering instant responses with complete privacy and zero network calls. To explore this new capability, I built the macOS app ‘AI Color Assistant’, a menu bar app that generates color palettes from natural language descriptions. The result? An AI integration that feels native to macOS while delivering the speed and privacy users expect from Apple platforms.
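To give a flavor of what this looks like in code, here’s a minimal sketch of a palette request using the framework’s @Generable macro and a LanguageModelSession. The ColorPalette type, its fields, and the instructions string are my own illustration, not the app’s actual model:

```swift
import FoundationModels

// Hypothetical output type for illustration; the app's real model may differ.
@Generable
struct ColorPalette {
    @Guide(description: "A short, evocative name for the palette")
    var name: String

    @Guide(description: "Five colors as hex strings, e.g. #FF6B35")
    var hexColors: [String]
}

func generatePalette(from description: String) async throws -> ColorPalette {
    // Sessions run entirely on-device; no network calls are involved.
    let session = LanguageModelSession(
        instructions: "You are a color expert. Produce cohesive, usable palettes."
    )
    let response = try await session.respond(
        to: "Create a color palette for: \(description)",
        generating: ColorPalette.self
    )
    return response.content
}
```

Because the output shape is declared up front with @Generable, the response comes back as a fully typed ColorPalette; there’s no JSON parsing or prompt gymnastics to get structured data out of the model.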
What makes this implementation genuinely exciting is that there is no privacy-performance trade-off and no external API calls at all. Every AI call happens locally on the device’s Neural Engine, with zero network requests and zero data leaving the machine. Users can generate thousands of palettes without worrying about API costs, rate limits, or their design concepts being logged on remote servers. The speed is remarkable too: responses arrive in seconds with no network latency, because there is no network!
For developers, this means we can offer AI-powered features to our users without complicated backend infrastructure, API key management, or usage-based pricing models. The framework handles availability gracefully too, letting us guide users to enable the feature or explain why it’s unavailable on their hardware.
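Here’s a sketch of the kind of availability check that makes this possible; the messages are mine, but the SystemLanguageModel availability cases come from the framework:

```swift
import FoundationModels

// Gate the AI feature on model availability before offering it in the UI.
func availabilityMessage() -> String? {
    switch SystemLanguageModel.default.availability {
    case .available:
        // Safe to create a LanguageModelSession and generate palettes.
        return nil
    case .unavailable(.deviceNotEligible):
        return "This Mac's hardware doesn't support Apple Intelligence."
    case .unavailable(.appleIntelligenceNotEnabled):
        return "Turn on Apple Intelligence in System Settings to use AI features."
    case .unavailable(.modelNotReady):
        return "The on-device model is still downloading. Please try again shortly."
    case .unavailable(let reason):
        return "The model is unavailable: \(reason)"
    }
}
```

A non-nil message can then replace the generate button in the UI, so users always know why the feature is off and what to do about it.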
Apple Intelligence makes sophisticated language model interactions as straightforward as any other Apple API. The @Generable macro’s structured output approach feels distinctly Apple: elegant, type-safe, and remarkably simple to work with. For anyone building macOS or iOS apps, I’d encourage exploring Foundation Models now while the ecosystem is young. It works beautifully for tasks like natural language processing, content generation, intelligent suggestions, and contextual recommendations, all without the complexity of managing external AI services. AI Color Assistant demonstrates that powerful AI features can be simple to implement, respectful of user privacy, and delightful to use.
If you’re an app developer, check out Foundation Models and let me know what you think!