Sunday, November 02, 2025

Getting started with Apple's tiny AI Foundation model

Recently I asked ChatGPT how to clean creosote stains from the glass on a wood stove. It did know the trick I'd just learned about: using ammonia. I thought I'd see if Apple's tiny local Foundation model knew it too, so I've built a minimal app that lets me ask questions and shows the answers.


It's a pretty good response, but it didn't know the ammonia trick. Making an app that uses Apple's Foundation model is super easy. Obviously you need to be on macOS 26 or iOS 26 and have Apple Intelligence enabled. Here's the code.

//
//  ContentView.swift
//  FoundationPlay
//
//  Created by Peter Marks on 31/10/2025.
//

import SwiftUI
import FoundationModels

struct ContentView: View {
    @State private var userInput = ""
    @State private var response = AttributedString("")
    @State private var isLoading = false

    var body: some View {
        VStack {
            TextField("Prompt", text: $userInput)
                .onSubmit {
                    Task {
                        await generateResponse()
                    }
                }
            HStack {
                Spacer()
                Button("Ask") {
                    Task {
                        await generateResponse()
                    }
                }
            }
            if isLoading {
                ProgressView()
            }

            ScrollView {
                Text(response)
                    .frame(maxWidth: .infinity, alignment: .leading)
            }
        }
        .padding()
    }

    private func generateResponse() async {
        isLoading = true
        defer { isLoading = false }

        do {
            let session = LanguageModelSession()
            let prompt = Prompt(userInput)
            let result = try await session.respond(to: prompt)
            response = attributedMarkdown(markdown: result.content)
        } catch {
            response = attributedMarkdown(markdown: "Error: \(error.localizedDescription)")
        }
    }

    func attributedMarkdown(markdown: String) -> AttributedString {
        do {
            return try AttributedString(
                markdown: markdown,
                options: AttributedString.MarkdownParsingOptions(
                    interpretedSyntax: .inlineOnlyPreservingWhitespace
                )
            )
        } catch {
            return AttributedString("Error parsing markdown")
        }
    }
}
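Since the app only works with Apple Intelligence switched on, a real version should check whether the model is usable before offering the prompt field. Here's a sketch of how that might look, based on my reading of the framework's SystemLanguageModel availability API (I haven't wired this into the app above):

```swift
import FoundationModels

// Sketch: check whether the on-device model can actually be used.
// SystemLanguageModel.default.availability reports .available or
// .unavailable with a reason (device not eligible, Apple Intelligence
// not enabled, model assets not yet downloaded, etc.).
func modelStatusMessage() -> String {
    switch SystemLanguageModel.default.availability {
    case .available:
        return "Model is ready"
    case .unavailable(let reason):
        return "Model unavailable: \(reason)"
    }
}
```

In a SwiftUI app you'd branch on this in the view body and show an explanatory message instead of the prompt field when the model isn't available.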


An amazingly small amount of code is needed, and much of it is just to display the returned Markdown nicely. I wonder why they haven't bolted this onto Siri already?
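If you want the model to answer in a particular style, the session can also be created with standing instructions that shape every reply. A sketch, assuming the LanguageModelSession(instructions:) initialiser (the wording of the instructions here is just an example):

```swift
import FoundationModels

// Sketch: a session with standing instructions. The instructions apply
// to every prompt sent through this session.
func askWithInstructions(_ question: String) async throws -> String {
    let session = LanguageModelSession(instructions: """
        You are a terse assistant for household maintenance questions. \
        Answer in a few short sentences.
        """)
    let result = try await session.respond(to: Prompt(question))
    return result.content
}
```

Keeping one session around between questions also gives the model the conversation history, so follow-up questions work.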

