Apple’s 2026 Swift Student Challenge produced some of its most socially conscious code yet, with four Distinguished Winners using Apple’s developer tools — and a lot of AI assistance from Claude, among others — to build apps that address tremors, presentation anxiety, flood safety and music education. Apple showcased them in a feature story Thursday.
Swift Student Challenge winner apps bring great ideas to life
The annual Swift Student Challenge invites students worldwide to create original app playgrounds using Swift, Apple’s coding language. This year’s 350 winning submissions came from 37 countries and regions. Fifty Distinguished Winners will attend WWDC at Apple Park in June, where they’ll watch the keynote live and work directly with Apple engineers. The four winners Apple profiled built their apps from personal experience, and each leaned heavily on AI tools during development.
“The breadth of creativity we see in the Swift Student Challenge never ceases to amaze us,” said Susan Prescott, Apple’s vice president of Worldwide Developer Relations. “This year’s winners found remarkable ways to harness the power of Apple platforms, Swift, and AI tools to build app playgrounds that are as technically impressive as they are meaningful.”
Helping artists with tremors rediscover their passion

Photo: Apple
Gayatri Goundadkar, 20, drew inspiration from her grandmother in Pune, India, who loves Warli painting. It’s a centuries-old art form using basic geometric shapes. But age-related tremors made holding a brush too difficult for her to keep practicing the art.
Her granddaughter’s response was Steady Hands, an iPad app that uses Apple Pencil stabilization to filter out involuntary hand movements. Using Apple’s PencilKit and Accelerate frameworks, the app analyzes stroke data in real time, distinguishes intentional movement from tremor and removes the shaking component. Completed drawings are displayed in a personal 3D gallery.
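Apple hasn’t published how Steady Hands separates intentional movement from tremor, but a common starting point for this kind of stabilization is low-pass smoothing of the incoming stroke samples. Here’s a minimal sketch of that idea in Python for brevity (the app itself is built in Swift, and the function name and `alpha` parameter are illustrative, not the app’s actual code):

```python
def smooth_stroke(points, alpha=0.2):
    """Exponential moving average over (x, y) stroke samples.

    Slow, intentional motion passes through largely unchanged,
    while high-frequency tremor is damped. A lower alpha means
    stronger smoothing.
    """
    if not points:
        return []
    smoothed = [points[0]]
    px, py = points[0]
    for x, y in points[1:]:
        px = alpha * x + (1 - alpha) * px
        py = alpha * y + (1 - alpha) * py
        smoothed.append((px, py))
    return smoothed
```

A real implementation would run per-frame on live stroke data and likely adapt the smoothing strength to the measured tremor frequency, but the core filtering idea is the same.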
“I wanted them to feel like artists, not patients,” Goundadkar said. The Maharashtra Institute of Technology student used AI tools, including Claude, to help her work through SwiftUI concepts during development.
Coaching presenters in real time

Photo: Apple
Anton Baranov, 22, built his app Pitch Coach after his mother — a linguistics professor in Frankfurt, Germany — described watching talented students freeze up during presentations, losing their words and their confidence.
The app uses Apple’s Foundation Models framework to deliver real-time, personalized feedback during practice sessions. It flags filler words like “um” or “like” and tracks posture via AirPods. Baranov also used Claude Agent in Xcode 26 to localize the app into 20 languages.
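Pitch Coach’s internals aren’t public, but the filler-word piece can be illustrated simply: tokenize a speech transcript and count occurrences against a watch list. A toy sketch in Python (the word list and function name are assumptions, not the app’s code):

```python
import re

# Illustrative watch list; a real coach would tune this per language.
FILLER_WORDS = {"um", "uh", "like", "so"}

def count_fillers(transcript):
    """Return a count of each filler word found in the transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = {}
    for word in words:
        if word in FILLER_WORDS:
            counts[word] = counts.get(word, 0) + 1
    return counts
```

The hard part in practice is getting an accurate live transcript and deciding when “like” is filler versus a real verb, which is presumably where the on-device language model earns its keep.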
Since launching on the App Store in early March, Pitch Coach has racked up more than 6,000 organic downloads. Users have found creative applications beyond boardrooms, including rehearsing stand-up comedy and rap performances.
Navigating floods safely in real time

Photo: Apple
Karen-Happuch Peprah Henneh, a Ghanaian designer now studying interaction design at the California College of the Arts, built Asuo. The name means “flowing water” in Twi. It helps people escape flood zones safely. She shaped the app from personal memory: the catastrophic 2015 floods in Accra, Ghana, which left the country in mourning.
Asuo calculates rain intensity and uses a pathfinding algorithm informed by historical flood data to guide users along safe evacuation routes. It includes full VoiceOver support and a custom voice alert system built with AVSpeechSynthesizer, so no one is left behind during a crisis.
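Apple’s story doesn’t name Asuo’s pathfinding algorithm, but the core idea can be sketched as shortest-path search over a map grid in which cells flagged as flood-prone (from historical data) are impassable. A hypothetical breadth-first-search version in Python, not the app’s actual code:

```python
from collections import deque

def safe_route_length(grid, start, goal):
    """Shortest route on a grid that avoids flooded cells.

    grid[r][c] is True if that cell is flooded/impassable.
    Returns the number of steps to reach goal, or None if
    no safe route exists.
    """
    rows, cols = len(grid), len(grid[0])
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and not grid[nr][nc]):
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None
```

A production router would weight edges by risk and current rain intensity rather than treating cells as simply open or closed, but the avoid-the-danger-cells structure is the same.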
Henneh leaned on AI to handle the most complex technical implementation. “Something that would have taken me months to do was able to be done in three or four days,” she said.
In addition to coding, Henneh runs Radiance Girl Africa. It’s a nonprofit working to bring more women from marginalized communities into tech.
Playing an instrument you don’t own

Photo: Apple
Yoonjae Joung, 21, left his viola behind when he traveled from Seoul to New York University for an exchange program. After attending a New York Philharmonic concert, he missed it enough to build LeViola. It’s an app that lets users play and learn the viola using only their hands and an iPhone camera.
Using Create ML, Joung trained his own machine learning model and integrated it via Core ML. The app uses on-device frameworks to analyze left-hand joint positions for note detection and tracks right-arm angle to simulate bowing across different strings. Joung used Claude as well as OpenAI’s Codex and Google’s Gemini.
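Joung’s trained model isn’t public, but the bowing half of the idea — turning a detected right-arm angle into one of the viola’s four strings (C, G, D, A) — can be pictured as simple range bucketing downstream of the ML output. The thresholds below are invented for illustration:

```python
def string_for_bow_angle(angle_degrees):
    """Map a detected bow-arm angle to a viola string.

    Threshold values are illustrative assumptions, not taken
    from LeViola.
    """
    if angle_degrees < -15:
        return "C"
    if angle_degrees < 0:
        return "G"
    if angle_degrees < 15:
        return "D"
    return "A"
```

The note-detection half, mapping left-hand joint positions to finger placements, is where the Create ML model does the heavy lifting; this bucketing step just shows how a continuous pose estimate becomes a discrete musical choice.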
“People without instruments can now engage in classical music,” he said. “I want more people to have the opportunity to learn an instrument and enjoy orchestra, and iPhone makes it all possible.”
Together, these four winners reflect a broader theme running through this year’s challenge: that the most meaningful technology often starts with a personal story, and that Apple’s developer ecosystem — increasingly augmented by AI tools — is giving a new generation of coders the means to act on it.