Show HN: Sebastian.run – Build mobile apps from prompts using AI
Hi HN!
I’ve been working on Sebastian.run — an AI tool that lets you build mobile apps by simply describing them in natural language.
Think of it as “vibe coding” — you type:
“Build me a recipe app with favorites, search, and pastel colors.”
...and in seconds, it generates the full app (frontend + backend + logic).
My goal is to make mobile app creation as intuitive as talking to a designer — no code, no templates, just clear intent.
Why I built it: I got tired of the friction in no-code tools. They made building faster, but not simpler. So I built an AI-native alternative.
How it works:
You type a prompt describing your idea
The AI generates the full structure and UI
You preview instantly in your browser
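To make the flow above concrete, here's a hypothetical sketch of the kind of intermediate "app structure" a prompt might be parsed into before code generation. The real schema isn't public, so every field and name below is illustrative, not the actual format:

```python
# Hypothetical intermediate representation for a prompt like
# "Build me a recipe app with favorites, search, and pastel colors."
# All field names are assumptions for illustration only.
app_structure = {
    "name": "RecipeBox",
    "theme": {"palette": "pastel"},
    "screens": [
        {"name": "Home", "components": ["RecipeList", "SearchBar"]},
        {"name": "Favorites", "components": ["RecipeList"]},
        {"name": "Detail", "components": ["RecipeCard", "FavoriteButton"]},
    ],
    "backend": {
        "tables": ["recipes", "favorites"],
        "auth": "supabase",
    },
}

def screen_names(structure):
    """Return the screen names declared in an app structure."""
    return [s["name"] for s in structure["screens"]]

print(screen_names(app_structure))  # ['Home', 'Favorites', 'Detail']
```

The point of an explicit intermediate structure like this is that the preview step can render it directly, before any code exists.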
It’s still in early beta, and I’d love your feedback — especially from developers, designers, and founders who’ve tried no-code tools before.
Try it here: https://sebastian.run
Also wrote a short piece about the concept of “vibe coding” here: https://medium.com/@tonprofil/why-i-stopped-coding-and-start...
Would love to hear your thoughts — what feels promising, what feels missing? Thanks!
Tech details: Built with React Native + Expo for the mobile side. Supabase handles authentication and backend logic. The backend is written in Python and orchestrates multiple AI agents — one for structure planning, one for code generation, and one for image creation.
Each prompt is parsed into app structure (frontend + backend + logic), and then compiled into a deployable React Native project.
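As a rough illustration of the three-agent pipeline described above, here's a minimal Python sketch: one agent plans the structure, one generates code, one creates images, and an orchestrator chains them. The agent internals are stubbed out (the real ones would call an LLM and an image model), and all names here are my assumptions, not the actual API:

```python
# Sketch of a multi-agent orchestration pipeline: plan -> code -> images.
# Agent bodies are stubs; real agents would call an LLM / image model.
from dataclasses import dataclass

@dataclass
class GeneratedApp:
    structure: dict
    files: dict   # path -> generated source code
    assets: list  # generated image identifiers

def plan_structure(prompt: str) -> dict:
    # Stub: a real planner would prompt an LLM to emit a structured schema.
    return {"screens": ["Home", "Search"], "tables": ["recipes"]}

def generate_code(structure: dict) -> dict:
    # Stub: a real generator would emit React Native components per screen.
    return {f"screens/{s}.tsx": f"// {s} screen" for s in structure["screens"]}

def create_images(structure: dict) -> list:
    # Stub: a real agent would call an image model for icons/illustrations.
    return [f"icon_{s.lower()}.png" for s in structure["screens"]]

def build_app(prompt: str) -> GeneratedApp:
    """Orchestrate the agents: each stage consumes the planner's output."""
    structure = plan_structure(prompt)
    return GeneratedApp(
        structure=structure,
        files=generate_code(structure),
        assets=create_images(structure),
    )

app = build_app("Build me a recipe app with search")
print(sorted(app.files))  # ['screens/Home.tsx', 'screens/Search.tsx']
```

Keeping the planner's output as the single source of truth for the downstream agents is what makes the result compilable into one coherent React Native project rather than disconnected snippets.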
On the dev side, I’m especially interested in feedback on the architecture, the prompt-parsing approach, and what would make this more developer-friendly.