For gym-goers who've already tried calorie tracking and quit — this logs a meal from a photo in under 30 seconds, so the habit finally sticks.
Built for adults 18–40 who lift 3–5×/week and know exactly what they want: a number goal and a plan that doesn't waste their time.
Interactive preview
Tap any meal card to expand macros. Tap "Log a Meal" to see the snap flow.
How it works
Open the app, point your phone at any meal — home-cooked, restaurant, or packaged — and tap once. No setup.
Calories, protein, carbs, and fat appear in under 30 seconds with a confidence score. Tap to adjust portion size if needed.
The app tracks your actual logged adherence week over week and adjusts your meal plan and workout split automatically — no manual reconfiguring.
What we expect users to say — pre-launch sample voices
"I've started and quit MFP three times. The search bar alone was enough to make me give up after two weeks. If snapping actually works accurately, I'm in permanently."
"Restaurant meals always killed my tracking. I could never find the right item in the database and would just give up. A photo would actually solve that."
"My problem isn't knowing what to eat — it's the 5 minutes of data entry per meal. That's what makes tracking feel like a second job. Cut that and I'll actually stick to it."
Honest answers
Doesn't Lose It! already do photo logging?
That's the most important question to ask, and we want to be straight with you: Lose It! does have photo recognition, and it's backed by a large engineering team. The honest answer is that our competitive bet is not "we have photo logging and they don't" — it's that a focused app built around this single mechanic, with no legacy UI debt, can iterate faster on recognition accuracy and make the experience meaningfully faster end-to-end. Whether that's true is something we're actively testing, which is why we're building with a beta cohort before claiming any advantage. If accuracy testing shows we can't clear 85%+ on unstructured home-cooked meals, we'll tell you — and we'll share those results.
How accurate is the photo recognition on real home-cooked meals?
We won't sugarcoat this. The brief we wrote for ourselves sets the minimum bar at ~85% accuracy on unstructured home-cooked meals. Field-wide accuracy today (across available models) is estimated at 60–75% for that meal type — which means we may not be there yet at launch. For structured meals (protein + starch + vegetable bowls, packaged foods, common restaurant dishes), accuracy is substantially higher. Our kill criteria include a blind test of 50 user-submitted home-cooked photos: if we fall below 80%, we don't launch broadly. When the model is unsure, the confidence score will show it — and we'll prompt you to confirm rather than silently log a wrong number.
What makes someone stick with this past week 4?
This is the hardest question, and we don't have a complete answer yet. Here's what we believe and what we're testing: faster logging reduces the daily friction tax, but it probably isn't sufficient on its own. The adaptive plan — one that changes based on what you actually ate rather than what you planned to eat — is our hypothesis for the week-4 mechanic. If the app responds to your real behavior rather than guilt-tripping you for a missed day, re-engagement is easier. We're measuring day-14 retention in the beta cohort specifically, with a threshold of 25%+. If we can't hit it, we'll tell you what we found before opening up more broadly. The research is honest: this is the central unsolved problem in the category.
Early access
We're testing with a small iOS beta cohort. Your answer below tells us whether we're solving the right problem — and shapes what ships in v1.
We read every reply.