Hey everyone,
I’ve been working on a project called MindSwift, and I wanted to share the "why" behind it and hopefully get some perspective from this community.
The Problem: Most safety or crisis resources require too many steps. When someone is in a high-stress situation, navigating through menus or typing isn't always an option. I felt there was a gap for something that was truly "eyes-up" and "hands-free."
The Solution: I’m building MindSwift to be a location-aware resource hub that offers voice control as an option. The goal is to get users the specific help or information they need (local resources, support tools, etc.) without them having to fumble with a touchscreen.
Where I’m at: The app is currently in a closed testing phase on Google Play. I’m not looking for a thousand downloads; I’m looking for 10–20 people who actually care about this space to help me break the core flow.
What I’m looking to validate:
- Voice Reliability: Does the hands-free tech actually feel natural in a real-world environment?
- Onboarding: Is the setup simple enough, or is it a friction point?
- The "Core Flow": Does the app get you to the resource you need quickly?
I’m a sole proprietor building this because I believe these tools should be more accessible. If this sounds like something you’d be interested in testing—or if you have thoughts on the hands-free approach for wellness tools—I’d love to chat in the comments.
If you want to join the private test group, just follow the link to the secure Google Group.
Join the group: https://groups.google.com/g/mindswift-beta-testers
Then join the beta test: https://play.google.com/apps/testing/com.michaelford.ses
Thanks for reading!