The "Now What?" Moment
You did it. You recruited 12 testers. They all opted in. The 14-day countdown started.
You open Google Play Console to check on things, and you're greeted with a dashboard full of numbers, graphs, and percentages that might as well be written in ancient Greek.
Active installs: 9. Crash-free users: 95.3%. ANR rate: 0.82%. Pre-launch report: 3 warnings.
...Are those numbers good? Bad? Should you be panicking?
I'll break down every metric in Play Console that matters during your testing period, what the numbers mean, and what action you should take when you see red flags.
Metric #1: Opted-In Testers vs. Active Installs
This is the first thing you'll see in Release → Testing → Closed testing → Testers tab.
Example Scenario
You have 14 opted-in testers, but only 7 active installs.
Google expects testers to actually use your app, not just opt in. In the scenario above, only half of your opted-in testers installed (7 of 14). If your install rate stays below roughly 60%, you're at risk of an "insufficient engagement" rejection even if you maintain 12 opted-in testers.
What to do: Contact the testers who opted in but didn't install. Ask them to download the app. If they won't, remove them from your tester list and recruit new testers who will actually engage.
Metric #2: Crash-Free Users Percentage
Find this in Release → Testing → Closed testing → Statistics tab or in the main Dashboard under "Android vitals."
This metric shows what percentage of your users experienced zero crashes while using your app.
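With a small tester pool, the math behind this percentage is unforgiving: crash-free users = (users with zero crashes ÷ total active users) × 100. With, say, 10 active testers, a single user hitting one crash drops you from 100% to 90%. Expect the number to swing hard day to day during testing, and judge the trend rather than a single reading.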
Real Example from My Testing Period
On day 5 of testing, my crash-free rate was 94.2%. I panicked.
I clicked into the crash reports and found that all crashes came from one specific Android version (Android 10) on Samsung devices. The issue? A null pointer exception when accessing device storage.
I fixed the bug, uploaded a new version to closed testing (which doesn't restart your 14-day countdown), and by day 10 my crash-free rate was back to 98.7%.
Click on individual crash reports in Play Console. They show stack traces, device models, and Android versions. Most crashes are fixable within a day if you know where to look.
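For context, here's a minimal Kotlin sketch of one plausible version of that bug, assuming the null came from `getExternalFilesDir()`, which is documented to return null when external storage is unavailable. Your own crash will point somewhere else; the transferable part is replacing a blind dereference with a fallback:

```kotlin
import android.content.Context
import java.io.File

// getExternalFilesDir() is documented to return null when external
// storage is unavailable or not mounted.
fun resolveStorageDir(context: Context): File {
    val external: File? = context.getExternalFilesDir(null)
    // Fall back to internal app storage instead of dereferencing null.
    return external ?: context.filesDir
}
```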
Metric #3: ANR (App Not Responding) Rate
ANRs happen when your app's main thread is blocked for more than 5 seconds. Android shows the user a "Wait or Close" dialog.
Find this metric in Quality → Android vitals → ANRs.
Common causes of ANRs:
- Network calls on main thread: Always run HTTP requests on a background thread (see the sketch after this list)
- Heavy computations during onCreate(): Move intensive operations to background workers
- Database queries on UI thread: Use Room with Kotlin coroutines (AsyncTask is deprecated as of Android 11)
- Large image processing: Resize images before loading into memory
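To make the first fix concrete, here's a minimal Kotlin sketch using coroutines; the URL and `render()` are placeholders, not code from any real app:

```kotlin
import androidx.appcompat.app.AppCompatActivity
import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext
import java.net.URL

class ReportActivity : AppCompatActivity() {

    // Hypothetical loader; the URL and render() are illustrative only.
    private fun loadReport() {
        lifecycleScope.launch {
            // Blocking I/O runs on the IO dispatcher, so the main thread
            // stays responsive and the 5-second ANR watchdog never fires.
            val body = withContext(Dispatchers.IO) {
                URL("https://example.com/report.json").readText()
            }
            render(body) // back on the main thread, safe to touch views
        }
    }

    private fun render(body: String) {
        // Update your views here.
    }
}
```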
Metric #4: Pre-Launch Report Results
This is one of the most underutilized tools in Play Console. Google automatically tests your app on real devices when you upload to closed testing.
Go to Release → Testing → Closed testing → Pre-launch report.
Google tests your app on ~10 different device configurations and reports:
- Crashes during automated testing
- Security vulnerabilities
- Performance issues
- Accessibility problems
If the Pre-launch report shows "Critical" issues—especially crashes or security vulnerabilities—Google will likely reject your production application. Fix these before day 14.
How to Read Pre-Launch Report Issues
- Critical: App crashes on launch, security vulnerabilities, major functionality broken. Fix immediately.
- Warning: Performance issues, accessibility concerns, minor crashes in edge cases. Investigate and fix if possible.
- Info: Suggestions for optimization, best practice violations. Nice to fix, not required.
Metric #5: Daily Active Users (During Testing)
This shows up in Statistics → User acquisition for your closed testing track.
While Google doesn't publish an exact threshold, here's what I've learned from testing 7 different apps:
If your daily active users graph looks like a flatline with one spike on day 1, that's a red flag: it signals testers who installed once and never came back.
Metric #6: Average Session Duration
Find this in Statistics → User behavior.
This metric isn't a direct rejection factor, but it tells you if testers are actually using your app or just opening and immediately closing it.
What's "good" varies by app type:
- Utility apps: 30 seconds - 2 minutes is normal (calculator, flashlight, note-taking)
- Social apps: 5-15 minutes expected (messaging, social media)
- Games: 10-30 minutes per session (varies widely)
- Productivity apps: 3-10 minutes (todo lists, habit trackers)
Ask your testers to complete specific tasks: "Please create 3 entries," "Try the search feature," "Test the dark mode toggle." This ensures they're genuinely using the app, not just opening it.
Metric #7: Device Compatibility
In the Pre-launch report, scroll to the "Tested devices" section.
Google tests your app on devices like:
- Google Pixel (various models)
- Samsung Galaxy (S and A series)
- Xiaomi devices
- OnePlus phones
If your app crashes or fails to install on specific manufacturers, you'll see warnings. Pay special attention to Samsung and Xiaomi—they have heavily customized Android builds that can break apps that work fine on stock Android.
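One defensive pattern worth adopting, sketched below under assumptions (the battery-settings intent is just one example of the class of problem): some intents that resolve fine on stock Android have no handler on certain OEM builds, and calling startActivity() blindly throws an ActivityNotFoundException that lands in your vitals. Catching it turns a crash into a graceful fallback:

```kotlin
import android.content.ActivityNotFoundException
import android.content.Context
import android.content.Intent
import android.provider.Settings
import android.widget.Toast

// Opens the system battery-optimization screen if this build has one;
// some OEM ROMs ship without a handler for this intent.
fun openBatterySettings(context: Context) {
    val intent = Intent(Settings.ACTION_IGNORE_BATTERY_OPTIMIZATION_SETTINGS)
    try {
        context.startActivity(intent)
    } catch (e: ActivityNotFoundException) {
        // Graceful fallback instead of a crash in your vitals.
        Toast.makeText(context, "Battery settings unavailable on this device",
            Toast.LENGTH_SHORT).show()
    }
}
```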
The Dashboard Health Check (Use This Daily)
Here's my daily routine during the 14-day testing period. Takes about 3 minutes:
Daily Monitoring Checklist
- Opted-in testers: Still at 12+? If the count dipped, contact drop-outs or recruit replacements immediately.
- Crash-free rate: Above 98%? Good. Below 96%? Click into crash reports and investigate.
- Active installs: At least 8-10 testers should have the app downloaded. If not, nudge them.
- Pre-launch report: Any new "Critical" issues? Fix them ASAP.
- Daily active users: Are people actually opening the app? If the graph is flat, send a reminder to testers.
When to Hit the Panic Button (And When Not To)
đź”´ Panic-Worthy Issues
- Opted-in tester count drops below 12
- Crash-free rate below 95%
- Pre-launch report shows Critical security vulnerabilities
- Zero active users for 3+ consecutive days
🟡 Monitor Closely, But Don't Panic
- Crash-free rate between 95% and 98%
- ANR rate slightly above 0.47%
- A few Pre-launch "Warning" issues
- Active installs at 60-70% of opt-ins
🟢 You're Fine, Stop Stressing
- Low session duration (if your app is a simple utility)
- Pre-launch "Info" suggestions
- Minor variations in daily active users
- A single crash from an obscure device model
Key Takeaways
- Check your dashboard daily—problems caught early are easier to fix
- Crash-free rate above 98% is your primary quality indicator
- Active installs should be 60%+ of opted-in testers
- Pre-launch report Critical issues must be fixed before production
- ANR rate above 0.47% triggers Google's quality warnings
- User engagement matters—flatline graphs suggest fake or disengaged testers
Understanding these metrics turns the 14-day testing period from a nerve-wracking wait into a proactive quality improvement phase. Monitor consistently, fix issues as they appear, and you'll sail through the production review.
Want guaranteed engaged testers who actually use your app? Check our testing plans for verified testers who meet engagement requirements.
Frequently Asked Questions
What crash rate is acceptable for Google Play approval?
Google expects apps to maintain a crash-free rate above 98%. Anything below 96% will likely trigger rejection or closer review. During testing, aim for 99%+ to be safe.
How many active users do I need during the 14-day testing period?
While Google officially requires 12 opted-in testers, community data suggests at least 8-10 should be actively using the app (opening it 3-5 times during the testing period) to demonstrate genuine engagement.
Do ANRs (App Not Responding) count against my app's quality score?
Yes. Google tracks ANRs separately from crashes, and excessive ANRs (above 0.47% rate) can trigger quality warnings or rejection. Monitor the Android vitals dashboard closely.
Written by James Mitchell
Expert in Google Play app testing and Android development. Helping developers navigate the app approval process with practical insights and proven strategies.