Dashboard QA Checklist
Systematic audit for analytics dashboards. Check every data source, filter, metric, and visualization before sharing with stakeholders.
What Is a Dashboard QA Checklist?
A dashboard QA checklist is a systematic process for verifying that every number, chart, filter, and data source in your analytics dashboard is accurate before it reaches decision-makers. Without QA, dashboards become a source of confident wrong decisions.
Industry estimates suggest that data quality issues affect up to 30% of enterprise dashboards. The most common failures are silent: wrong filters left active, stale data connections, or miscalculated blended metrics. These errors often go undetected for weeks because the numbers still look plausible.
Best Practices
Do:
- Run QA every time you change a data source, filter, or calculated field
- Cross-check key totals against the native platform (GA4, Google Ads, CRM)
- Test with different date ranges, including edge cases (single day, full year)
- Verify mobile rendering for dashboards shared via link
- Document metric definitions directly on the dashboard

Don't:
- Assume "it worked last month" means it works now (data sources expire)
- Skip QA for "minor changes" (a renamed column can break every chart)
- Trust blended metrics without verifying the join logic
- Ignore zero values without confirming they are real zeros, not data gaps
- Share dashboards that show "Series 1" or unlabeled chart legends
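The zero-vs-gap check above can be automated. A minimal sketch, assuming daily metrics live in a pandas DataFrame (the `date`/`sessions` column names and sample values are hypothetical): reindex against the full date range so that missing days surface as gaps rather than blending in with real zeros.

```python
import pandas as pd

def find_data_gaps(df: pd.DataFrame, date_col: str, value_col: str) -> pd.DataFrame:
    """Distinguish real zeros from missing days in a daily metric table.

    A day absent from the table is a data gap; a day present with value 0
    is a real zero. Returns a frame flagging every day in the full range.
    """
    daily = df.set_index(pd.to_datetime(df[date_col]))[value_col]
    full_range = pd.date_range(daily.index.min(), daily.index.max(), freq="D")
    reindexed = daily.reindex(full_range)  # absent days become NaN
    return pd.DataFrame({
        "value": reindexed,
        "status": reindexed.apply(
            lambda v: "gap" if pd.isna(v) else ("real zero" if v == 0 else "ok")
        ),
    })

# Hypothetical daily sessions table: Jan 3 is missing, Jan 2 is a true zero
sessions = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-02", "2024-01-04"],
    "sessions": [120, 0, 95],
})
report = find_data_gaps(sessions, "date", "sessions")
print(report)
```

A "gap" row means the pipeline dropped a day and warrants investigation; a "real zero" row is legitimate data.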
Frequently Asked Questions
How often should I run this checklist?
Run the full checklist whenever you make structural changes (new data source, new calculated fields, filter changes). For stable dashboards, a monthly spot-check of the Data Source and Metric Accuracy categories catches most drift issues, such as expired API tokens or schema changes.
What is the most common cause of data discrepancies between sources?
Time zone mismatches between data sources. When GA4 is set to UTC and Google Ads to US Eastern, session counts and click counts will never match on a given day. This causes endless "data discrepancy" investigations that are actually just a configuration issue.
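The mismatch comes from the same event being bucketed into different days by each source. A minimal sketch using Python's standard `zoneinfo`: normalize every timestamp to one reporting time zone before computing daily counts (the event value and time zones are illustrative).

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_report_day(ts_utc: datetime, report_tz: str) -> str:
    """Return the calendar day an event belongs to in the dashboard's
    reporting time zone, so all sources bucket identically."""
    return ts_utc.astimezone(ZoneInfo(report_tz)).date().isoformat()

# 02:30 UTC on 2024-03-01 is still the evening of 2024-02-29 in US Eastern
event = datetime(2024, 3, 1, 2, 30, tzinfo=timezone.utc)
print(to_report_day(event, "UTC"))               # 2024-03-01
print(to_report_day(event, "America/New_York"))  # 2024-02-29
```

The same click lands on different days in each source, so daily totals disagree even though both systems recorded it correctly.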
How do I verify blended data sources?
Create a separate unblended table for each source showing the raw join key values. Compare the row counts: if Source A has 500 rows and Source B has 480, the blend will silently drop 20 rows. Always check that your join key (date, campaign name, UTM) matches exactly, including case sensitivity.
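The comparison above can be scripted. A minimal sketch with pandas, assuming each source is a DataFrame with a shared campaign-name key (the table names and values are hypothetical): an outer join with `indicator=True` surfaces exactly the rows a blend would silently drop.

```python
import pandas as pd

def audit_blend(left: pd.DataFrame, right: pd.DataFrame, key: str) -> pd.DataFrame:
    """Outer-join two sources on the blend key and return the rows that
    exist on only one side -- the rows a blend silently drops."""
    # Normalize whitespace and case so "Brand" and "brand" match
    l = left.assign(**{key: left[key].str.strip().str.lower()})
    r = right.assign(**{key: right[key].str.strip().str.lower()})
    merged = l.merge(r, on=key, how="outer", indicator=True)
    return merged[merged["_merge"] != "both"]

# Hypothetical tables: two keys differ only by case, one has no match at all
ga4 = pd.DataFrame({"campaign": ["Brand", "Generic", "Retargeting"],
                    "sessions": [500, 300, 120]})
ads = pd.DataFrame({"campaign": ["brand", "generic"],
                    "clicks": [450, 280]})
orphans = audit_blend(ga4, ads, "campaign")
print(orphans)  # only "retargeting" is unmatched after case normalization
```

Any row flagged `left_only` or `right_only` is data your blended chart is quietly omitting.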
Should I run QA before or after sharing a dashboard?
Always before. The cost of a 30-minute pre-share QA is negligible compared to the cost of retracting wrong data from an executive presentation. Build QA into your dashboard development workflow, not as an afterthought.
How much variance between sources is acceptable?
For session/pageview counts between GA4 and a dashboard tool, 1-3% variance is normal due to sampling and processing lag. For revenue, expect less than 0.5% variance against your source of truth (the payment processor). Ad spend should match exactly. Document your tolerance thresholds so the team knows when to investigate vs. accept.
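Documented thresholds are easy to encode as a single check. A minimal sketch (the figures below are illustrative, using the tolerance bands from the answer above):

```python
def within_tolerance(dashboard: float, source_of_truth: float, pct: float) -> bool:
    """Return True if the dashboard figure is within pct% of the source of truth."""
    if source_of_truth == 0:
        return dashboard == 0
    return abs(dashboard - source_of_truth) / abs(source_of_truth) * 100 <= pct

# Sessions: 1.5% off against GA4, inside the 3% band -- accept
assert within_tolerance(98_500, 100_000, 3.0)
# Revenue: 2% off against the payment processor, outside 0.5% -- investigate
assert not within_tolerance(49_000, 50_000, 0.5)
# Ad spend: zero tolerance, must match exactly
assert within_tolerance(1_234.56, 1_234.56, 0.0)
print("all tolerance checks pass")
```

Keeping the threshold as an explicit parameter per metric makes the "investigate vs. accept" decision reviewable rather than ad hoc.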
Can dashboard QA be automated?
Partially. Automated checks work well for data freshness (is yesterday's data present?), row count reconciliation, and null/zero monitoring. But visualization quality, filter logic correctness, and business context alignment still require human judgment. Use this checklist for the manual portion and build automated alerts for the mechanical checks.
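The mechanical checks named above (freshness and row count reconciliation) can be sketched in a few lines. This is a minimal illustration, not a monitoring framework; the dates and row counts are hypothetical, and the reference date is passed in explicitly so the check is testable.

```python
from datetime import date

def check_freshness(latest_loaded: date, today: date, max_lag_days: int = 1) -> bool:
    """Mechanical check: is the newest loaded day recent enough?"""
    return (today - latest_loaded).days <= max_lag_days

def check_row_counts(source_rows: int, dashboard_rows: int) -> bool:
    """Mechanical check: did every source row survive the pipeline?"""
    return source_rows == dashboard_rows

today = date(2024, 6, 10)
alerts = []
if not check_freshness(date(2024, 6, 7), today):
    alerts.append("stale data: last load 2024-06-07")
if not check_row_counts(source_rows=10_000, dashboard_rows=9_980):
    alerts.append("row count mismatch: 10000 vs 9980")
print(alerts)  # both mechanical checks fire on this example
```

Wire alerts like these into a scheduler or CI job, and reserve the checklist itself for the judgment calls automation cannot make.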