50 Things I Believe About Analytics

This post originally appeared on LinkedIn.

In no particular order:

  1. Spreadsheets should never be bottom-aligned.

  2. Weekly reports should have a maximum latency of one day.

  3. Monthly reports should have a maximum latency of one week.

  4. Dashboards should be one page/screen.

  5. The data-pixel ratio should be ruthlessly maximized.

  6. “Key business questions” are insidious.

  7. Every analysis should start with a clearly articulated hypothesis.

  8. KPIs without targets are just metrics.

  9. Benchmarks are not targets.

  10. KPI targets can be developed in the absence of historical benchmarks.

  11. All data is incomplete.

  12. All data is inaccurate.

  13. The length of the question has no relationship to the analytics effort required to answer it.

  14. Dashboards should not include “insights” or “recommendations.”

  15. “Insights” is a dirty word.

  16. Use of the phrase “actionable insights” should be a finable offense.

  17. It’s okay to say, “I don’t know (yet)” to a stakeholder.

  18. Dates should be in rows in the data, not across columns.

  19. Pie charts almost always add unnecessary cognitive load. Doughnut charts are just as bad as pie charts.

  20. Waterfall charts are amazing…in situations where they are appropriate.

  21. In-cell bar charts are amazing…unless terribly implemented by a digital analytics platform.

  22. Heatmaps are amazing…as long as they don’t use a red-to-green palette.

  23. Learning to program with data (R or Python) is worth the investment.

  24. Fluency with VLOOKUP, INDEX, MATCH, pivot tables, and named ranges is a must.

  25. Many analytics and BI platforms are shockingly rigid and terrible at data visualization.

  26. No report or analysis should be distributed without some form of QA.

  27. Most detected anomalies in the data should not be investigated.

  28. Any visualization that would not work if printed grayscale is problematic.

  29. If a recurring report requires manually updating chart references, the analyst is doing it wrong.

  30. No data set is as straightforward as it is initially believed to be.

  31. The unplanned destruction/loss of a computer should result in the loss of less than 2 hours of analytics work.

  32. The process and methods used in any analysis should be documented while the analysis is being done.

  33. Impressions are not awareness.

  34. Awareness is measurable (but there is a cost).

  35. Too many organizations expect new technology to solve people and process gaps.

  36. All analytics implementations are flawed.

  37. Most analytics implementations are good enough to get value from in their current state.

  38. Machine learning and AI can deliver answers, but they are terrible at delivering good questions.

  39. The physics of the internet are immutable.

  40. Digital analytics data collection is built on a hack of technology/standards intended for other uses.

  41. Media analytics is built on a hack of technology/standards intended for other uses.

  42. “How would I feel if this were on the front page of the NY Times?” is a good litmus test for deciding what to track and how.

  43. Sparklines are a powerful way to provide meaningful context for a metric in a compact space.

  44. If the result of analysis is incredibly surprising, there is probably a flaw in the analysis.

  45. Gridlines should be turned off in spreadsheets.

  46. The quality of the design of an analytics deliverable impacts the credibility with which it is received.

  47. All recurring reporting should be fully automated (to the extent possible).

  48. Delivery of ad hoc analyses should not be set to a fixed recurring schedule.

  49. If the same analysis is repeated on a recurring schedule, it is not an analysis.

  50. Analytics cannot replace creative thought.
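Point 18 above — dates in rows, not across columns — is easy to see in code. A minimal sketch with hypothetical data (the column names and numbers are made up for illustration), assuming pandas is available:

```python
# Illustrative sketch of "dates belong in rows" (point 18), using made-up data.
import pandas as pd

# A "wide" report with dates across columns — awkward to filter, join, or chart.
wide = pd.DataFrame({
    "metric": ["sessions", "orders"],
    "2024-01": [1200, 40],
    "2024-02": [1350, 52],
})

# Melt the date columns into rows: one observation per (metric, month).
tidy = wide.melt(id_vars="metric", var_name="month", value_name="value")
print(tidy)
```

The tidy shape is what charting tools, pivot tables, and SQL joins all expect; adding a new month becomes appending rows rather than restructuring the report.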

Bonus #51: Analysts should understand the realities of martech.
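The lookup fluency in point 24 (VLOOKUP, INDEX+MATCH) carries over directly to programming with data (point 23). A minimal sketch with hypothetical tables, assuming pandas — the merge below plays the same role as an exact-match VLOOKUP:

```python
# Illustrative sketch of exact-match lookup (point 24), using made-up data.
import pandas as pd

orders = pd.DataFrame({"sku": ["A1", "B2", "C3"], "units": [5, 3, 7]})
prices = pd.DataFrame({"sku": ["A1", "B2"], "price": [2.0, 4.0]})

# A left merge keeps every order and pulls in the matching price,
# leaving NaN where no match exists — like VLOOKUP with exact match.
joined = orders.merge(prices, on="sku", how="left")
print(joined)
```

The NaN for the unmatched SKU is also a reminder of points 11 and 12: the lookup surfaces incomplete data instead of silently dropping it.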