Everything you know about social mobile web statistics is dead wrong.

Yes, I know that’s a provocative title, but it accurately reflects some of the realizations I’ve had over the last several months while researching social engagement systems. There are a hundred million people arguing about a thousand data points with massively conflicting data, producing a high-level perspective that is about as accurate as the numbers in this sentence. The reason? None of the metrics being recorded are consistent measures of success across multiple platforms; each platform has its own KPIs, and none of the measurement approaches effectively account for the nuances of the competing platforms.

An utterly useless graphic which demonstrates nothing of actual value.

Let’s start with the top social media websites. Is Google+ in sixth place, or is it in second? Since Google+ and YouTube are now merging, does that effectively tie them with Facebook, or do they even exist at all? It doesn’t take more than the slightest push on the sources before we start finding holes everywhere. Do users spend seven hours a month on Facebook versus three minutes on Google+? Perhaps they did in January of 2012, when that statistic was published and Google+ was 20% of its current size. More recent posts say the number has increased to seven and a half minutes a month, yet they only count direct browser visits to plus.google.com, ignoring prolific app usage and the G+ interactions carried out inside other Google properties such as Gmail, YouTube, and Google Music.
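To make that measurement gap concrete, here’s a quick back-of-the-envelope sketch in Python. Every number besides the 7.5-minute browser figure is an invented placeholder; the point is only to show how counting a single channel understates engagement, and how comparing a year-old number against a platform that has since grown five-fold distorts things further.

```python
# Illustrative only: hypothetical figures showing how channel coverage
# changes a "minutes per user per month" headline.

direct_browser = 7.5   # visits to plus.google.com -- the only channel counted
mobile_app     = 12.0  # hypothetical app usage, invisible to the study
embedded       = 6.0   # hypothetical G+ actions inside Gmail, YouTube, etc.

measured = direct_browser
actual   = direct_browser + mobile_app + embedded

print(f"Reported engagement:        {measured:.1f} min/user/month")
print(f"All channels counted:       {actual:.1f} min/user/month")
print(f"Headline understates usage: {actual / measured:.1f}x")

# Second distortion: comparing a January 2012 statistic against today's
# platform. If the user base grew five-fold (per the 20% figure above),
# total engagement changed even where per-user time did not.
users_then, users_now = 90e6, 450e6  # assumed counts, 5x growth
print(f"Total minutes then vs. now: "
      f"{users_then * direct_browser:.3g} vs. {users_now * direct_browser:.3g}")
```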

Trying to figure out which mobile platforms to focus on? The picture only gets muddier. Is Android the undisputed king of the mobile space, or does iOS’ browser share mean that Apple users are more engaged? The methodology in the sourced report suffers from similar problems: is a measure of browser traffic across ten common informational sites a statistically valid view of mobile audience engagement at large?
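Here’s a toy simulation of that sampling problem, with entirely invented numbers: assume Android holds 70% of the actual mobile population, but the ten sites being measured happen to attract audiences that over-index on iOS. The “browser share” such a study reports tells you about those ten sites, not about the market.

```python
# Toy sampling-bias demo: every number here is invented.

true_ios_share = 0.30  # assumed true iOS share of all mobile users

# Ten hypothetical "common informational sites", each with the fraction of
# its visitors assumed to be on iOS. These audiences over-index on iOS.
site_ios_share = [0.55, 0.60, 0.50, 0.65, 0.45, 0.58, 0.62, 0.48, 0.57, 0.53]

measured = sum(site_ios_share) / len(site_ios_share)
print(f"True iOS share of mobile users:      {true_ios_share:.0%}")
print(f"iOS share measured on the ten sites: {measured:.0%}")
# A study built on these sites would report iOS "winning" engagement even
# though Android users outnumber iOS users more than 2:1 in this population.
```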

Every Android versus iOS argument ever.

The farther you dig, the more problems you find. Commonly cited numbers are a year or more old, which is nowhere near current enough to plan web strategy against, given the pace of development and cultural shifts on the web. Social media studies are crafted by investment firms with a stake in the outcome, then published as data. A lack of understanding of user behavior means studies are flawed before they begin (as with the Google+ “minutes per month” example). Raw measurements are cited without controls (e.g. Facebook’s user counts rarely control for the number of non-human accounts on its platform).
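That last missing-controls point is simple arithmetic. In the sketch below, the raw user count and the candidate bot rates are placeholders (no platform publishes precise figures), but they show how much a headline number moves once non-human accounts are subtracted.

```python
# Illustrative control for non-human accounts. The raw figure and the
# candidate bot rates below are placeholders, not published statistics.

raw_mau = 1.1e9  # hypothetical raw "monthly active users" headline

for bot_rate in (0.02, 0.05, 0.10):  # assumed shares of non-human accounts
    human_mau = raw_mau * (1 - bot_rate)
    print(f"bot rate {bot_rate:.0%}: human MAU ~ {human_mau / 1e9:.2f}B "
          f"({raw_mau - human_mau:,.0f} accounts removed)")
```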

There is a strong opportunity here for innovative startups that can find a meaningful way to verify and quantify this data: not just confirming that it is current and accurate, but also providing deeper analysis of what it actually means.

In the meantime, one thing is certain: none of these statistics are.