What happens when you run multiple web analytics software tools side by side to test and compare their performance? Do they give different results and if so, why? Which one should you trust? Does one package always report higher or lower numbers than the other? How do they compare when measuring visitors, unique visitors and page views? A recent study suggests it is worth scrutinizing the tools that track and measure your data.

Stone Temple Consulting set out to test seven web analytics packages on four websites, running as many as six tools on a single site. Called the 2007 Web Analytics Shootout, the study's results are being presented in two stages. The interim report was officially released at the Emetrics Summit in San Francisco on May 6, 2007, and a final, more comprehensive report will follow in July 2007.

The following tools were tested:

1. Clicktracks
2. Google Analytics
3. IndexTools
4. Unica Affinium NetInsight
5. WebSideStory HBX Analytics
6. Omniture SiteCatalyst
7. WebTrends

The four participating websites were:

AdvancedMd.com
Citytowninfo.com
Homeportfolio.com
Toolpartsdirect.com

Contributors were:

1. John Biundo of Stone Temple Consulting
2. Jonah Stein of Alchemist Media
3. Rand Fishkin of SEOmoz
4. Jim Sterne of Emetrics

In addition to describing the methodology and goals of the Shootout, the first report raises initial concerns over the effects of the removal of first- and third-party cookies, and how this can produce inaccurate data.

Cookie deletion rates are of great concern when evaluating web analytics. Every time a cookie is deleted, the tool's visitor and unique visitor counts are affected, and unique visitor counts suffer most. If a user visits a site in the morning, deletes their cookies, and visits again in the afternoon, the tool records two different daily unique visitors in the totals for that day, when in fact one user made multiple visits and should be counted as a single unique visitor.
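To make the double-counting concrete, here is a minimal sketch in Python of a cookie-based unique visitor counter. The CookieBasedCounter class and its names are hypothetical, standing in for the general technique rather than any of the tested packages:

import uuid

class CookieBasedCounter:
    """Toy model of counting unique visitors by tracking cookie (illustrative only)."""

    def __init__(self):
        self.ids_seen_today = set()  # cookie IDs observed today
        self.visits = 0

    def record_visit(self, cookie_id=None):
        """Record one visit; a browser arriving with no cookie is issued a new ID."""
        if cookie_id is None:
            cookie_id = str(uuid.uuid4())  # no cookie -> looks like a brand-new visitor
        self.ids_seen_today.add(cookie_id)
        self.visits += 1
        return cookie_id  # the browser stores this cookie for later visits

    @property
    def daily_uniques(self):
        return len(self.ids_seen_today)

counter = CookieBasedCounter()

# Morning: first visit, the browser is issued a tracking cookie.
cookie = counter.record_visit()

# Midday: the user clears their cookies.
cookie = None

# Afternoon: the same person returns, cookieless, and gets a fresh ID.
counter.record_visit(cookie)

print(counter.visits)         # 2 visits
print(counter.daily_uniques)  # 2 "unique visitors" for 1 real person

Run as-is, the script reports 2 visits and 2 "unique visitors" for what is really one person, which is exactly the inflation the report warns about.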

The report also evaluated each site's configuration, conducted a page view analysis, and measured several data collection areas, finding "significant differences in the traffic numbers revealed by the packages." Vendors had the chance to present their products' features and benefits.

The report concludes:

Implementation of an analytics package requires substantial forethought and planning. And, when you are done with that, you have to check, and recheck your results, to make sure they make sense.

You can view the first phase and an overview of the initial findings in the 2007 Web Analytics Shootout – Interim Report.

