Organic prominence is important because it’s an indicator that website content is right on the money. However, we’ve long since transitioned from measuring Google keyword rankings to analyzing where traffic comes from, how visitors behave within content, and whether key performance indicators (KPIs) are met, along with tracking associated revenue.

We don’t play “beat the algorithm” anymore; rather, the focus is on creating terrific content that is relevant to the end user and published on competent, discoverable content management systems (CMS).

Setting Up Analytics.
Some analytics packages are really easy to set up, especially for single domains that use a script to communicate with a hosted analytics server. Just drop a snippet of JavaScript in the header of each page (include files work great) and off you go. Google Analytics, WebTrends, ClickTracks (Hosted), HBX, and SiteCatalyst all rely on tagging schemes. ClickTracks Pro is a log analysis tool; it offers a number of benefits but can be a bit more complex to set up.
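Under the hood, a page tag is just a small script that sends page and visitor details back to the vendor’s collection server, usually by requesting an invisible image. A minimal sketch of the idea (the collector hostname, account ID, and parameter names here are hypothetical placeholders, not any vendor’s actual API):

```javascript
// Minimal page-tag sketch. The collector host, account ID, and
// parameter names are hypothetical, not a real vendor's API.
function buildBeaconUrl(account, page, referrer) {
  const params = new URLSearchParams({
    acct: account,              // site/account ID issued by the vendor
    page: page,                 // page being viewed
    ref: referrer || "direct",  // where the visitor came from
    t: Date.now().toString()    // timestamp doubles as a cache-buster
  });
  return "https://collector.example-analytics.com/track?" + params.toString();
}

// In a browser, the tag would request this URL via an invisible
// image so the hit gets logged on the vendor's server:
//   new Image().src = buildBeaconUrl("UA-XXXX", location.pathname, document.referrer);
console.log(buildBeaconUrl("UA-XXXX", "/products/widget", "https://www.google.com/search"));
```

Because the request goes to the vendor’s server no matter which page (or domain) hosts the tag, this is why tagging handles multi-domain setups more gracefully than log files do.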

Our clients often ask us about the difference between log analysis tools and page tagging. This post is a layperson’s guide (not for tech gurus) to thinking through this classic analytics technology decision.

Modern Log Analysis Has Benefits.
Modern log analyzers can report on visitor sessions with good accuracy and state-of-the-art segmentation, but they aren’t plug and play. Reporting is not real-time; reports are typically available the next day. Log analyzers yield information about robots and spiders on a page-by-page basis but can’t track exit URL destinations. Log files are not 100% accurate: ISP page caching and proxies can distort the data, which leads to inaccuracies. They can be effective across sub-domains, but the persistent-cookie scheme can be complex.
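To make the log-file side concrete, here is a rough sketch of the first step any log analyzer performs: parsing raw server log lines (Apache “combined” format in this example) and separating robot hits from human ones by user-agent. The sample lines and the crude bot check are illustrative only; real analyzers use large known-bot lists and add sessionization and segmentation on top.

```javascript
// Sketch of the first step a log analyzer performs: parse raw
// web-server log lines (Apache "combined" format) and flag robot
// hits by user-agent. Real tools go much further than this.
const LOG_RE = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+ "([^"]*)" "([^"]*)"$/;

function parseLine(line) {
  const m = LOG_RE.exec(line);
  if (!m) return null; // malformed line
  return {
    ip: m[1],
    method: m[3],
    path: m[4],
    status: Number(m[5]),   // e.g. 200, 404 -- error tracking comes free
    referrer: m[6],
    // Crude robot check for illustration; real analyzers
    // match against maintained lists of known crawlers.
    isRobot: /bot|crawler|spider/i.test(m[7])
  };
}

// Hypothetical sample log lines:
const lines = [
  '66.249.66.1 - - [10/Oct/2007:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
  '192.168.1.5 - - [10/Oct/2007:13:56:01 -0700] "GET /contact.html HTTP/1.1" 404 512 "http://example.com/" "Mozilla/5.0"'
];

for (const line of lines) {
  const hit = parseLine(line);
  console.log(hit.path, hit.status, hit.isRobot ? "robot" : "human");
}
```

Note that the status code comes straight out of the log line, which is why 404 and bandwidth reporting belongs to log analysis rather than page tagging: a page that returns a 404 never runs the tag at all.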

The JavaScript Option
JavaScript (aka page tagging) can parse data more easily from page content and sidesteps the difficulties of multi-domain installations (such as secure shopping-cart sub-domains), since the session cookie is set by the tracking domain rather than the domains of the site being analyzed. Exit destinations can be tracked, but robot reports are not available.

JavaScript does not “see” everything. Server activity such as redirects and PDF downloads is not easily tracked, and some technical statistics aren’t available: if you want to know about bandwidth or 404 errors, log analysis tools are still required. Page tagging is not 100% accurate either. Though more accurate than log files, tags are not perfect: a certain percentage of visitors disable JavaScript in their browsers, and DNS failures and other issues can keep data from being sent in cases where a log file would still record the hit.

In Conclusion: Use Both.
aimClear uses both log analysis tools and JavaScript tagging on the sites we market. Each technology offers unique benefits and drawbacks depending on the application and objectives at hand, and every installation, especially one spanning multiple domains, is a unique challenge.

Analytics Blog Resources:

Apogee Weblog

Google Analytics Blog

Jonathan Mendez’s Blog

Marketing Logic

MoreVisibility SEM Blog

ReveNews

Web Analytic Matt

Web Analytics Book

Web Analytics World

Web Analytics Demystified

WebConnoisseur

Web Metrics Guru