Time For Web Traffic Analytics
If you aren't certain you're getting optimal results from your Web traffic, it's time to look into analytics. (courtesy: Small Business Pipeline)
February 13, 2006
Welcome to Accidental IT, a series of technical how-tos for people whose job descriptions don't necessarily include tech support but who often find themselves doing just that for their co-workers.
Your Web site is alive in cyberspace. And if you're on top of your Web site, you're keeping track of the traffic reports generated by your hosting service, the hit counters on each page, and whether you are getting orders and correspondence from the site. But are you getting all the traffic you should? Are you getting all the orders you could get from the people visiting your site, or are you losing their interest and their orders somewhere before they check out? If you don't already know the answers to those questions, it's probably time for you to investigate the technology of Web analytics.
The Definition
According to Wikipedia, there are two main approaches to collecting Web analytics data. "The older method, logfile analysis, reads the logfiles in which the Web server records all its transactions. The newer method, page tagging, uses JavaScript on each page to notify a third-party server when a page is rendered by a Web browser. Both methods are now widely used."
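To make the logfile half of that definition concrete, here is a small sketch of the kind of record a Web server writes and the fields a log analyzer pulls out of it. The sample log line, field names, and parsing code are illustrative only, not taken from any particular server or analyzer.

```typescript
// Illustration of what logfile analysis works with: one Apache/IIS-style
// "combined" log line and the fields a log analyzer extracts from it.
// The sample line is made up for illustration.
const line =
  '203.0.113.7 - - [13/Feb/2006:10:04:55 -0500] "GET /catalog HTTP/1.1" 200 5120 ' +
  '"http://www.google.com/search?q=widgets" "Mozilla/4.0"';

const match = line.match(
  /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+) "([^"]*)" "([^"]*)"$/
);

if (match) {
  const [, ip, timestamp, method, path, status, bytes, referrer, userAgent] = match;
  // A log analyzer aggregates fields like these into visit counts, referring
  // sites, and visitor locations (derived from the IP address).
  console.log({ ip, timestamp, method, path, status, bytes, referrer, userAgent });
}
```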
The log file analysis method is generally available on most shared hosting accounts, either at no charge or for a fee. In addition, log analyzers like the open-source AWStats, which is available free of charge, can be installed on most Web servers, including Apache and IIS. The analysis generated by programs like AWStats can deliver information about the number of visits, referring sites, and visitors' locations. However, these programs are not particularly adept at identifying visitors' activity once they are on the site, particularly on dynamic sites where the page is identified but its contents are not. For example, the page "catalog" may be shown as being visited, but the specific items viewed are not captured.

Programs like Urchin, which is now part of Google's free offerings, include program code that is embedded in each page of the Web site. That code creates its own extended history files or communicates with a Web-based service that provides richer reporting on visitor activity.
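The page-tagging approach can be sketched in a few lines: a small script embedded in each page reports the page path and, on dynamic pages, the specific item being viewed. The collector URL, parameter names, and function below are hypothetical illustrations of the technique, not Urchin's or Google's actual tagging code.

```typescript
// Minimal page-tagging sketch (hypothetical endpoint and parameter names).
// Embedded on each page, it reports the page path and, on dynamic pages,
// the specific item being viewed -- the detail plain log analysis misses.
function reportPageView(collectorUrl: string, itemId?: string): void {
  const params = new URLSearchParams({
    page: window.location.pathname, // e.g. "/catalog"
    referrer: document.referrer,    // where the visitor came from
  });
  if (itemId) {
    params.set("item", itemId);     // e.g. the product shown on the catalog page
  }
  // A 1x1 image request is the classic "web bug" used by page-tagging services.
  const beacon = new Image();
  beacon.src = `${collectorUrl}?${params.toString()}`;
}

// Usage: call once per page, passing the item ID the page template already knows.
reportPageView("https://analytics.example.com/collect", "SKU-1042");
```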
But knowing the behavior of visitors and changing that behavior are two different activities. Your analysis may identify that 80 percent of your site's visitors search for and find products that they view for a few minutes, and that 20 percent of the visitors create shopping carts from the items they view, but only 2 percent actually finalize their purchases. The information indicates that a lot of activity is being wasted, and lots of sales opportunities lost. What the information doesn't tell you is where the problems are, and how to go about fixing them.
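To see where the drop-off happens, you can compute stage-to-stage conversion from those same illustrative numbers. The snippet below is just that arithmetic, not output from any analytics tool.

```typescript
// Worked example using the article's illustrative funnel numbers.
// Stage-to-stage conversion shows where the biggest drop-off occurs.
const funnel = [
  { stage: "viewed products", share: 0.80 },
  { stage: "created a cart",  share: 0.20 },
  { stage: "purchased",       share: 0.02 },
];

for (let i = 1; i < funnel.length; i++) {
  const rate = funnel[i].share / funnel[i - 1].share;
  console.log(`${funnel[i - 1].stage} -> ${funnel[i].stage}: ${(rate * 100).toFixed(0)}%`);
}
// viewed products -> created a cart: 25%
// created a cart -> purchased: 10%
// The 90 percent cart abandonment is the obvious place to start investigating.
```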
Stage 2: Applying the Analytics
According to Paul Geller, Internet Services Division Coordinator for Original Marketing Group, "The analysis is the non-active part of the SEO (Search Engine Optimization) and Web development process, and not the activity of design and optimization itself."
The purpose of this phase is to find the problems and define a method for resolving them. Geller says that at a minimum, this includes reviewing the programming and design of the Web site and modifying it to comply with programming standards, and to make the applications clean and error-free. Many companies that perform SEO services offer complete design and programming services in an effort to assure Web sites meet best practices from the very beginning.

Stage 3: Reiteration
Once the site has been optimized by editing the programming code, adding search terms in the proper places, and generally making sure everything works as it should, your task is still only in its initial stages. According to Geller, "The analysis is ongoing and cyclical. You have to see if the changes produce the results you expected, then continue fine-tuning the work. It is an ongoing, continuous process."
Optimost CEO Mark Wachen agrees that changes need to be tested in an iterative mode. Optimost offers a service similar in some ways to Nettracker and Fireclick: hosted services that receive reports from program code embedded in Web site pages. The data is then analyzed, providing deep insight into visitor behavior.
According to Wachen, "Our service identifies the key pages that visitors interact with and, based on traffic analysis, points out the pages that need attention." But rather than making a change to the pages and presenting it as a single modification, "Optimost's software creates as many as 2 million permutations of the possible changes to the pages in question. Then we narrow that down to about 20 versions that are rotated to subsequent visitors, and we measure their behavior based on the specific version."
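The general idea behind that kind of version rotation can be sketched briefly: assign each visitor one of the candidate versions, tally visits and conversions per version, and pick the winner after the test window. This is a simplified illustration of multivariate testing in general, not Optimost's actual software; all names and structures below are hypothetical.

```typescript
// Simplified sketch of rotating page versions and measuring per-version results.
interface VersionStats {
  visitors: number;
  conversions: number;
}

const versions = ["A", "B", "C"]; // stand-ins for the ~20 candidate versions
const stats = new Map<string, VersionStats>();
for (const v of versions) stats.set(v, { visitors: 0, conversions: 0 });

// Assign each visitor a version deterministically, so repeat visits see the same page.
function assignVersion(visitorId: string): string {
  let hash = 0;
  for (const ch of visitorId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return versions[hash % versions.length];
}

// Record a visit, and whether it ended in a purchase.
function recordVisit(visitorId: string, converted: boolean): void {
  const s = stats.get(assignVersion(visitorId))!;
  s.visitors += 1;
  if (converted) s.conversions += 1;
}

// After the test window (the article's example: about 30 days), pick the best performer.
function bestVersion(): string {
  let best = versions[0];
  let bestRate = -1;
  for (const [v, s] of stats) {
    const rate = s.visitors > 0 ? s.conversions / s.visitors : 0;
    if (rate > bestRate) {
      best = v;
      bestRate = rate;
    }
  }
  return best;
}

// Usage: recordVisit("visitor-123", false); recordVisit("visitor-456", true); bestVersion();
```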
This real-time update process produces real-world results similar in some ways to focus groups. "In about 30 days you end up with refined, empirically based decisions, rather than instinct," says Wachen. The process is then repeated on other sections of the Web site to get the very best results.

ROI
The basic analytical tools that are either included with a hosting package or installed by the site manager range from free to a few hundred dollars. Services provided by professional companies like Optimost and Original Marketing Group are billed monthly and range from $500 to $5,000 for mid-sized sites, and even higher for sites with heavy traffic volumes. Whether these costs are appropriate for a specific business depends on its existing traffic, potential sales, and the effectiveness of the analytics process.