Analysing traffic to a site is a key part of refining search engine optimization and marketing campaigns. Determining how people get to your site is critical. Surprisingly, few sites take advantage of the data available on their servers.
Server Logs
Typically, there are two server-side tools available to you. The first is the server log, which records site activity on your server. The log keeps track of the files requested from the server, the number of times each file is requested, where each hit came from and, when the visitor arrived from a search engine, the exact phrase that person typed in. Every system is different, but these basic elements should be included.
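To make that concrete, here is a minimal sketch in Python of how one raw log entry can be broken into those fields. It assumes the common Apache/Nginx "combined" log format; the regular expression, field names and sample line are illustrative, and your host's format may differ.

    import re

    # Matches the common Apache/Nginx "combined" access-log format;
    # adjust the pattern if your host uses a different layout.
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) \S+" '
        r'(?P<status>\d{3}) (?P<size>\S+) '
        r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
    )

    def parse_line(line):
        """Return a dict of fields for one log line, or None if it does not match."""
        match = LOG_PATTERN.match(line)
        return match.groupdict() if match else None

    sample = (
        '203.0.113.7 - - [12/Mar/2024:10:15:32 +0000] '
        '"GET /products.html HTTP/1.1" 200 5124 '
        '"https://www.google.com/search?q=blue+widgets" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
    )

    entry = parse_line(sample)
    print(entry["path"], entry["referrer"], entry["user_agent"])

Each parsed entry carries the requested file, where the hit came from (the referrer) and which browser or robot made the request, which is exactly the information the rest of this article works with.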
The second tool, a traffic analysis program, is standard at most hosting companies, but you may have to ask them to turn it on. These programs analyse the raw data from your server and convert it into fascinating charts, diagrams and statistics. Most of the information is overkill, so be careful not to get overwhelmed by the many ways to look at the data. Try to focus on the following information:
1. What sites are sending you visitors?
2. What search engines are sending you traffic?
3. What keyword phrases are people using to find your site?
4. How often are major search engine indexing robots visiting your site?
As you go through the data, you are going to find some very surprising things. In fact, you will probably be stunned.
First, you are going to be amazed when you see which keyword phrases are sending traffic to your site. Many of the keywords will not match your meta tags. Instead, they will be combinations of various keywords on a particular page. This is a reflection of the fact that search engines mix and match your keywords as they see fit. So, should you change your meta tags to reflect the phrases shown in the data? No. The data you are seeing typically reflects keywords with little competition. Since you are already getting traffic from them, keep focusing on your original goals.
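If you want to pull those phrases out of the raw log yourself, the sketch below shows one way to do it in Python. It assumes the search phrase arrives in the referrer's query string under a parameter such as q or p; those parameter names and the sample URLs are assumptions, and many engines no longer pass the phrase at all.

    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # Query-string parameters that commonly carried the search phrase;
    # names vary by engine, and many engines no longer send them at all.
    QUERY_PARAMS = ("q", "p", "query")

    def search_phrase(referrer):
        """Return the keyword phrase from a search-engine referrer URL, if present."""
        params = parse_qs(urlparse(referrer).query)
        for name in QUERY_PARAMS:
            if name in params:
                return params[name][0].lower()
        return None

    referrers = [
        "https://www.google.com/search?q=blue+widgets",
        "https://search.yahoo.com/search?p=cheap+blue+widgets",
        "https://example.com/links.html",  # not a search engine, no phrase
    ]
    phrases = Counter(filter(None, (search_phrase(r) for r in referrers)))
    print(phrases.most_common())  # [('blue widgets', 1), ('cheap blue widgets', 1)]

Counting the phrases this way quickly shows which mixed-and-matched combinations are actually pulling visitors in.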
The second key piece of information is where your traffic is coming from. This data often falls under the “referrer” heading. By reviewing the referrers, you can see which sites and search engines are producing traffic for you. If you are running advertising on another site, you should also be able to track the campaign.
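Here is a hedged sketch of that tally, assuming log entries parsed into dictionaries like the parse_line example above; the list of search-engine domains is illustrative, not exhaustive.

    from collections import Counter
    from urllib.parse import urlparse

    # Domains treated as search engines here are illustrative, not exhaustive.
    SEARCH_ENGINE_MARKERS = ("google.", "yahoo.", "bing.")

    def referring_domains(entries):
        """Tally referring domains from parsed log entries (see parse_line above)."""
        all_referrers, search_engines = Counter(), Counter()
        for entry in entries:
            domain = urlparse(entry["referrer"]).netloc.lower()
            if not domain:
                continue  # "-" in the log means a direct visit with no referrer
            all_referrers[domain] += 1
            if any(marker in domain for marker in SEARCH_ENGINE_MARKERS):
                search_engines[domain] += 1
        return all_referrers, search_engines

    entries = [
        {"referrer": "https://www.google.com/search?q=blue+widgets"},
        {"referrer": "https://example-blog.com/widget-review.html"},
        {"referrer": "-"},
    ]
    all_referrers, search_engines = referring_domains(entries)
    print(all_referrers.most_common())   # every site sending you visitors
    print(search_engines.most_common())  # search engines only

For advertising campaigns, the same idea applies: if your ad links carry a tag in the query string (for example, a campaign parameter such as utm_campaign), you can count landing-page requests by that tag instead of by referring domain.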
The final area to analyse is the robot visit information. Depending on the program, the robot information may appear under the “robots” or “user agents” heading. Robots are programs used by search engines to index web sites on the net. In reviewing your data, you should be able to determine how often the robots are coming to your site. If at all possible, make sure you add new content to your site before the next visit. The robots for the top search engines are Yahoo's Slurp and Google's Googlebot.
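One simple way to spot those visits in the raw data is a user-agent check, sketched below; the signature strings are assumptions based on how Googlebot and Slurp commonly identify themselves.

    from collections import Counter

    # User-agent substrings for the crawlers named above; illustrative signatures.
    ROBOT_SIGNATURES = {
        "Googlebot": "googlebot",
        "Yahoo Slurp": "slurp",
    }

    def robot_visits(entries):
        """Count visits per crawler based on the user_agent field."""
        counts = Counter()
        for entry in entries:
            agent = entry.get("user_agent", "").lower()
            for name, signature in ROBOT_SIGNATURES.items():
                if signature in agent:
                    counts[name] += 1
        return counts

    entries = [
        {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
        {"user_agent": "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)"},
        {"user_agent": "Mozilla/5.0 (Windows NT 10.0; rv:119.0) Gecko/20100101 Firefox/119.0"},
    ]
    print(robot_visits(entries))  # Counter({'Googlebot': 1, 'Yahoo Slurp': 1})

Grouping those counts by date gives you a rough sense of how often each robot returns, which helps you time new content before the next visit.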
In Closing
Analysing the traffic to your site is a key element of marketing it. The more you know about your customers, the more you can cater to their needs.