Web Content Provider - Infopulse
Rules of Engagement in the Internet Marketing Competition
We are not just web content providers. Infopulse creates web content optimised for humans and search engines.
Web Content Provider - Writing for humans and for robots
Be aware that your web content is to be read by two very different types of readers: humans and robots. Robots (also known as crawlers and spiders) are in fact pieces of software used by search engines like Yahoo, Google and MSN to extract the web content, analyse it and display links to the pages in the search. Your task is to satisfy them both.
Web Content Provider - What humans want
Our reading habits differ between online and offline materials. When reading from a computer screen, humans do not like text with long paragraphs, long sentences and huge pages. Consider the sentence from the previous topic:
Robots (also known as crawlers and spiders) are in fact pieces of software used by search engines like Yahoo, Google and MSN to extract the web content, analyse it and display links to the pages in the search.
A better Internet version of this web content could be:
Robots (aka crawlers or spiders) are in fact pieces of software.
They are used by search engines like Yahoo, Google and MSN to:
- extract the content of web sites,
- analyse it, and
- display links to the pages in the search.
Have you spotted the differences in this snippet of web content? (A rough checker sketch follows the list.)
- lines are about 64 characters long
- sentences are shorter
- paragraphs are no bigger than 3-4 sentences
- bullet points are used where appropriate
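As a rough illustration, here is a minimal Python sketch of a checker for the rules above. The 64-character and 3-4-sentence limits come straight from the list; everything else (the function name, the naive sentence splitting, the sample text) is an assumption of the sketch, not a real tool.

```python
import re

MAX_LINE_CHARS = 64        # lines about 64 characters long
MAX_PARA_SENTENCES = 4     # paragraphs of no more than 3-4 sentences

def check_copy(text):
    """Flag lines and paragraphs that break the screen-reading rules."""
    problems = []
    for number, line in enumerate(text.splitlines(), start=1):
        if len(line) > MAX_LINE_CHARS:
            problems.append(f"line {number}: {len(line)} characters")
    for number, para in enumerate(text.split("\n\n"), start=1):
        # Naive sentence split on '.', '!' and '?' -- good enough for a sketch.
        sentences = [s for s in re.split(r"[.!?]+", para) if s.strip()]
        if len(sentences) > MAX_PARA_SENTENCES:
            problems.append(f"paragraph {number}: {len(sentences)} sentences")
    return problems

sample = (
    "Robots (also known as crawlers and spiders) are in fact pieces of "
    "software used by search engines to extract and analyse web content.\n"
    "\n"
    "Short lines read better on screen."
)
for problem in check_copy(sample):
    print(problem)   # flags line 1 for running well past 64 characters
```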
If the subject of your web content is interesting and you employ the strategy mentioned above, users will not leave your web page until they finish reading. Bingo!
But let's consider how robots read.
Web Content Guide - What robots want
Robots are no match for human flexibility when reading web content. Spiders use formal logic to work out what you actually tried to say. As a first step, they strip your web content of 'common' words like 'a, the, when, where, are, is, in, of', etc. Consider the example of web content above as read by a Google robot. This is what Googlebot might see:
Robots known crawlers spiders fact pieces software used search engines like Yahoo Google MSN extract web content analyse display links pages search.
The key to their logic is how often particular words or combinations are used. In our example, 'search' is the only word used more than once. That is why the robot will believe that your web content is about SEARCH.
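A minimal Python sketch of this first pass, under some loud assumptions: the stop-word list below is the document's own list plus a few extra words ('also', 'as', 'and', 'to', 'it', 'by') so that the output reproduces the stripped sentence above. Real crawlers use far larger lists and more sophisticated parsing.

```python
import re
from collections import Counter

# The document's 'common' words plus a few more so the demo
# reproduces the stripped 'Googlebot view' shown above.
STOP_WORDS = {"a", "the", "when", "where", "are", "is", "in", "of",
              "also", "as", "and", "to", "it", "by"}

def robot_view(text):
    """Strip 'common' words, the way a crawler's first pass might."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w not in STOP_WORDS]

sentence = ("Robots (also known as crawlers and spiders) are in fact pieces "
            "of software used by search engines like Yahoo, Google and MSN "
            "to extract the web content, analyse it and display links to "
            "the pages in the search.")

kept = robot_view(sentence)
print(" ".join(kept))                 # the lowercased 'Googlebot view' above
print(Counter(kept).most_common(1))   # [('search', 2)] -- the only repeat
```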
It is very important for your web content to use language that is clear to robots. Formulate the subject of your web content as a simple combination of words, like 'good web content provider'.
Avoid using 'it' when talking about your subject; use 'good web content guide' instead.
Try to emphasise 'web content provider' visually as much as is reasonable.
Build your web content around 'good web content provider'.
Web Content Provider - Keyword density
For those of you who are good at math: keyword density is what robots calculate. It is the percentage of occurrences of particular words among ALL the words on a page. If 'good web content provider' accounts for 3%-7% of the words, then 'good web content provider' will be considered very important and relevant, and will be displayed closer to the top of the search results.
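A minimal sketch of that calculation, assuming density is simply phrase occurrences divided by total word count (search engines do not publish their exact formulas, so treat this as an illustration; the sample text and function name are made up):

```python
import re

def keyword_density(text, phrase):
    """Occurrences of the phrase as a percentage of all words in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    target = phrase.lower().split()
    # Count occurrences of the phrase within the stream of words.
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    return 100.0 * hits / len(words) if words else 0.0

sample = ("Infopulse creates web content optimised for humans and search "
          "engines. Good web content keeps readers on the page.")
print(f"{keyword_density(sample, 'web content'):.1f}%")  # 2 hits in 18 words -> 11.1%
```

This is the same arithmetic the page applies to itself below: 16 occurrences in about 600 words is roughly 2.7%.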
Web Content Provider - How good is this page?
This page contains about 600 words. The phrase 'good web content guide' is used 16 times.
That makes a density of approximately 3% (16 / 600 ≈ 2.7%). A very decent keyword density, but not a winning one.
You should do better.
Web Content Provider - Summary
In your web content use:
- 300-600 words only
- as many of your key phrases as you reasonably can
- short sentences, easy to read
- paragraphs with no more than 3-4 sentences
- lines of no more than 64 characters
- no useless or extra words: point straight to the subject