Content remains king in SEO since it generates organic traffic to your site. Producing content, however, requires balance: it should be viewed from the perspective of the search engine, which will index your page, and the readers who will click through. What are the tried and tested rules of web content writing?

Things to Do:

Write for the Search Engine and the Reader

On the surface, humans see only their own language displayed on screen whenever they visit a site. Underneath these words and images lies another, more esoteric language, written in code, that Googlebot and other site crawlers understand.

The languages of the reader and the search engine may differ in syntax, but both work toward the same goal: getting the site owner’s ideas out of their head, through cyberspace, and onto your screen. One thing the two have in common, however, is their reliance on good keywords.

The keywords in your article are like little breadcrumbs that bots follow, which lead them straight to your site.

Optimise Meta Descriptions

Meta descriptions give readers a preview of your content, and bots use them to determine whether your content matches a user’s query. Package the most important point of your post, along with the target keywords, in roughly 150 to 160 characters.
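
For instance, a meta description for a hypothetical post might sit in the page’s head like this (the title, wording, and keywords are illustrative):

```html
<head>
  <title>Web Content Writing: SEO Rules That Work</title>
  <!-- Keep it under roughly 160 characters and lead with the target keywords -->
  <meta name="description"
        content="Web content writing tips for SEO: write for readers and search engines, optimise meta descriptions, and attribute your images correctly.">
</head>
```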

Attribute Your Images

Search engines cannot decode the content of images; they only recognise images through the words tagged or attributed to them. Many bloggers unknowingly upload images with names such as image354.png and lose potential traffic as a result. Just as in your meta description and article, use keywords in these image tags so the image catches the attention of the spider.
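
A descriptive filename and alt attribute might look like this (the filename and keywords are illustrative):

```html
<!-- Keyword-rich filename and alt text instead of image354.png -->
<img src="web-content-writing-checklist.png"
     alt="Checklist of web content writing rules for SEO">
```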

Produce Good Copy

The idea of producing good web copy is to create content that is worth reading: useful, creative, and compelling. It encourages people to read more and to share on social media, thereby generating even more traffic. The authority and relevance of your site will depend on the quality of your research and writing. Create original, sticky content that makes readers want to spend more of their time on your site.

To maintain credibility and top ranking, you must also recognise what is not SEO. Keyword stuffing and having a bad link neighbourhood will only hurt your site in the long run.

Things to Keep an Eye On:

Google Is Not Indexing Your Site

The first objective of SEO is to get Google to index your site, which ensures that you receive organic traffic from Google. What if this is not happening at all?

You need to diagnose the indexing issue before you can fix it. Type “site:yoursite.com” into the Google search bar and check whether most of your content appears in the search results. If there is a huge discrepancy, you might have a problem. Check your Google Webmaster Tools dashboard and start addressing the issues one by one.

Indexing Under www or a Non-www Domain

In technical terms, www is viewed as a sub-domain, which makes www.yoursite.com different from yoursite.com. Both versions should be added to your Google Webmaster Tools (GWT) account for indexing purposes. Verify ownership of the domain and set your preferred version.
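
If your site runs on Apache, one common way to consolidate the two versions is a 301 redirect in .htaccess; this is a sketch, with example.com standing in for your own domain:

```apache
# Redirect the non-www domain to the www version (example.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```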

Google Is Yet To Find You

This is one of the challenges new sites face: it usually takes a few days to be found. If this does not happen, ensure that you have a sitemap in the first place, then troubleshoot by checking whether it has been uploaded and is working properly. Finally, send a request to Google to have the engine fetch your site.

Robots Have Blocked Your Site or Page

The problem could be a developer or editor who has blocked your site using robots.txt. To fix it, remove the blocking entries from robots.txt; once Google can crawl the site again, it will reappear in the rankings.
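
For reference, a robots.txt that blocks every crawler from the whole site, and the change that unblocks it, look like this:

```text
# Blocks all crawlers from the entire site:
User-agent: *
Disallow: /

# Unblocked: an empty Disallow value permits everything
User-agent: *
Disallow:
```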

You Lack a Sitemap

Google will have difficulty indexing and ranking sites that lack a sitemap. The sitemap tells the Google spider which pages to crawl during indexing. If you encounter problems, rework and resubmit the sitemap.
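
A minimal sitemap is a small XML file listing the URLs you want crawled; this sketch uses example.com and a placeholder date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want indexed -->
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>
```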

Crawl Errors

Some indexing problems arise because Google cannot crawl some of your pages. Crawl errors are easy to identify in Google Webmaster Tools, and Google documents the procedure for finding and correcting each type of error.

There are other reasons behind indexing issues, too, including duplicate content, turning on privacy settings, and being blocked by .htaccess. And, remember, you can always hire professional help to resolve these issues.