Blogger makes search engine optimization easier for novice bloggers by providing built-in features that save them from getting stuck in template code. Many SEO features that are difficult to manage on other CMS platforms can be handled easily in Blogger, which makes Blogger one of the simplest CMS platforms for everyone. "Search Preferences Settings" is a good example of these features. It is a set of settings that gives you control over several aspects of your blog's search engine optimization.
You can find these settings under the Settings menu in your Blogger dashboard. One of the biggest advantages of this feature is that the settings are easy to understand, unlike applying them manually in the template, which isn't easy for everyone. They help with proper indexing of your blog, controlling how your blog appears in search results, custom redirects, robots.txt, the 404 page, and your blog's meta content.
You must understand these settings in order to get their full benefit, so in this article we explain the importance of each setting individually, along with ways to optimize them for SEO.
There are three categories containing a total of five sub-categories. Let's go through them.
A. Meta Tags:
The only setting you will find under this category is "Description".

Meta Description:
This setting enables the meta description on your blog. It is a short paragraph that briefly describes your blog and may appear in search results, depending on the query. The meta description doesn't directly influence search engines; it is written primarily for humans. It explains what your page is about, so when it appears in search results the user decides whether to click your result by looking first at the title and then at the description. The meta description doesn't always appear in search results; search engines may instead show a snippet taken from the page itself, depending on the query. We advise you to enable this feature and write a brief description of no more than 150 characters that sums up your blog.
Enabling this feature also lets you write a separate meta description for each post. This matters because if you don't write a description for each page individually, search engines might show the homepage's meta description in search results, which may be irrelevant to the actual post. Writing a separate description for each page is recommended, so make sure you enable this feature.
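For reference, once a description is filled in, the rendered page's head section ends up with a standard description tag along the lines of the snippet below (the exact markup Blogger generates may differ slightly, and the content text is only an example):

<meta name="description" content="A short summary of this page, under 150 characters, written for searchers."/>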
Write A Meta Description For An Individual Post In Blogger:
After enabling the Description setting, your post editor will show a "Search Description" field under the post settings. To write a description for a post, click that field, write a few words about the article, and click "Done".
B. Errors And Redirections:
These settings consist of the 404 page settings and custom redirects.

1. Custom Page Not Found (404 Error):
Blogger shows a "page not found" error when someone browses an invalid link on your blog, i.e. a link that doesn't exist. By default, Blogger shows a message that says "Sorry, the page you were looking for in this blog does not exist." With this setting you can change, modify, reformat, and customize that message. We have already published a tutorial on creating awesome 404 pages for Blogger, where you can explore this feature in more depth.
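As a quick illustration, the Custom Page Not Found box accepts plain HTML, so you could paste something like the snippet below (the message text and link are purely examples):

<h3>Oops! This page doesn't exist.</h3>
<p>The link you followed may be broken, or the post may have been removed.</p>
<p><a href="/">Go back to the homepage</a> or try the search box.</p>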
2. Custom Redirects:
With custom redirects, you can send visitors from an old URL on your blog to a new one. Sometimes you change the URL of a page for one reason or another, but you are still receiving traffic on the old URL. In that case you can redirect the old URL to the new one so that traffic to your blog isn't disturbed. This option lets you create a 301 (permanent) redirect or a temporary redirect. To create a new redirect, click Edit next to the Custom Redirects option. In the first field enter the old URL without the domain, and in the second field enter the new URL you want it to redirect to, as shown in the following example.
Checking Permanent sets the redirect as a 301, which signals a permanent, server-side redirect.
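For example, if you had renamed a post, the redirect entry might look like this (both paths are hypothetical):

Old URL:   /2014/05/seo-tips.html
New URL:   /2015/08/seo-tips-updated.html
Permanent: checked (301)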
C. Crawlers And Indexing:
The next and last category includes the crawling and indexing settings and has two sub-categories.

1. Custom robots.txt:
Custom robots.txt is a text file that stops web crawlers from crawling specific pages on your site. By default, all Blogger-hosted blogs use the following robots.txt file.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://your-domain.com/sitemap.xml

The file above stops all bots from crawling pages whose URL starts with /search. The User-agent line identifies the crawler: Mediapartners-Google is the user agent of Google's AdSense crawler, while User-agent: * covers the crawlers of all search engines that crawl and index websites. The /search value after Disallow means that pages whose permalink starts with /search right after the domain are not allowed to be crawled, while the forward slash "/" after Allow means that all remaining pages may be crawled.
In the Mediapartners-Google section we see only a Disallow field with no value. This means the AdSense crawler is allowed to crawl the whole blog.
The Sitemap line tells crawlers about all the pages on your blog so the bots can crawl them efficiently. The sitemap you see above is not a valid one, so we should add a valid sitemap in Blogger by enabling custom robots.txt.
Our blog uses the following robots.txt file, and we recommend you add this custom robots.txt to your blog as well. Replace the text in blue with your own domain.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p
Allow: /

Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=1&max-results=500

This way we have stopped bots from crawling the search pages and static pages of our blog. The URL of a static page in Blogger starts with /p, so disallowing /p stops the crawling of static pages as well. We have also added a valid sitemap, which covers the first 500 posts. If your blog has more than 500 posts, you should add another sitemap resembling the line below.
Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=501&max-results=500

For more than 1,000 posts you can add yet another sitemap, which looks like the following.
Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=1001&max-results=500

So the complete file looks like the following one.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p
Allow: /

Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=501&max-results=500
Sitemap: http://www.helpitx.com/atom.xml?redirect=false&start-index=1001&max-results=500

As you have noticed, a single sitemap can only tell search engines about 500 posts, so if you have a large number of blog posts you should add one sitemap per set of 500. Just replace the number highlighted in red with the next start index (501, 1001, 1501, and so on), then save the file.
You can test your robots.txt file in Google Webmaster Tools using the robots.txt Tester tool. With this tool you can check whether any individual URL is allowed or blocked for bots.
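For example, with the custom robots.txt above, the tester should report results along these lines (the tested URLs are hypothetical):

/search/label/SEO           → blocked by "Disallow: /search"
/p/about-us.html            → blocked by "Disallow: /p"
/2016/03/sample-post.html   → allowed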
Custom Robots Header Tags:
Robots header tags are special meta tags that define how search engines index and display your webpage in search results. Normally these meta tags are inserted manually in the head section of a webpage, but in Blogger we can apply them easily with this option.

Must Read: Enable Custom Robots Header Tags In Blogger And What They Mean
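To give an idea of what these settings translate to, a page set to noindex and noarchive would carry a meta tag like the one below in its head section (the exact markup Blogger generates may differ slightly):

<meta name="robots" content="noindex, noarchive"/>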
You should enable this feature and set the values according to the following screenshot, which shows the recommended configuration.
We hope this helps you fully understand the "Search Preferences Settings" in Blogger. If you still have doubts or concerns, feel free to ask in the comments.