Articles labeled "SEO"

Images undoubtedly have a great influence on a website's user interface and design, but keep in mind that they also slow a site down by increasing its loading time, which is a sin from an SEO perspective. Images consume more bandwidth than most other resources, so we must practice methods of optimizing them for SEO. Site speed is one of the factors search engines use to decide rankings, so we must follow the guidelines for reducing loading time.

Read More: Why To Avoid Custom Fonts To Increase Site Speed?

One essential way to increase site speed is to compress images, since they consume more speed and bandwidth than almost any other resource, so reducing their size is necessary. You might object that reducing image size also reduces quality, which is not good for a professional site. You would be right, but only if you are working with ordinary tools to reduce image size. Here I am going to introduce my audience to an online tool that helps them compress images dramatically without losing resolution or quality. A viewer won't notice any difference between the original and compressed images, so this tool is good for SEO as well as for the user interface of the site.



When it comes to the loading speed of a website, the developer must follow methods that reduce webpage loading time, as site speed is also considered an important SEO factor. Along with other on-page and off-page SEO optimization, you must take care of site speed too, because it is one of the elements search engines use to decide your rankings.

Suppose there are two webpages with the same quality of content in the same niche and an equal score on every SEO factor except site speed. For simplicity, assume that excluding page loading time, both pages score equally, in a 1:1 ratio. Now the search engine has to decide which page to rank first, so the decision comes down to site speed: the faster-loading page ranks first and the slower one second.

This is just an example; you shouldn't think site speed matters only in a tie like this. Site speed has its own importance and might bring your site onto the first page, depending on how good your content is.

Read More: Compress CSS and Javascript to increase site speed

Page load time depends on many factors, and we have already discussed some of them. In this post, I will talk about using Font Awesome to improve page loading speed. How could it be useful? Before going further, I should give a short introduction to Font Awesome and its usage.

What is Font Awesome?

Font Awesome is a custom web font that doesn't contain letters as its content. Instead of letters, it contains the icons used in the Twitter Bootstrap framework. In simple words, it is an icon library in font form. We can style Font Awesome completely with CSS, like any other font. Like other custom fonts, it comes as a CSS stylesheet that can easily be linked into any webpage.

How Font Awesome Reduces Page Loading Time:

Now the question arises: how can it help reduce page loading time? Here is the answer. When designing a webpage, we have to add icons in various places, for example a search icon, sharing button icons, icons in menus and so on. A common practice is to use images containing the respective icons, but from an SEO perspective this is not a good idea. A page isn't loaded completely until all of its elements, including CSS, Javascript and images, are loaded. Images are always linked externally, so every single image file adds one HTTP request, which delays page loading. No matter how small an image is, it still has to go through an HTTP request, increasing the number of round trips the server makes, and with multiple image icons the number of HTTP requests grows with the number of icons, resulting in delayed loading.

So from an SEO perspective, image icons are not recommended if the page also uses a lot of other resources. The better practice is to use Font Awesome icons instead of images. As I said earlier, Font Awesome comes as a CSS stylesheet that has to load only once; after that, every icon works like a regular font character, so it doesn't affect loading time regardless of the number of icons used.

How To Add Font Awesome Icons:

Font Awesome is a huge icon library where you can find all the commonly used icons. They are all customizable with Font Awesome's default options or with CSS3, so in this regard they can be better than images.

I think it's clear now why Font Awesome is used. Next comes finding and installing Font Awesome. It is an open-source GitHub project, available for free on the Font Awesome official website. You can download it for free and use it as a regular CSS stylesheet by self-hosting it. If you don't have the resources to host Font Awesome yourself, you can also link it from an external source; it is hosted on MaxCDN servers by Bootstrap. To link it externally, just add the following line inside the <head> tag of your webpage.
<link href='https://maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css' rel='stylesheet'/>
After adding Font Awesome to your website, you can easily place any icon in your content. Find the complete list of icons here.
To add an icon, click the required icon and find the class specified for it. Copy and paste that class where you want the icon to appear. Normally Font Awesome icons are added in the following format.
<i class="fa fa-bluetooth"></i>
In this example the Bluetooth icon will appear. In this format, "fa" is the base class for the whole Font Awesome library, while "fa-bluetooth" specifies the Bluetooth icon only. All other icons follow the same format when Font Awesome is placed directly via HTML.
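As an example of customization, an icon can be resized and recolored with ordinary CSS because it behaves like text; the values below are only placeholders.
/* Enlarge and recolor the Bluetooth icon via its Font Awesome class */
.fa-bluetooth {
  font-size: 24px; /* icons scale like text, so font-size controls the size */
  color: #0082fc;  /* any CSS color works, just like regular text */
}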

Read More: How Custom Fonts Slow Down Site Loading Speed

I hope it is now clear why and how to use Font Awesome. If you have any queries, feel free to ask me via the comments or my social channels.



Every blogger wants to get discovered in search engines, especially Google, and structured data markup influences how a site appears there. It helps your content get discovered in search results and displayed the way you choose. A couple of days ago, Google announced a new structured data testing tool and improved documentation for structured data markup. The older tool is still available in Google Webmaster Tools. The new update also includes expanded support for the JSON-LD markup syntax.

How The New Tool Is Going To Help Webmasters:

The older structured data testing tool could not keep up with the continuous changes made to search markup. For example, it was still showing authorship markup in its tests even after Google announced it had been dropped from search results. Webmasters really needed a tool to help them better understand how Google interprets their content in search results, and the picture is clearer after this update. The new tool validates all search markup and Google features powered by structured data, with support for JSON-LD syntax in dynamic HTML pages. It clearly displays the markup used in the source HTML code, and it highlights problems and errors in markup structures, which helps eliminate them.

What's New In Documentation For Structured Data?

Google has simplified its policies on using structured data, and the related documentation, based on webmasters' feedback.
The new documentation clarifies the markup vocabulary used in the markup structures of an HTML document and explains more simply how you can enable different search features for your pages, for example how to enable the published date, title and content summary shown for your webpage in search results. The new documentation also explains the JSON-LD markup syntax, the newest supported way of expressing schema.org vocabulary. Google also announced that the old documentation will be retired very soon.

How JSON-LD Will Help Webmasters:

The GitHub project defines JSON-LD as follows.
JSON-LD is designed as a light-weight syntax that can be used to express Linked Data. It is primarily intended to be a way to express Linked Data in JavaScript and other Web-based programming environments. The difference between regular JSON and JSON-LD is that the JSON-LD object above uniquely identifies itself on the Web and can be used, without introducing ambiguity, across every Web site, Web services and databases in operation today.
Google has expanded its support for schema.org vocabulary in JSON-LD markup. With the expanded support, webmasters can use JSON-LD markup to link company logos and contacts, social profile links, events in the Knowledge Graph, the sitelinks search box, and event rich snippets to their search results, which helps your content get better exposure.
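For illustration, a minimal JSON-LD snippet connecting an organization's logo and social profiles to its site might look like the sketch below; the name and URLs are placeholders.
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com",
  "logo": "http://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/example",
    "https://twitter.com/example"
  ]
}
</script>
The block sits inside the page's HTML; search engines that support JSON-LD read it without it changing what visitors see.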

Final Words:

Google has been working for a long time to bring quality content higher in its search results by supporting webmasters who work genuinely, and I think this update is going to help them even more. It shows Google's appetite for quality. I hope they keep doing the same in the future.



Site speed plays a very important role in both on-page and off-page search engine optimization. Giant search engines like Google, Bing and Yahoo count page loading speed as a ranking factor. On the other hand, a user isn't going to love a slow-loading site, and that keeps them from returning to it. So a webmaster must avoid the annoying factors that slow down page loading. There are many factors that increase site loading time, and neither bots nor humans like them.

In this article, we are going to discuss a factor that doesn't seem to have any notable impact on page loading but in reality takes up a major part of it: the inappropriate use of custom fonts.

What Are Custom Fonts:

In addition to default fonts, custom fonts are third-party fonts used by webmasters to give an elegant look to the text on their sites. They are custom-designed fonts that change the appearance of text on the page, and they can easily be installed on a website or blog using the @font-face CSS rule. Custom fonts themselves don't reduce site speed; their inappropriate use is what's responsible. Before going further into custom fonts, I am going to explain the whole process behind loading a webpage so you can understand it easily.
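For reference, a custom font is typically installed with a rule like the sketch below; the font name and file URL are placeholders. The browser has to download the referenced font file before it can paint text in that font.
@font-face {
  font-family: 'MyCustomFont'; /* placeholder name */
  src: url('https://example.com/fonts/custom.woff2') format('woff2'); /* external file the browser must download */
}
body {
  font-family: 'MyCustomFont', Arial, sans-serif; /* Arial and sans-serif act as fallbacks */
}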

How A Webpage Is Loaded/Rendered:

The page loading process is also called "rendering". Some files and scripts are downloaded before the rendering of a webpage starts; until then, nothing is shown except a white window on the user's screen. The resources, files and scripts downloaded during this time keep blocking the page loading process, so they are called "render blocking". The majority of third-party CSS files and scripts are render blocking, no matter which service provider they come from. If you use a lot of third-party widgets and scripts on your site, you may see a marked increase in the render-blocking period. It is best practice to eliminate unnecessary downloads (render-blocking CSS files and scripts) to improve site speed.

Role Of Custom Fonts In Page Rendering:

Custom fonts use the @font-face CSS rule, which includes an external font file that is downloaded before the rendering of the webpage starts; until it is downloaded completely, the page doesn't begin to load. This means a custom font is a render-blocking file, so it has a prominent impact on page loading speed.
Third-party designers who build custom templates for websites install many custom fonts in them, which in a demo don't seem to slow anything down, because demo blogs usually contain no other render-blocking resources. The problem arises when the site owner installs the template and adds other widgets to it. These widgets, together with the custom fonts, lower the pace of page rendering. Minimal but effective use of custom fonts is always advised. Now we will move on to best practices for custom fonts.

Best Practices For Using Custom Fonts On Your Website:

We are not advising you to stop using custom fonts completely, because sometimes you need fonts of your own choice to add some glamour and attract visitors. However, you should prioritize default fonts like Helvetica, Arial, Times New Roman, Verdana, Georgia, Trebuchet and Courier, which are stylish enough to give your blog a graceful look and have no effect on page rendering.
If you are not satisfied with the default fonts and wish to use custom fonts at any cost, then we recommend the following practices.
  • Before installing a font, you should know how fast it loads. Google Fonts shows how long each font takes to load, so choose the fonts that load fastest.
  • Install only the fonts you actually want to use, and don't install unnecessary weights either. For example, if you are only going to use the 400 weight, install only the 400 weight; don't add 700 along with it, because even when both weights are requested in a single link, the download cost is the same as installing them separately (see the example after this list).
  • Eliminate unnecessary render-blocking resources. Use Google PageSpeed Insights to find the render-blocking resources on your webpage.
  • Install custom fonts from reliable sources.
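As an example of the weight advice above, both lines below load the Open Sans font from Google Fonts (the font choice is only an illustration). The first requests just the 400 weight; the second also pulls in 700, adding extra font data for the browser to download.
<link href='https://fonts.googleapis.com/css?family=Open+Sans:400' rel='stylesheet'/>
<link href='https://fonts.googleapis.com/css?family=Open+Sans:400,700' rel='stylesheet'/>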
These practices will always help you overcome the slow loading caused by inappropriate use of custom fonts. I hope you find this helpful. Stay tuned for more.



Google is focusing more on mobile optimization of websites as the trend of accessing the internet through mobile devices keeps increasing. A mobile user prefers pages that are optimized with readable text, a width that adjusts to the device, and no need for zooming or sideways scrolling to view the content properly. That makes a perfect mobile-friendly webpage. A couple of weeks ago, Google introduced mobile usability reports in Webmaster Tools to surface issues related to mobile devices. In a recent update to the official Webmasters blog, Google announced that it will reward mobile-friendly sites by highlighting them as such in search results.

This is good news for mobile searchers and for webmasters who have optimized their websites for mobile devices. It will help users figure out whether a webpage is mobile friendly, so they can visit pages optimized to suit their device. A "Mobile-friendly" label will be used to highlight mobile-optimized webpages.

Google has announced that this update will roll out globally over the next few weeks. A page passes the mobile-friendly test and is eligible for the mobile-friendly label if it fulfills the following criteria.
  • It doesn't contain plugins, widgets or software unsuitable for mobile devices.
  • It uses text that is readable without zooming the page.
  • The page content fits the screen width, so the user doesn't have to zoom or scroll the page horizontally.
  • Clickable links are placed far enough apart that the user can easily tap the correct one.
Make sure that your site meets the above criteria so that your webpages are labeled and indexed well by Google.

Conclusion:

The recent steps taken by Google show that it is not ignoring mobile devices and is interested in giving mobile users a friendly search experience, so webmasters should follow this trend. They should make their sites suitable for all devices, including mobiles, smartphones and tablets along with desktops. We can expect that Google may eventually use mobile optimization as a ranking signal.



Blogger makes search engine optimization easier for novice bloggers by providing pre-installed features that save them from getting stuck in code. Most SEO features that are difficult to manage in other CMSs can be handled easily in Blogger, which makes it the simplest CMS platform for everyone. The "Search Preferences" settings are a good example: a set of settings that give you automatic control over several aspects of your blog's search engine optimization.

You can find these settings under the Settings menu in your Blogger dashboard. One of the biggest advantages of this feature is that the settings are easy to understand, unlike applying them manually in the template, which isn't easy for everyone. They help with proper indexing of your blog, control how your blog appears in search results, and cover custom redirects, robots.txt, the 404 page and your blog's meta content.



It is a common belief that doing SEO on Blogger is a tough job compared to other CMSs and web-hosting platforms. I partially agree: in some cases it is hard to have an SEO-friendly environment on Blogger, but not in all situations. Sometimes Blogger surprises us with features that are not as easy to manage on other platforms, such as the option of defining custom robots header tags. Robots header tags are special directives that instruct search engines how an individual page of a website is indexed and displayed in search results. They can also be placed manually on each page inside the head section, as shown in the example below.
<head> <meta name="robots" content="noindex" /> (…) </head>
Manually adding these tags to each page individually can make the process demanding, but Blogger makes it easier by providing a pre-installed setup of custom robots header tags in the Search Preferences settings. The values for each class of pages can be set individually and serve as the default for that class. For example, setting a "noindex" value for archive pages stops all archive pages from being indexed in search engines, and setting a "nofollow" value on post pages makes the links on those posts be treated as nofollow links. There is also the advantage that we can define robots values for individual posts, which may differ from the defaults we have set.

In this article we will cover how to enable custom robots header tags in Blogger, with an explanation of each value, so we can have better search engine optimization for our blog.

Explanation Of Robots Meta Tags Values:

There are 10 values for robots tags, which are explained in the table below.
Value — Description
all — No restrictions on indexing the page. This is the default value and has no effect on the page even if it isn't checked specifically. It allows all search engines to index the page and show it in search results.
noindex — Stops search engines from indexing your page and from showing cached links of it in search results.
nofollow — Stops search engines from following the links present on your page. In simple words, those links are made nofollow.
none — Has the properties of both "noindex" and "nofollow", so it makes your page both noindex and nofollow.
noarchive — Stops search engines from showing a cached link of the webpage in search results.
nosnippet — Stops a snippet from being shown for the page in search results.
noodp — If your site is listed in DMOZ (the Open Directory Project), search engines might extract metadata about your site from there when indexing your page. This value instructs them not to use that data.
notranslate — Stops search engines from offering translation of the page in search results.
noimageindex — Stops images on that page from being indexed in search engines.
unavailable_after — Stops the page from being displayed in search results after a specified date/time. The date and time must be written in RFC 850 format, e.g. 08-Feb-94 14:15:29 GMT.
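These values can also be combined in a single tag. For example, the hypothetical tag below keeps a page out of the index and its images out of image search.
<meta name="robots" content="noindex, noimageindex" />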

Enable Custom Robots Header Tags In Blogger:

By default, the robots tag in Blogger is set to the "all" value, which allows the whole blog to be indexed in a regular manner, but that's not always what we need. Sometimes we have to stop indexing of a specific class of pages, or we don't want a particular post to show in search results, so to handle these cases we must enable custom robots header tags on our blog.
To enable this feature, follow the below steps.

1. Go to blogger dashboard > Settings > Search Preferences > Crawlers And Indexing.


2. Under the Crawlers and Indexing heading, you will see the "Custom Robots Header Tags" option, which is disabled by default.

3. Click Edit, set the values according to the following screenshot, then click Save changes.

[Screenshot: recommended Custom Robots Header Tags settings]

This combination of values is recommended for all types of blogs, so there is nothing to worry about.

Explanation of Robots Header Tags Settings:

In this combination of settings, we have applied robots tag values to three types of pages. First is the homepage, where we have selected "all" and "noodp". This allows complete indexing of the page without any restriction, while using the metadata (title and description) from our own blog rather than from the DMOZ Open Directory Project. The same settings are applied to posts and pages because they are also worth indexing. The only categories we have stopped from being indexed are search and archive pages: indexing them can cause duplicate content trouble, so we have blocked them with the noindex value.

Configure Robots Tags For Individual Pages/Posts:

These values serve as the default settings for each category of pages. After enabling this feature, values can also be applied to individual posts. You will now see a "Custom Robots Tags" option in the post editor, set to the defaults, which you can change according to your own choice.


To make a change, click the Custom Robots Tags option and remove the check from the default field. Then click the tags you want to enable for that specific post and click "Done".



Site speed is among the important SEO factors that have an impact on search rankings. Sites that load faster and more smoothly are more likely to rank well than sites that load slowly. Speed matters not only to the crawlers; a visitor will also leave your site if it takes a long time to load. Optimizing every necessary item to speed up your webpage will help increase your rankings, and minifying or compressing CSS and JavaScript is one of the practices that can reduce your site's load time. CSS and JavaScript are loaded before the site's above-the-fold content is displayed, so reducing their size through minification or compression has a positive effect on site speed. It is also beneficial because Google recommends giving crawlers access to CSS and Javascript, and minified CSS and JS will be rendered faster by bots.

Now we will learn how to compress these codes using simple online minification tools. Before proceeding, we recommend you save an original copy of the code, which can be reused if anything becomes hard to understand after compression. Usually that doesn't happen, but it can be a nuisance if you remove the comments inside the code and later, when you come to edit it, get stuck finding the required syntax.

Compress/Minify CSS:

CSS is found inside <style> tags and is used to design the layout of a webpage. Externally hosted CSS can be linked using a link tag with the rel='stylesheet' attribute, which resembles the example below.
<link href='link-to-external-css-file' rel='stylesheet'/>
Externally hosted CSS code can be found by browsing to the CSS file's URL from the link tag. To minify the CSS, go to any online compression tool such as CSS Compressor, CSS Drive or CSS Minifier and compress the CSS according to your requirements.
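To illustrate what minification does, here is an arbitrary example rule before and after compression; whitespace, comments and redundant values are stripped while the behavior stays the same.
/* Before compression: header styling */
.header {
    margin: 0px;
    padding: 10px 10px 10px 10px;
    background-color: #ffffff;
}
/* After compression */
.header{margin:0;padding:10px;background-color:#fff}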


After compressing, replace the old code with the new compressed version. It's up to you whether you embed the stylesheet in the template or webpage itself or host your CSS externally.

Compress/Minify Javascript:

Javascript is found inside <script> tags, which may include the type='text/javascript' attribute. Javascript files can also be linked externally by using a script tag with the src attribute, as shown in the following example.
<script src='link-to-external-javascript-file' type='text/javascript'></script>
Externally hosted scripts can be found in the same way as we discussed for CSS. To compress JavaScript, copy and paste it into any online JS compression tool such as JavaScript Minifier or JS Compressor, then reduce its size according to your requirements by pressing the compress or minify button. It will shrink your code considerably, saving you extra bytes and speeding up your webpage.
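As an arbitrary illustration, here is a small function before and after minification; comments and whitespace are removed and local names are shortened, but it behaves the same.
// Before compression: toggle the mobile menu
function toggleMenu() {
    var menu = document.getElementById('menu');
    menu.classList.toggle('open');
}
// After compression
function toggleMenu(){var e=document.getElementById('menu');e.classList.toggle('open')}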


Replace the original code with the compressed one, or host it externally, as you prefer.



In a recent update to the official Google Webmasters blog, it was announced that webmasters should allow bots to access the Javascript, CSS and image files in their webpages to help better indexing and rendering of their sites. Google bots can now render a webpage the way a modern web browser does, rendering all content including Javascript, CSS and image files, so indexing no longer depends on textual content alone. Along with following other SEO guidelines, you must allow Google crawlers to render all the content of your site. Disallowing Google bots from crawling these files in the robots.txt file can hurt search rankings: it directly harms the proper rendering and indexing of your webpage by Google's search algorithms, and Google Panda in particular may react negatively if this addition to Google's technical webmaster guidelines is not followed.
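As a rough sketch of what this means for robots.txt, if a blocked directory also contains stylesheets and scripts, you can explicitly allow those file types for Googlebot; the paths below are placeholders.
User-agent: Googlebot
Disallow: /assets/
Allow: /assets/*.css$
Allow: /assets/*.js$
Googlebot treats the more specific Allow rules as overriding the broader Disallow, so the stylesheets and scripts stay crawlable while the rest of the folder remains blocked.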

What's New In This Update:

Formerly, Google bots rendered a site like old text-only browsers, which couldn't render images and other advanced web technologies; Lynx is an example of such a browser. A few months ago, the "Fetch and Render" tool was introduced in Google Webmaster Tools, which renders a site like modern web browsers such as Firefox and Chrome. As you know, these browsers not only deliver the text content of a webpage but also interpret the Javascript, CSS and images used in it, which makes pages more understandable and viewable for humans. Now the crawlers also crawl sites like these browsers instead of viewing and indexing text only. This update is a good step toward making search bots more efficient, because text-only rendering is no match for whole-page rendering.

How To Check If JavaScript And CSS Indexing is Allowed In Your Site:

If you are not sure about the indexing of Javascript, CSS and other formatting on your website, you can confirm it by using Fetch and Render as Google in Webmaster Tools.


To perform this test, select the Fetch option under the Crawl tools and enter the URL you want to check (leave it blank to render the homepage). Press the Fetch and Render button and Google bots will start crawling your webpage. After the crawl is complete, click Submit to index and you will see a result, which may be Complete, Partial, Redirected, etc., depending on the test. If it says Complete, there is nothing to do, because the whole content is indexed. A Partial response shows that some of the content wasn't allowed to be indexed, which can be confusing at times; it could be due to third-party scripts over which you have no control. Here is the complete list and description of Fetch as Google responses.

You can check blocked scripts, files and stylesheets by clicking on the result. It will take some time and then display the actual rendering along with a list of blocked resources, which will help you determine what you are blocking intentionally and what is out of your hands.


What Has Been Changed:

Previously, webmasters were advised to check their websites in text-only browsers like Lynx to see whether they were using the right on-page SEO techniques, because the focus was more on text than on other formatting.

After this update, the advice has changed to checking all of the site's content, not just the text.

Tips To Optimize Indexing Of Website:

Google advises following the tips below for optimal indexing of a site.
  • Make sure your pages use common technologies that are supported by a wide range of popular browsers.
  • Pages that render quickly are indexed efficiently, so improve your site's speed by eliminating unnecessary downloads and applying other speed-improving measures.
  • Optimize the serving of your CSS and JavaScript files by merging separate CSS and JavaScript files, minifying them, and configuring your web server to serve them compressed (usually gzip compression); a sample server configuration follows this list.
  • Handling full scripts and CSS can be challenging for your servers in certain conditions, so make sure they can handle the additional load of Googlebot rendering your Javascript and CSS.
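As an example of the compression advice above, on an Apache server with mod_deflate available, gzip compression for stylesheets and scripts can be enabled with a snippet like this in .htaccess (assuming your host allows it):
<IfModule mod_deflate.c>
  # Compress CSS and JavaScript before sending them to the browser
  AddOutputFilterByType DEFLATE text/css application/javascript
</IfModule>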



Mobile internet usage is increasing day by day. To give visitors a better experience, the trend of creating mobile-friendly sites has gotten a boost. Every webmaster is trying to provide a mobile-friendly environment that supports all types of portable devices, making it easy to search and view the site according to the requirements of the device. A separate mobile version and responsive design are the two most popular ways of making a site compatible with any device width. Responsive websites are built not only for compatibility with different screen widths; the popularity of mobile devices is another big factor that pushes a webmaster to build a site for users of every device. An internet user loves a site where scrolling up and down covers all the content and there is no need to scroll webpages horizontally, which does nothing but annoy them. On mobile devices, users are more likely to visit websites with these features, and hence Google recommends creating a responsive template for mobile users.



Newbies find it somewhat difficult to get good rankings in search engines after the release of Google Penguin and Panda, which changed how sites get indexed in top search results. Before these updates, black-hat SEO techniques were used by webmasters, but they are no longer any help. Things on the internet change very quickly, so it is no wonder SEO strategies have changed over the years. The web-spam techniques that were used to get quick rankings in the past are useless now. Filling the head section with tons of keywords, creating inbound links on unlimited sites, buying paid backlinks and so on are worth nothing anymore. Google's algorithms are smart enough to detect webspam and strict enough to penalize it. You now need to be very careful to run a site smoothly, because no one knows when their work will be objected to. That's why you should drop the spam techniques you used before and make new strategies to stay safe. You may find a lot of techniques on the internet claimed to be working, Panda and Penguin safe and so on, but don't believe all of them. Not every technique and strategy is going to improve your site's SEO; most of them have died, and their corpses are buried so they can't affect the living.

Google's focus now is to boost quality content by rewarding webmasters who work genuinely, and the reward is being brought to the top of search results. Quality content and positive SEO may bring a site to the top slowly, but once it gets there, it stays for a long time, just like the offline world, where the results of genuine work seem slow but are long lasting.

The biggest false SEO techniques are associated with backlinking strategies, so we will cover how backlinks affect your rankings and what your backlinking strategy should be.

How Important Backlinks Are:

Backlinks (also called inbound links) are external links on other webpages pointing towards a domain; they interconnect webpages with each other. Backlinks are important because Google and other search engines judge the importance of a webpage by the number and quality of its backlinks. They are also a source of traffic.

It is a fact that Google is focusing on content quality, but backlinks have still not lost their importance, as Matt Cutts said in a video. He believes that backlinks will remain an important factor in search rankings for many, many years.

He further added that backlinks will slowly become less important, but for now this factor should be made strong along with the other SEO factors.

Backlink Strategies You Must Avoid:

Here I am going to mention the things you must avoid when building backlinks, although many webmasters still adopt these techniques on the recommendation of other web-spammers.

High PR Backlinks Myth:

There is an old idea, still circulating online, that backlinks on high-PageRank sites are valuable regardless of their niche. That is not how Google decides your domain's importance. Backlinks on sites with higher PR do signal authority and importance to crawlers, but they have real value only when found on domains in a similar niche.

For example, suppose your site is all about "Android Tips And Tricks" but you start creating backlinks on sites in religious niches, thinking that the high PR matters no matter how different the niche is. This practice will not help you gain importance; it will only confuse crawlers trying to index your site according to its own niche.

You must consider whether the webpage where you are going to create a backlink has a niche similar to your site's or a totally different one. That will help you create inbound links of real worth.

Directory Submissions:

A few years ago, it was recommended to submit a domain to many directories to gain PageRank and high rankings. This practice is no longer worth anything, because the real purpose of directory sites has died. They were used to inform search engines about a domain, and visitors also used to find sites in directories by category, but with the improved intelligence of search engines there is no need to tell directories about your domain. Search engines are much smarter than they were a few years ago and can find the exact site matching a query, which is why people don't rely on web directories anymore. It is no longer a useful practice, so submitting your site to hundreds of directories is only a waste of time.

Most directory submissions want a reciprocal link on your site too. If you add your site to 500 directories offering high-PR backlinks and 350 of them ask you to place a reciprocal link in exchange, your site will be burdened with many outgoing URLs that don't relate to your niche. When Google sees this kind of link exchange, it will treat it as a false technique and your site might face a penalty.

Most directories also don't have good PageRank or traffic. They have Alexa rankings ranging from 100,000 to 400,000, which won't get a single click to your domain. They also provide only nofollow links, which cannot boost your link juice, so why waste your time submitting links to these worthless places?

Spammy Comments On Blogs And Forums:

This is a common practice among webmasters who reason that links in blog comments are usually nofollow, so they will only bring traffic and won't hurt the domain's reputation, and so they keep posting spammy comments on blog posts and forums. It might help generate traffic, but the worst part is that, done continuously, this practice can prompt someone to file a complaint against you, and in that case Google can use its authority to take measures against the web spam. So always post genuine, on-topic comments that don't feel tacked onto the post.

Link Exchange/Reciprocal Links:

Reciprocal links are inbound links placed on two domains as an exchange. To put it simply, suppose the owner of Site A wants a link on Site B: he asks Site B's owner to place Site A's URL on his site, and in return he places a link pointing to Site B on his own domain.

This practice is forbidden in the Google webmaster guidelines, so Google may punish a website where it finds multiple link exchanges.

Buying Backlinks:

You will find ads all over the internet offering paid backlinks on high-PR sites and claiming they are Penguin and Panda safe. However, Google forbids buying links for SEO purposes and declares it webspam. Those sellers should be asked: how could those backlinks be safe? Is Google's webspam team unable to locate them? When an ordinary SEO person knows how to spot an unnatural backlink, how could those mathematical algorithms be deceived?
So never make the mistake of purchasing any type of backlink.

Working Backlink Strategies:

The undeniable importance of backlinks keeps us thinking about the right strategy. Since the techniques above aren't going to help, you will certainly be eager to know which strategies do work, and the remaining part of this article covers them.

Create Attractive Content:

A well-established site gets visitors no matter what content it carries, like newspaper and brand sites, because people know the content will be what they are looking for, but that is not true of new and small sites run by individuals or small organizations. A site must start with quality content that users not only love to read but also love to share. An appealing title grabs a visitor's attention and compels them to read the whole article; after reading it, the user decides whether your article is worth sharing with their followers. Google Trends can help you determine which topics are trending on the internet.

We normally think sharing is limited to social media, but that isn't right. High-quality content can also be cited by people in their own site articles as a reference, which gives you high-quality backlinks along with traffic. So keep your focus on the quality of the article body. The head section, which contains the meta tags, matters mainly to crawlers, but the quality a human perceives comes from the body, so posting informative, interesting and appealing content helps you more than anything else. This is the most important thing, and it earns you genuine backlinks from others.

Answer Questions In Communities:

There are many communities on the internet that discuss specific topics; Yahoo Answers, Ask, Quora, Facebook Groups and Google Pages are some examples. People there discuss all sorts of things and ask for solutions to their problems. Search for communities that discuss topics related to your site's niche and find questions you can answer with your site's content. The links you post in those answers will be natural, and no objection can be made to them. The traffic generated by those links will also be genuinely interested in reading you, reducing your bounce rate too. The same can be done with blog comments.

Forums With Same Niche:

Thousands of forums cover almost every topic discussed on the internet. Forums related to your topic can give you some natural backlinks too when you share something original with them. They can easily be found on Google by adding forums, vbulletin or bulletin board to your topic in search queries, for example
  • your topic + Forums
Once you have found those forums, work there in a professional and genuine way. Solve other people's problems and earn a reputation; you can help them with your site too. This benefits both the visitors and you, fulfilling the purpose of quality interaction.

Anchor Text:

Anchor text is the text used to mask a URL in a hyperlink. For example, HelpITx is a hyperlink that points to our blog: the URL is http://www.helpitx.com, and I used "HelpITx" to mask that URL, so it is the anchor text for this URL.

The anchor text masking the inbound links pointing to a site should relate to the site's content, so it builds a positive reputation for the site. A good example of anchor text is the site's brand name. In the example above, you might have noticed I used the brand name rather than descriptive phrases such as "top blogging tips site". A brand name announces that the site is a brand, and search engines like to index brands. Keep working to make your content not just a site but a brand; using your brand name as anchor text is the most useful way to tell Google you are a real brand.

Don't use irrelevant keywords in anchor text because both Google and visitors hate this. Be original. That's it.

These are some useful techniques for building the kind of SEO that lasts. Hope it helps.
Stay tuned with us for more updates.



Google uses multiple search algorithms to display results for a given query. When a user starts typing a query in the search bar, results based on those algorithms start appearing. Every webpage on the internet is examined by many Google algorithms, and after analyzing each page carefully, the giant search engine decides which pages appear first in the results for a specific query. An ordinary user has nothing to do with those algorithms, because all they need is results: no matter where they come from, if one webpage doesn't have what they're looking for, they move on to the next until they find it. That is the attitude of a normal person. Webmasters, however, see these search algorithms differently, because every webmaster is spending their energy on search engine optimization to reach the top of the results. For that, they need to know how their websites are examined and analyzed by crawlers to determine their rankings.

Google is the most popular search engine, as we can see from the fact that it is the most-visited website in the world, so more than for any other search engine, webmasters keep struggling to get top rankings in its results. That's why webmasters try to implement the SEO techniques that the Google team loves and wants to see on the internet. Google introduced search algorithms to analyze the many factors of a webpage that determine the quality of its content. The purpose of these algorithms is to bring high-quality, original content to the top of search results, helping users and webmasters alike. Google has set webmaster guidelines that must be followed in order to be indexed properly; these algorithms strictly monitor whether a site follows those guidelines, and rankings are decided accordingly.


A blogger must have some know-how of these algorithms so that they don't get involved in forbidden SEO techniques that lead to a site being penalized. In this article, I am going to introduce some of the major Google search algorithms, the ones with a massive impact on search results. Let's proceed.

1. Google Panda:

In my view, this algorithm is named Panda because of the similarity between the behavior of the giant panda and the algorithm: just as the giant panda loves to eat bamboo, which consists of thin sticks, Google Panda eats thin pages on the web. Thin pages are pages with low-quality content or little to no noticeable text. Google Panda was first released in February 2011 and affected 12% of search results.

Google Panda is a very strict algorithm that penalizes sites violating ethical webmaster guidelines, for example through copyright infringement, low quality, plagiarized material or thin content. Often a whole site is affected by a Panda penalty because of only a few pages. Let me make that clear.

Suppose you run a genuine blog with great content that deserves to be at the top of the search rankings. Unknowingly, you publish some posts of very low quality with almost no content in them. Panda penalizes those webpages, which drags down the rankings of the whole blog.

That is why you must not compromise on the originality and quality of your content. Google's Matt Cutts recommends removing low-quality pages, blocking them from being indexed using the robots.txt file, or rewriting them to regain your rankings. Rewriting doesn't mean republishing the old content; it means making modifications that enhance its quality. That is the way to recover from a Panda penalty.

Google has rolled out many Panda updates. The latest (to date) was rolled out on 27th September 2014.

2. Page Layout Algorithm (Ads Above The Fold):

Page layout refers to the placement of visual elements on a page. Visitors don't want to see ads first on a page instead of the content they were searching for. Placing so many ads above the visible fold that the original content is hidden on first impression triggers this algorithm to affect the site's rankings. Its purpose is to give quality content a boost while demoting webmasters whose sites are just buckets of ads. To avoid being penalized by Google, make sure not to place too many ads above the content, so that users easily see the content they searched for.

3. Google Penguin:

Black-hat SEO refers to false techniques used to get quick rankings for sites that don't deserve to rank highly, while white-hat SEO techniques are the recommended practices. Black-hat techniques artificially inflate a webpage's rankings by manipulating the backlinking system; common methods include keyword stuffing, backlink spam, paid backlinks, buying traffic, link exchanges and so on, all forbidden by Google in its webmaster guidelines. The purpose of Google Penguin is to detect web spam and lift up the site owners who work genuinely to ensure quality. Penguin penalizes sites involved in webspam; the algorithm is expert at detecting unnatural backlinks and punishing keyword-stuffed content that has no value for a real visitor. There have been examples of sites with no content in them that still appeared first in search results for specific queries, thanks to the webspam techniques used by their owners.

One example was http://www.something.com, which contained only the word "Something" but appeared first in search results long before the release of Panda and Penguin. Penguin was first released in April 2012 and has been updated five times so far. According to Google's estimates, Penguin has affected 3.1% of search rankings.

Recovery of a Penguin-penalized site is sometimes automatic. Effective use of white-hat techniques, removing backlinks from suspicious sources and adopting a genuine backlink strategy can also help.

4. Google Hummingbird:

Google Hummingbird is the Google algorithm that focuses on the keywords of a query and understands the meaning of the whole sentence rather than a single keyword. It was released in August 2013 and was the most talked-about update since Google Caffeine, which was released to improve search results. Its purpose is to improve results by understanding search queries, so the user gets what they are actually looking for.

For example, if you type the query "report adsense violation" into Google, it will display the AdSense violation reporting page in the results instead of unrelated articles.


The purpose of Hummingbird is to make Google fetch results more intelligently.

5. PageRank:

PageRank is another important Google algorithm, named after Google co-founder Larry Page. It decides the importance of a domain by analyzing the inbound links pointing towards it. It was first updated on 28th April 2007.

The PageRank algorithm uses a scale from 0 to 10, assigning 10 to the most important domains and 0 to the least important. A domain's importance is calculated from many factors, including the authority of the webpages where its backlinks appear and how related those pages are to the original domain.

Usually, new webmasters are advised to adopt SEO techniques to increase PageRank so that they automatically appear well in search results. However, Google's current behavior doesn't suggest there is any need to keep struggling for PageRank; instead, Google's focus is on increasing the quality of the content published on the internet.

PageRank had been updated punctually each quarter until February 2013, after which the next update was rolled out in December 2013, following a wait of 8 months. Now almost 10 months have passed since that last update and there are no signs of another.

That concludes this introduction to these popular Google algorithms. These five are the best-known algorithms working behind the search results; besides them, 200+ other factors are also at work to maintain quality, which shows how hard Google is trying to give quality content a boost.

Final Words:

In light of the above, I advise new webmasters to take care of quality and apply white-hat SEO techniques to become successful bloggers. Follow the webmaster guidelines set by Google and avoid any black-hat or false techniques for manipulating your site's rankings.
That's it.



There are doubts and concerns in webmaster communities about the next Google PageRank update. It was last updated in December 2013; since then webmasters have been waiting for a newer update, but nothing has been announced clearly by Google. Many SEO experts think Google will not update this factor anymore. Google used to update PageRank every quarter until February 2013. In the last quarter of 2013, Google's Matt Cutts expressed doubt that there would be any further PR update that year, but webmasters were surprised when Google officially rolled out an update on 6th December 2013. That was the longest interval between two successive PR updates. Now the waiting period has passed 9 months, nothing can be said with certainty about future updates, and the question remains unanswered in thousands of minds.

What Is PageRank and How It Works?

It is a ranking algorithm by Google on a scale from 0 to 10 which shows how important a domain is. Many factors are involved in calculating it, including quality content, the domain's presence on the internet, backlinks on other sites and so on. This score helps Google rank a website in search results: sites with higher PageRank get a notable presence in search results, so webmasters build strategies to increase their PageRank. Google's strange silence about PageRank updates is creating an atmosphere of confusion.

Why There Was Long Delay In Last PR Update:

The 8-month delay was an unexpected event. In the last quarter of 2013, Matt Cutts said that the "pipeline" used to send automatic updates to the PageRank toolbar had broken, which is why Google hadn't updated it, and he added that this might be the end of PageRank updates; he described the December update as unintentional. If we take his statements at face value, we might conclude that the latest update is also the last, but that raises another question for new sites launched after 6th December 2013: will they get any benefit in search rankings? If PR is never updated, they will always show a PageRank of 0, which marks a site as least important. Will they be able to compete with sites that have higher PageRank?

We still believe that PR affects search rankings. Until it is clearly stated whether the PR factor has been excluded from the ranking signals or Google will update PageRank again, new webmasters will remain in a state of worry. As with announcements about other signals, Google should clear up the doubts and concerns over this factor.



Millions of image searches are performed every day in the giant search engines, and they generate good traffic. Image optimization is one of the SEO factors that directly increases your site's traffic. Take a site that is highly optimized and gets a lot of traffic from web search: what if it started getting even half a percent more traffic from image search? Isn't that a great deal? Sure it is. This factor refers to the techniques used to put the images a user is searching for in front of them. You might wonder how Google finds exactly the images you search for. As you know, search engine crawlers can read only text and don't understand what is inside an image, so they cannot recognize an image until you tell them about it. In this article, we will cover the aspects of image optimization that earn better image rankings.

How Image Optimization Works In SEO:

Image optimization works in the following ways.
  • Instructing crawlers about the image.
  • Telling humans about the image.
  • Improving image format
  • Maximum quality with minimum size
This is the list of things which we have to keep in mind when we use images in our website.
Now we move forward to methods used for this purpose.

1. Choose A Descriptive File Name:

The file name must be relevant to your image. Images downloaded from the internet or taken with a camera usually have names that are numerical strings like 0024420551752520.jpg, and that type of name won't help SEO. Keep in mind that the image file name also has an impact on image searches, so choose a distinguishable name. For example, if you post a jpg image containing a Spiderman comic, the best name for it would be "comic-spiderman.jpg".

Choose relevant keywords in the image name and separate those keywords with dashes (hyphens). Don't try to separate them with underscores, because search engines don't treat an underscore as a separator. Below you can see how Google reads dashes and underscores.
  • Comic-Spiderman will be read as Comic Spiderman
  • Comic_Spiderman will be read as ComicSpiderman.
The file name has a direct relationship with SEO, so don't ignore it. It is an instruction for crawlers.

2. Use Of "alt" Attribute:

The image tag in HTML uses attributes that define an image. Among them, the "alt" attribute is the one that tells crawlers and humans about the image: it says what the image contains and has a direct relation to SEO. Below is an example of how the alt attribute is used in image HTML.
<img alt="this-is-alt-text" src="helpitx/icon.png" />
Search engines read this text to know what the image is about, so write descriptive alt text relevant to the image. Don't exceed 150 characters, because Google doesn't read the text beyond that.

In addition to SEO, the alt attribute has another function, which we see when images are disabled in the browser settings: the alt text is displayed in the empty image box and tells users what is actually inside it. Descriptive alt text helps both users and crawlers understand the content of your image.

3. Give A Descriptive Title To Image:

A user also looks at the title of your image, which helps him decide whether to click on it or not. A relevant and descriptive title is better for getting traffic from images. Use the "title" attribute in the image tag to give an image its title. For example, see below.
<img title="this-is-title-text" alt="this-is-alt-text" src="helpitx/icon.png" />
Title text is basically for humans, not for bots, so write it in a manner that attracts humans. Relevancy, like all other factors, is important along with descriptiveness.
Title text is also displayed when someone hovers the mouse pointer over an image, so like alt text it also tells about the image. Some site owners keep the alt text and title text the same, and that isn't a bad practice either.
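Putting the last two points together, a more descriptive version of the earlier snippet, using the Spiderman comic example, might look like this. The path and wording are only illustrative assumptions, not fixed rules.
<!-- Illustrative only: human-readable alt and title text -->
<img src="/images/comic-spiderman.jpg" alt="Spiderman comic cover" title="Spiderman comic cover" />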

4. Reduce Image Size:

Site speed is among the important SEO factors; slow sites are not ranked higher by Google. Heavy images reduce site speed and thus affect rankings, so always reduce the size of heavy images before posting them. Programs like Paint.NET, MS Paint and Adobe Photoshop can be used to crop, resize and modify images. Another important thing is to get maximum image quality at minimum size. You can use online tools such as Image Optimizer, TinyPNG and JPEG Optimizer to reduce image size without affecting quality. Small-sized images have a great impact on a site's rankings.

These are some methods to optimize images for SEO rankings. If you have further questions, feel free to ask in comments.



Heading tags play a very important role in the search engine optimization of a blog. Heading tags in HTML are generally used to organize and categorize content, and they are also a way of telling search engines how important that content is. There are 6 heading levels in HTML, ranging from H1 to H6, where the number shows the priority of the heading. The H1 heading is given more priority by search engines than the rest; H2 has less importance than H1, and the same pattern follows for H3, H4, H5 and H6. In this post I will explain the method of optimizing heading tags in Blogger with respect to SEO, but first we will look at how they work in Blogger.

From an SEO point of view, we should optimize headings so that the most important heading of the page is assigned the H1 tag, the less important one gets H2, and the least important gets H4 or lower. On the blog's homepage, the blog title is the most important heading, while on post pages the title of the post is the most important heading and must be the one read by search engines.

By default, blogger uses following order of headings.
  • H1 tag for blog title
  • H3 tags for post titles
  • H2 tags for widgets headings
For better SEO, we must change this to the following sequence.

On Blog Homepage:

  • H1 tag for blog title
  • H2 tags for post titles
  • H4 tags for widget headings

On Post Page:

  • H2 tag for blog title
  • H1 tag for post title
  • H4 tags for widget headings
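As a rough illustration of the post-page sequence above, the rendered HTML of an optimized post page would look something like the sketch below. The headings' text is a placeholder; the class names are taken from the snippets later in this post.
<!-- Rough sketch of an optimized post page; text is a placeholder -->
<h2 class='title'>My Blog Title</h2>                   <!-- blog title demoted to H2 -->
<h1 class='post-title entry-title'>My Post Title</h1>  <!-- post title promoted to H1 -->
<h4>Popular Posts</h4>                                 <!-- widget heading demoted to H4 -->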

Why We Have To Use This Sequence?

We use this sequence for a few reasons. First of all, the blog title is the most important tag on the homepage, so we assign it the H1 heading there. On post pages, however, the blog title has secondary importance, because the title of the post is what must be indexed first, so there we assign H1 to the post title. Side widgets' headings are not of any particular importance, so giving them H4 tags isn't going to harm SEO.

One may ask why we are assigning separate tags to the titles: can't we place all post titles and the blog title under H1 on every type of page?

The answer is "NO". H1 is the most important heading tag and helps boost search engine rankings, but it is a bad practice to place more than one H1 tag on a single webpage. Search engines, especially Google, may detect it as webspam, which results in decreased rankings.

Let's proceed to the method.

For this, we first need to access the Blogger HTML editor. Log in to Blogger and go to Template > Edit HTML.

(Screenshot: the Blogger template editor)

Change Blog Title To H2 On Post Pages:

Click anywhere inside the template editor and press CTRL+F. Now search for the following piece of code inside the template.
<h1 class='title'>
(If you don't find this code, just search for H1 and see if it represents your blog title).

(Screenshot: SEO optimization of heading tags in Blogger)

Replace this tag with the following code.
<b:if cond='data:blog.pageType != "index"'>
<b:if cond='data:blog.pageType == "archive"'>
&lt;h1 class='title'&gt;
<b:else/>
&lt;h2 class='title'&gt;
</b:if>
<b:else/>
&lt;h1 class='title'&gt;
</b:if>
Now look for the closing </h1> tag and replace it with the following code.
<b:if cond='data:blog.pageType != "index"'>
<b:if cond='data:blog.pageType == "archive"'>
&lt;/h1&gt;
<b:else/>
&lt;/h2&gt;
</b:if>
<b:else/>
&lt;/h1&gt;
</b:if> 
Repeat this step wherever you find the h1 title tag. Usually it appears only once in a template.

Change Post Title To H1 On Post Pages:

Now comes the most important part of this tutorial. Default Blogger templates use H3 for post titles, but some custom templates use H2 post titles instead. We will change either type of heading to H1 on post pages.
For this, search for the following piece of code in the template editor.
<h3 class='post-title entry-title' itemprop='name'>
Note that if you are using a custom template, you might have to search for this tag with h2 instead of h3.

(Screenshot: SEO optimization of heading tags in Blogger)

After finding this tag, replace it with the following code.
<b:if cond='data:blog.pageType != "index"'>
<b:if cond='data:blog.pageType == "archive"'>
&lt;h2 class='post-title entry-title' itemprop='name'&gt;
<b:else/>
&lt;h1 class='post-title entry-title' itemprop='name'&gt;
</b:if>
<b:else/>
&lt;h2 class='post-title entry-title' itemprop='name'&gt;
</b:if>
Replace the closing h3 or h2 tag with the following code.
<b:if cond='data:blog.pageType != "index"'>
<b:if cond='data:blog.pageType == "archive"'>
&lt;/h2&gt;
<b:else/>
&lt;/h1&gt;
</b:if>
<b:else/>
&lt;/h2&gt;
</b:if>
The optimization of heading tags for the blog title and post titles is complete here. Now we move to the final step, which is changing the widget heading tags from h2 to h4.

Change Widget Headings To H4:

Find all occurrences of the following piece of code.
<h2><data:title/></h2>
Replace h2 with h4 in all occurrences. Sometimes you will see a different kind of code for widget headings, but <data:title/> will be the same in all of them, so replace h2 with h4 there as well.
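After the replacement, each widget heading should look like this:
<h4><data:title/></h4>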

The work is finished here. You might see some changes in the appearance of your headings after changing this sequence. If that happens, adjust your blog's CSS styles for those headings.
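For example, if the post titles now render too large because they are H1 elements, a small style rule can bring them back to their old size. The selectors and sizes below are only a sketch and depend on your particular template.
<style>
/* Sketch only: adjust the values to match how the headings looked before the change */
.title { font-size: 28px; }        /* blog title, whether it is now h1 or h2 */
.post-title { font-size: 22px; }   /* post title, whether it is now h1 or h2 */
h4 { font-size: 16px; }            /* widget headings */
</style>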



A couple of weeks ago, Google announced that it will use HTTPS as a ranking signal in its search algorithm. The Google team declared that they are working to make the web secure and wish to see every website accessed from Google be secure; to encourage this, they announced that secure sites will get a boost in search rankings. After this announcement, webmasters and blog owners are considering moving their sites to HTTPS. After the NSA spying leaks there was pressure on Google to improve internet security, so this may be a result of that pressure.

You might be worried about your site's rankings after hearing this, but calm down: it is not going to push you all the way back in the rankings, because it is not the only factor on which rankings are decided. Multiple factors decide a site's ranking in search results, so focus on what you can do and don't worry about things you don't actually need.

What Is HTTPS/SSL And How It Works?

HTTPS is the abbreviation of Hyper Text Transfer Protocol Secure, the secure version of HTTP (Hyper Text Transfer Protocol). Sites use the plain HTTP protocol until they are secured by installing an SSL certificate. SSL stands for Secure Sockets Layer, which makes a site secure to use by encrypting its traffic. When a site has an SSL certificate, it uses the HTTPS protocol instead of plain HTTP. The URL of a secure site shows HTTPS in the address bar, and a padlock (sometimes with green text) is visible in almost all browsers, confirming that the site is secure to use and that the data sent to and from it is encrypted. This keeps the user's information secure while it is being transmitted.

Why HTTPS Is Used?

As I mentioned above, HTTPS makes a site secure and protects its users' information. It is mostly used by big social media sites, e-commerce websites and any site that must ensure a user's privacy by keeping his data safe; it is also used by sites that offer logins. Data sent over the internet travels between many computers before reaching the required server, and along the way a user's credentials, such as credit card numbers, usernames and passwords, can be intercepted. HTTPS encrypts the information sent across these hops, making it unreadable to unauthorized people. In this way HTTPS and SSL ensure a site's security.
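As a simple illustration, consider a hypothetical login form on a page served over HTTPS: because it submits to an https:// address, the username and password travel encrypted. The address below is made up for the example.
<!-- Hypothetical example: credentials posted to an https:// address are encrypted in transit -->
<form action="https://www.example.com/login" method="post">
  <input type="text" name="username" />
  <input type="password" name="password" />
  <input type="submit" value="Log In" />
</form>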

Will It Impact My Rankings?

As Google has announced it, it will surely give you a slight increase in rankings, but don't rely on it completely, because Google says it is a lightweight signal affecting fewer than 1% of queries. So you will not see a rocket boost in your rankings even if you make your site secure. Since only around 10% of websites on the internet use the HTTPS protocol, it is nothing to worry about if you are not already using SSL. It is just one of many SEO factors that help improve a site's rankings, such as quality content, a solid presence on the internet, social media optimization, site speed and much more.

It is not going to be the lone ranking factor. Long ago, Google announced that site speed would also have an impact on search rankings, so webmasters rushed to make their sites faster, yet many relatively slow sites still appear first in the rankings thanks to quality content and other SEO factors. HTTPS usually isn't needed for sites that don't require logins, accept payments or hold private data; blogs and informational sites have nothing to do with users' data, so no real competition is expected over this factor.

Now take an example of how this factor can decide rankings.

Suppose you own site A with quality content and all other SEO factors similar to site B. Your site uses an SSL certificate while site B does not. If everything else is equal, Google will prefer to place your site first and site B second. But as I said earlier, no tough competition is predicted for HTTPS in the near future, so if you aren't already on SSL, don't rush towards it if it isn't actually required for your site, because its basic job is to encrypt a site's confidential data. If your site doesn't handle users' credentials, why are you thinking of making it secure?

Security Limits Of SSL:

So you might be thinking about the security benefits of SSL and HTTPS. It is indeed a security feature, but it is limited to encrypting users' credentials in transit. It will not protect you from DDoS attacks or hacking attempts. If you are thinking from that point of view, then before doing anything you should look into how to make your site secure from hackers and DDoS attacks. SSL has its own limit: it will not save your site, it only protects your users.

Conclusion:

It's a better option to use HTTPS on your site if you can afford it, because it will give your rankings a small boost. An informational website or blog that doesn't handle users' credentials doesn't necessarily need HTTPS or SSL; however, if you still want to use it, you can.
If you are running an e-commerce website, or any site with active logins such as a forum or social media site, or anything that handles payment information, then you must move your site to HTTPS. It is not just recommended; you must do it to keep your customers and users safe.
If you still have any doubts about HTTPS and SSL, you can ask me.