SEO

The Dirty Little Guide to Optimizing Your Blog for Google Hummingbird

Google has retired its age-old core search algorithm and replaced it with a brand new version called “Google Hummingbird.”

But have you optimized your website or blog for this new algorithm? Have you changed your content marketing strategy so your blog posts rank higher in Google SERPs?

Even if you think your answer is “Yes,” most probably it is “No.” The reason: you are not clear on three things:

1. What is Google Hummingbird?

2. Why did Google introduce it?

3. How do you optimize your blog and blog posts for this new Google Hummingbird algorithm?

Continue reading to learn about Google Hummingbird in depth, along with a simple yet effective SEO strategy for optimizing your blog for this algorithm.


Understanding Google Hummingbird

Google uses a set of search engine algorithms to index, analyze and rank web-content semantically to provide the best possible results to web-searchers.

Using these computer-driven algorithms, Google indexes billions of web pages and stores them on large servers located in different parts of the world. It then analyzes the searcher’s query using semantic search technology. Finally, it ranks web pages using various ranking signals; Google uses more than 200 of them to return the best possible results to searchers. So indexing, analyzing and ranking are the three basic jobs of the Google search engine.

Google Hummingbird is Google’s new core search algorithm, built to match the search behavior of modern users and to make Google the best answering engine in the world.

Query Rewriting: Back in 2003, Google filed a patent for query rewriting, which was granted in 2011. Query rewriting is the technique that lets Google handle conversational search.

It means rephrasing the searcher’s query using semantically available data to understand the actual meaning or intent. That data can include location, the device used, search history, synonyms of the keywords used, and other information the searcher has shared with Google. Google Hummingbird is better positioned to handle query rewriting than its older counterpart.

Why is Google thinking about query rewriting now?

The change was needed because people have become so reliant on Google that they now routinely enter lengthy questions into the search box instead of just a few words related to specific topics.

– Amit Singhal, the head of Google’s core ranking team

The statement says a lot. Searchers are using long-tail queries instead of a few head terms, and they usually enter their search keywords in the form of questions. That means conversational search is on the rise, and most conversational or voice search comes from mobile devices like smartphones and tablets. To better understand conversational search Google implemented query rewriting, and to better handle query rewriting it built Google Hummingbird.

Entity Search: To return the best possible results to searchers, Google has been steadily shifting its focus away from a handful of ranking signals like keywords, links and link anchor text, and focusing more and more on entity search, or meaning search.

The reason: you can manipulate keywords, links and anchor text, but you can’t manipulate the meaning or theme of a web page or article. So Google uses query rewriting to understand the actual intent behind the search query, then tries to find and rank relevant web content that actually fulfills the searcher’s needs.

In the context of search, entities are people, places or things. Google has stored millions of verified and validated structured data records in its Knowledge Graph, and these records act as entities. Semantic search draws on various data such as the searcher’s location, the device used, the searcher’s history and behavior, and other information the searcher has shared with Google.

By leveraging its vast Knowledge Graph along with semantically available data about the searcher, Google performs entity search, or meaning search, with the help of query rewriting. Entity search doesn’t mean Google only looks for a specific entity in its Knowledge Graph or on the web. It actually tries to establish relationships between the entities in the search query and work out the searcher’s intent or actual needs.

For example: Suppose you do a Google search for “I want to buy an iPhone.”

“I” is an entity. “An iPhone” is an entity. And “want to buy” links the two entities to establish a meaning or intent.

Data suggests the number of searches from mobile devices will surpass searches performed from desktops by the end of 2014. Voice search, or conversational search, is gaining pace rapidly. And Google Hummingbird is the algorithm best suited to understanding the meaning or intent behind those searches.

Authorship Rank: There has been a lot of chatter about Google Authorship Rank over the last year or so. When you add authorship markup to a piece of content on the web, Google attributes you as the creator or author of that content.

Authorship Rank acts like a double-edged sword. When you create really great content that people bookmark, share with friends via social media and revisit again and again, you become an authoritative publisher in the eyes of Google. Your Authorship Rank goes up and Google ranks your content higher in search results. The opposite happens when you create low-quality web content or articles that don’t fulfill searchers’ needs.

Always keep in mind: implementing authorship markup means you’re sending Google all possible data identifying you as the author or creator of your content. Web content with authorship markup not only gets a higher CTR, but also tends to rank higher in search results. Together, Google+ and Google Hummingbird help Google use Authorship Rank as a ranking signal.

Matt Cutts, the head of Google’s web spam team, hopes Authorship Rank will be explored by Google in the near future. Check out the video here.

Social Signals: Before the release of Google Hummingbird, Google said time and again that it was not using social signals like Facebook likes, tweets and Google +1s in its ranking algorithms. Matt Cutts, the head of Google’s web spam team, personally clarified this at various meetings and conferences.

One argument is that correlation doesn’t mean causation. If a web page ranks high and has hundreds of Facebook shares, retweets and Google +1s, this doesn’t necessarily mean the page is ranking because of all that social sharing (or sharing by influencers). It might simply be that many people found the page helpful and shared it on social media, while the page ranked higher thanks to other strong ranking signals.

For example: this year there was heavy rain and most people were wearing yellow raincoats. You can’t simply conclude that there was heavy rain because most people were wearing yellow raincoats.

Why isn’t Google using social signals for content ranking? Not because Google dislikes them, but because it has had technical limitations in using social signals as a ranking factor.

With Hummingbird, Google can process social signals more efficiently. In the near future you will start seeing Google+ impacting search results, and likewise likes, tweets and stumbles.

  • Google Hummingbird is a platform into which Google can plug existing algorithms like Panda and Penguin, as well as more advanced algorithms to be developed in the future.

Google Hummingbird Takeaways (Act Now)

1. Keywords: Craft your content or web pages around a theme or concept instead of keywords. By encrypting all searches (https://) and pushing “keyword not provided” to 100%, Google has made its intention clear: it has gone beyond keywords.

That doesn’t mean you shouldn’t include keywords your target audience is actively searching for. But your content should answer the specific queries of your visitors or searchers.

2. Long-tail Keywords: Did Google Hummingbird kill long-tail keywords?

No. Google Hummingbird acts at the query level, using query rewriting to condense the long search phrases people type into shorter ones. It doesn’t penalize the long-tail keywords in your content when ranking. You should keep constructing quality, meaningful content with long-tail keywords in mind.

3. Links: Did Google Hummingbird kill links and link anchor-texts?

No. Although the value of links and anchor text as a ranking signal has dropped a bit, they will be around for a long time. PageRank still flows via links, but Google doesn’t want you to depend solely on links for better ranking of your content.

4. Mobile Optimization: Your website or blog theme should be optimized for all types of mobile devices, from smartphones to tablets. Basically, it should be responsive and fast loading.

If your website doesn’t offer a good mobile user experience, your rankings may drop in desktop search as well as on mobile devices.
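
A minimal starting point, as a sketch only (the .sidebar and .content class names are placeholders, not from any particular theme), is a viewport meta tag plus a media query that reflows the layout on small screens:

    <!-- In the <head>: let mobile browsers scale to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Stack the sidebar under the main content on narrow screens */
      @media (max-width: 600px) {
        .content { width: 100%; float: none; }
        .sidebar { width: 100%; float: none; }
      }
    </style>

Most modern responsive themes already do this for you, so check your theme before adding anything by hand.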

5. Panda & Penguin: Where are Google Panda and Penguin?

Algorithms like Panda, Penguin, Top Heavy and EMD, which act as filters or algorithmic penalties, are now parts of Google Hummingbird. Algorithms developed in the future to catch spammers or to rank web content will also be parts of Hummingbird.

6. Author Rank: As of now Google has not announced it officially, but be prepared for it. By introducing Google+, Google Hummingbird and authorship markup, Google has made its intention clear.

Everything you create on the web, whether on your own blog or on others’ blogs, will be attributed by Google toward your Authorship Rank. Spammy authors will likewise be punished with a reduced AR.

7. Data Validation: Google treats its Knowledge Graph, and to some extent Wikipedia, as sources of correct facts and figures. If your article contradicts their data, it may be ranked lower. On the other hand, if you contribute to Google’s Knowledge Graph by providing high-quality, authentic data on your web page, Google may rank your content higher.

8. Social Signals: Google+ will play a major role in the coming days by providing social signals for ranking content. Your brand or website should have its own Google+ page, and if it performs well, your brand may one day become part of the Knowledge Graph.

In decreasing order of authority, social signals will likely run Google+, then Twitter, then Facebook, and so on.

9. Advanced Link Analysis: Google Hummingbird will fulfill Google’s long-cherished wish of assigning different values to different kinds of links. Google already does this, but it will become more sophisticated in the coming days.

The value of a link in the sidebar, in the footer, within the article body and inside the author box will be calculated more accurately.

10. User Experience: Time on page, CTR and user interaction with the web page will matter more for ranking under Google Hummingbird. Along with quality content, work on your website or blog theme, navigation, readability, contrast and so on, for a better user experience.

Dear friends, greetings from Google Hummingbird! ;)

Why the warm greeting? Because you love to create content and always stick to white hat SEO techniques. With the introduction of Google Hummingbird there will be no place for spammers in the future, and bloggers and content marketers with good intentions will be richly rewarded for their hard work.

This is Akash KB, signing out from AllBloggingTips.com. Till the next post, happy blogging :)

Don’t forget to shoot over your comments and questions; I’ll be right here to answer them ;)

The Beginners Guide to Creating Profitable Niche Sites with Alexa

Earlier, we explained in great detail how to find long-tail keywords you can use to build profitable niche sites that keep earning money. So today, I’d like to expand on that topic by showing you exactly the process I’d follow if I were using Alexa for niche site creation.

Now, let’s assume I want to start.

There are three significant things I’ll do first:

  • Run sites through Alexa.com to find long-tail keywords
  • Do keyword analysis on the long tails found
  • Build a niche site

To make it easier to follow, I’ll expand on each point below.

Run Sites Through Alexa.com to Find Long-Tail Keywords

This step involves finding sites in the niche you’re trying to enter so you can pull long-tail key phrases from them through Alexa. If you were planning to start a tech niche site, for example, finding established tech sites would be your priority so Alexa can show you long-tail keywords in that niche.

Where are the best places to look for sites in different niches?

  1. Technorati
  2. DMOZ

So let’s say I want to start a niche site in the “writing tips” niche. The first thing I’ll do is find “writing tips” sites through those directories.

But for the sake of this example, I’ll use one site to show you how to do yours effectively.

Once I’ve found a site, www.productivewriters.com, which is in the writing tips niche, I’ll quickly head to www.alexa.com, enter the URL and hit search.

[Screenshot: searching for the site on Alexa.com]

A page will load.

I’ll go ahead and click on the URL in the results.

[Screenshot: Alexa search results showing the site URL]

This takes me straight to the main analytics page for productivewriters.com, where the long-tail keywords are located.
Once there, I’ll scroll down and find the section titled “Top Keywords from Search Engines” (as shown in the screenshot below).

[Screenshot: Alexa’s “Top Keywords from Search Engines” section for productivewriters.com]

What you’re looking at are the top search phrases sending the most traffic to productivewriters.com. These are sometimes very profitable key phrases that make good starting points for niche sites.

These are the search keywords (culled from the above screenshot):

  1.  Ways to backup your files
  2. How to backup files on my pc
  3. How can I write articles faster
  4. Freelance writers den
  5. Auto response sales email

So now I’ve got five not-too-popular long-tail key phrases in the niche I want to enter. The next step is to find out which of these keywords are profitable and reasonably easy to rank for.

Do Thorough Keyword Analysis on the Long Tails Found

The beauty of a long-tail keyword is that it’s usually not too competitive or hard to rank for.

Under normal circumstances I would do keyword analysis on each of the keywords above so I could choose the best one for a niche site.

But for the sake of this example, I’ll use the number 2 long tail keyword:

How to backup files on my pc

To find out whether this keyword is competitive, I’ll first run it through Google to see which sites hold the top positions for it.

Here are the sites currently ranking for the keyword –

[Screenshot: Google search results for the keyword]

Now that we know which sites are ranking for “how to backup my files on pc,” the next thing is to check why those posts rank at the top for the key phrase.

Since we know Google uses metrics like backlinks pointing to a post and site authority to rank pages in the SERPs, it makes sense to find out how many backlinks each post has.

I’ll use www.opensiteexplorer.com and www.ahrefs.com to check the backlinks of the posts on those sites.

To check the backlinks each post has, I’ll open the first 4 posts in new tabs (I believe the first 4 results in the SERPs are the most important ones to pay attention to if you want to outrank them and claim the number 1 spot).

(These are the 4 top ranking posts for “how to backup my files on pc”)

[Screenshot: the four top-ranking posts for the keyword]

After opening them in new tabs, I’ll copy the first-ranking post’s URL and paste it into Ahrefs and Open Site Explorer to check how many backlinks point to it.

I’ll then do the same for the other three post URLs.

Here is the breakdown of their backlinks according to their position in the SERPs:

Position in SERPs: 1st
Title: Set Your Mind At Ease: Back Up Your Files Now – Windows
URL: windows.microsoft.com/en-us/windows-vista/set-your-mind-at-ease-back-up-your-files-now
Open Site Explorer: 3 backlinks
Ahrefs: 2 backlinks

Position in SERPs: 2nd
Title: 5 Ways to Back up a Computer
URL: www.wikihow.com/Back-up-a-Computer
Open Site Explorer: 39 backlinks
Ahrefs: 76 backlinks

Position in SERPs: 3rd
Title: How to Back up Your Computer to an External Drive
URL: www.lifehacker.com/5816453/how-to-back-up-your-computer
Open Site Explorer: 145 backlinks
Ahrefs: 337 backlinks

Position in SERPs: 4th
Title: How to Back up Your Files for Dummies
URL: https://www.youtube.com/watch?v=EwKMx-4YfsU
Open Site Explorer: 1 backlink
Ahrefs: 1 backlink

What you’ve just looked at is the backlink profile of the 4 top posts ranking for the keyword “how to backup my files on pc,” and the striking thing is that none of the posts has more than 400 backlinks pointing to it.

So these posts are basically ranking at the top of the search engine because of the authority of the sites hosting them.

This discovery means that ranking at the top for that key phrase shouldn’t be too hard, given the low backlink profile of each post currently ranking for it.

With that said, I can go ahead and build a niche site around that keyword; if I can get a few high-authority blogs linking to my niche site, I stand a good chance of claiming the top of the search results for that term.

Build a Niche Site for That Keyword

To build a profitable niche site for that keyword, it would be wise to find an exact-match domain containing the key phrase “how to backup my files on pc”; a domain like www.howtobackupmyfilesonpc.com is a good fit. To find out whether it’s available I’ll run the domain through GoDaddy. And luckily, it’s available.
[Screenshot: GoDaddy domain availability check]

From this point, I can go ahead and build a site on that domain (assuming I’ve bought it).

The next task is to build backlinks to the site once I’ve set it up and added enough content.

If I keep this up consistently, my niche site should reach the top for that search term in no time.

And finally, once the niche site has reached the first page of Google for the search term “how to backup my files on pc,” I can find affiliate products for backup software to promote on the site, which should bring in good affiliate income over the long run.

These are some of the affiliate companies you should know about if you want to start a niche site soon using the Alexa method from this post –

Once you’ve reached the first page of Google for your own key phrase and are getting steady traffic, you can promote whichever of these companies’ products fit your key phrase and earn some side income from your niche site too.

In conclusion

Alexa, like I said before, shouldn’t be ignored as a long-tail keyword finder. When you use it and find promising keywords, make sure to do thorough keyword analysis on each long tail before you build a niche site around that key phrase.

Remember: do your keyword analysis thoroughly so you don’t enter the wrong niche.

So finally, before I go, I’d like to ask: “Which niche are you currently targeting for your new niche site?”

Let’s discuss it below in the comments. Don’t forget to share.

Thanks for reading.

How Google’s Disavow Links Tool May Help You Get Rid of Bad Backlinks

Bad backlinks and their impact on SEO have become a concern for many webmasters and site owners. The Penguin update was built to penalize low-value backlinks, and the situation will only intensify with the upcoming Penguin 2.0. Site owners may be justified in crying foul, as they often have no control over links from external websites. A competitor could deliberately point thousands of spam links at your site to trigger penalties, a practice known as negative SEO.

Even sites that are not targeted by such spam links may still be penalized for past link building practices that don’t conform to Google’s Webmaster Guidelines, if they fail to have those links removed.

[Image: Google Disavow Links tool graphic]

Fortunately there is a way for webmasters to resolve the matter, and that’s by using Google’s Disavow Links tool. Basically, the Disavow Links tool lets site owners submit a file through Google Webmaster Tools listing the links they want Google to ignore when valuing their site’s link profile.

Say you have 5 good links pointing back and one bad one. With the Disavow Links tool you can ask Google to ignore the bad link, so only the 5 good links are considered when your site’s links are valued.
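
As a minimal sketch, the disavow file itself is just a plain text file, one entry per line, that you upload through the Disavow Links page in Webmaster Tools (the file name and domains below are made-up examples):

    # disavow.txt – submitted through Google's Disavow Links tool
    # Lines starting with # are comments and are ignored.

    # Ignore one specific spammy page that links to my site
    http://spam-example.com/paid-links-page.html

    # Ignore every link from an entire domain
    domain:link-farm-example.net

Only list links you are confident are harmful; as noted below, disavowing a whole domain also throws away any good links that domain may have given you.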

This has come as a great relief to webmasters and site owners struggling to remove low-value backlinks, and with Google’s Penguin 2.0 set to roll out it will be increasingly important for site owners to deal with poor-quality links. However, there are some things to bear in mind when using this feature.

Note that if you submit links incorrectly and disavow non-malicious links, your site could drop in the search rankings. For example, if you disavow links by domain name instead of by individual URL, you risk discounting links from good sites that may have contributed to your site being where it is.

Use Disavow Only In the Event of An Attack

Given the potential to harm your search rankings, site owners should only use the Disavow Links tool once they’re sure they’ve fallen prey to negative SEO, or when backlink issues can’t be resolved directly with other site owners.

Note that separating harmful backlinks from beneficial ones can be a hectic process that devours time and effort. Services such as Majestic SEO charge $49.99 to retrieve your entire link profile, and from there it’s up to you to determine which links are worth removing and which are not. Bear in mind that certain malicious-looking links may actually be valued by search engines, depending on how they are placed on the referring site.

Google hasn’t given a solid explanation of how it uses data obtained from the Disavow Links tool. Although it may remove reported links from your site’s profile, it also reserves the right to review your submission independently and decide which links to value or devalue.

The links you submit may also serve as supporting evidence if you later ask Google to reconsider your site after a search ranking penalty. Given the potential harm of using the tool incorrectly, it is often best to use the services of a qualified SEO professional instead of attempting to troubleshoot and repair damage from SEO attacks on your own.

If you want to know more about using the Google Disavow tool in detail, read this.

Did I miss any points? Are you using the Google Disavow tool? Does it really work for you?

Why You Should Use Google Webmaster Tools

Google Webmaster Tools is an online tool used by millions of bloggers to check statistics about their website or blog. It lets us see almost all the stats related to how the site performs in Google search results.

Webmaster Tools is used by many bloggers and webmasters, but most of them don’t know what it is or what it is capable of.

This article will guide you through what can be done with this free online tool provided by our favorite search engine, Google.

This tool does not show a blog’s traffic stats; instead, it shows how many clicks each of your blog’s links receives in the search results. Traffic and other related stats are shown in Google Analytics, another free service from Google.


It is a must for every blogger to use this service as it helps us to improve our site and its performance in the search engines.

Sitemap Submission

Webmaster Tools is the main way to submit a sitemap to Google. Submitting a sitemap is an important factor in building up your blog’s search ranking. A sitemap is a list of all your posts and pages; it helps the search engine index your blog and all its links easily. Otherwise, the crawler can miss some of your posts.

For higher rankings and traffic, adding a sitemap through Webmaster Tools is a quick win compared with many longer SEO processes. Sitemap submission is an easy way to get a lot of optimization with very little effort.
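
If your blogging platform doesn’t already generate a sitemap for you, it is simply an XML file listing your URLs. A minimal sketch (the URLs and dates below are placeholders) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per post or page on your blog -->
      <url>
        <loc>http://www.example.com/my-first-post/</loc>
        <lastmod>2013-11-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/about/</loc>
        <lastmod>2013-10-15</lastmod>
      </url>
    </urlset>

Upload the file to your site (for example at /sitemap.xml) and submit its URL in the Sitemaps section of Webmaster Tools.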

Search Queries

This section of Webmaster Tools shows the search terms that are performing well for our blog. It lists the queries bringing us the most organic traffic. In my case, my highest performing keyword is still ‘QADABRA REVIEW’, with a good average position between 1 and 4 in the search results.

I only discovered this long after the fact, when I started using Google Webmaster Tools. Once I knew this keyword was ranking high, I optimized that post again and again until it reached the first position. The position still fluctuates at times.

Crawl Errors

This is one of the most useful tools in Webmaster Tools. Sometimes we miss pages that no longer exist and return a 404 error. People who land on such a page will most probably hit the back button, so you lose a lot of targeted visitors through this error. Google Webmaster Tools lists all the pages and posts that are returning 404 errors.

That makes it easy to fix the problem and redirect the page if it no longer exists. Once again, this wonderful tool helps us keep targeted visitors who would otherwise have left our blog.
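
As a small sketch for self-hosted blogs on an Apache server (the URLs below are placeholders), a broken URL reported under Crawl Errors can be pointed at its replacement with a 301 redirect in the .htaccess file:

    # .htaccess – permanently redirect the old, missing URL to the new post
    Redirect 301 /old-deleted-post/ http://www.example.com/updated-post/

If you are on WordPress, a redirection plugin achieves the same thing without editing .htaccess by hand.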

Backlinks

Google Webmaster Tools also helps you find out how many backlinks your blog has obtained. All the links pointing to your website or blog from around the internet are listed here with full details.

Backlinks are necessary for a blog to become popular. Guest blogging, commenting and similar activities are good ways to earn a decent number of backlinks.

Malware Checking

Malware is a virus or other malicious code that can harm your blog as well as the people who visit it. Malware should be removed as soon as it is spotted, but to spot it you will need Google Webmaster Tools or another malware checker available on the net.

A malware-affected blog can drop to zero visitors overnight. So it is important to check whether your blog contains any malware scripts in its widgets or anywhere else that search engines can easily crawl.

Structured Data

Structured data is a way of telling search engines what your blog is about and how its pages and posts are categorized. You might have noticed that some search results have a richer appearance, which increases the attention those results receive.

Again, Webmaster Tools is very useful for setting up structured data markup for your blog or website. Structured data markup can be managed through the Data Highlighter: if your website doesn’t have structured data yet, you can create it using the Data Highlighter.
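
If you prefer to mark up your templates directly instead of using the Data Highlighter, here is a minimal sketch using schema.org microdata for a blog post (the title, author and date are placeholders):

    <article itemscope itemtype="http://schema.org/BlogPosting">
      <h1 itemprop="headline">My Example Post Title</h1>
      <!-- Tells search engines who wrote the post and when -->
      <span itemprop="author">Example Author</span>
      <time itemprop="datePublished" datetime="2013-11-01">November 1, 2013</time>
      <div itemprop="articleBody">
        The post content goes here...
      </div>
    </article>

You can check the markup with the Structured Data Testing Tool mentioned in the Google Authorship section below.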

HTML Improvement

Google Webmaster Tools also has a feature that highlights problems in your HTML, whether you run a custom website or a Blogger template. We should all try to improve our websites as much as we can.

Using Webmaster Tools therefore helps you improve your site’s HTML, and with it the whole website. Even a small mistake that escapes your eye will show up in this section of Google’s free service.

Google Authorship

[Screenshot: author photo shown next to a result in Google search]

Google Authorship is a feature that shows the author’s photo next to their articles in the search results. This attracts the attention of people who are searching, and your photo becomes more recognizable. Google Authorship is obtained by linking your Google+ profile to your web page, and vice versa, using a tag.
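
As a minimal sketch (the profile URL is a placeholder), the page-side half of that link is a rel="author" tag pointing to your Google+ profile; the other half is adding your site under “Contributor to” on the Google+ profile itself:

    <!-- In the <head> of your blog, or in each post's author byline -->
    <link rel="author" href="https://plus.google.com/110000000000000000000/" />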

If Google Authorship isn’t working, you can visit the Structured Data Testing Tool option under Additional Tools in Google Webmaster Tools. There you can check whether authorship is working for any web page on the internet. This is another great feature of Google Webmaster Tools.

And don’t forget that showing your photo in the search results can gradually increase your website traffic!

Final Words

These were some of the important features of Webmaster Tools. There are many more that can help you turn a blog into a hit by optimizing it for search engines as well as readers. Happy blogging!

Why Aren’t Your Blog Posts Shared and Commented On?

Many newbie bloggers, and some experienced ones, wonder why their blog posts aren’t shared or commented on by their audiences.
There are several reasons that determine whether readers share and comment on your articles. The points below cover some of them.

Why Aren’t Blog Posts Shared and Commented On?

#1 Your articles aren’t unique, quality content

Most people love to read educational articles with fresh, up-to-date information. They want to learn something new that adds to their experience and expertise.

You know?

They will want to share valuable information and educational articles with their friends, colleagues and families, since it helps everyone develop their expertise. If your posts don’t offer that value, that is a simple reason why they aren’t shared or commented on at all.

Readers will enjoy and share your articles if you have written a quality post that delivers those learning benefits.

#2 You have no personal relationships

How many friends do you have in your social media connections? And do you have a strong relationship with them?

Well, personal relationships are a major reason why blog posts do or don’t get shared and commented on. I have 2k+ followers, 2k+ fans and 1k+ friend connections, and importantly I have built strong relationships with those people.

Whenever I share something on my social media profiles, they help spread it by sharing and commenting on the posts, making it popular. So building stronger relationships will solve the problem and answer the question of why your posts aren’t shared or commented on by other people.

I suggest you spend at least 30 minutes to an hour daily in your community, connecting with other friends and chatting with them socially.

  • Tips for Building Real Connections to Help Your Blog Succeed

#3 You share unattractive posts

Some bloggers share only bare links, which leave other people confused and wondering what’s inside. A link on its own rarely attracts other social media users.

Nowadays most people are very busy with their jobs, so they won’t waste time clicking on a link that shows no sign of valuable information.

Time constraints are another important reason why blog posts aren’t shared or commented on. Most people prefer to keep working and don’t want to interrupt their work for something of no obvious benefit, such as a bare link.

To get those people to give up their valuable time and click your link, you have to add a call to action, such as an attractive image and a brief note about the valuable content inside the article.

#4 You don’t participate in social media groups

Social media strategy is like WTO trading: traders without membership gain less benefit from international trade.

For example, there is a group called CommentsDX where members exchange comments with each other and share each other’s posts. When you ask why your blog posts aren’t shared or commented on, ask yourself whether you have joined any networks or groups that help increase blog comments and shares.

#5 You never share other people’s posts

When I first started integrating social media into my marketing strategy, I was afraid to share other bloggers’ and competitors’ posts.

I worried that my audience would move to other bloggers or to my competitors’ websites. But absolutely not: sharing other bloggers’ or competitors’ posts makes your audience love you and enjoy a bigger pool of information across your social profiles.

Sometimes your posts will be shared and commented on by others simply because you regularly like and comment on theirs.

Now it’s your turn

Thanks for reading my tips on why blog posts aren’t shared or commented on. These may not be unique ideas for increasing shares and comments, but they still work, and I believe they will benefit newbie bloggers implementing a social media strategy.

Did I miss any points? Are you getting good comments and shares on your blog posts?

Do share with us below.