Google Panda Ranking Factors, Recovery and Other Implications

Google Panda is no longer a separate algorithm for judging content quality. It is now part of Google’s core ranking algorithm. In case you haven’t heard the news, here’s the link to the update.

It’s been a few days since Google announced it, and SEOs are confused already.

The confusion is nothing new; it has persisted in the industry for years because Google has never disclosed Panda’s ranking factors. The latest announcement takes it to a new level. Two things we know for sure about Panda: it grades content on quality, and it’s

Not real time

Back in July, when news of the refresh broke, some speculated that it was a real-time update, chiefly because of its extremely slow rollout. Webmasters failed to reach a consensus on whether the update was real time.

But Google’s Gary Illyes has recently made clear that Panda is not real time.

What this means is that neither Google’s checks nor webmasters’ updates will happen in real time. Beyond these two things, we don’t know much. Some webmasters are telling us to expect

Silence from Google

Google has always been tight-lipped about its core ranking algorithm. Does that mean we won’t hear anything about Google Panda ranking factors in the future? If that’s indeed the case, then webmasters, especially those whose sites have been hit by Panda, have reason to worry.

That’s because Google Panda recovery has hitherto been manageable thanks to information provided by Google. For instance, Google’s representative Pierre Far shared plenty of detail when the Panda 4.1 update was released in September 2014. But in the future, Google may not disclose anything, making Google Panda recovery hard.

Panda will be regular

This isn’t speculation; it’s been confirmed by Google’s John Mueller. The following is what he said during a Google Hangout session:

“It’s something where we try to look at the quality of the website and understand which ones are higher quality, which ones in general are lower quality, and take that into account when ranking the site. This is essentially just a way of kind of making those updates a little [bit] faster, a little bit more regular.”

Even though he added that this is his expectation and he’s not 100% sure, given how often his tips have turned out to be true in the past, we have good reason to believe him.

What differences will it make?

More penalties

After speeding up and becoming regular, Panda may hit more sites than it did in the past. Webmasters fear Google Panda ranking factors because they don’t fully understand them. Those who got hit did their best to recover, and those who didn’t became afraid to play the game without knowing the rules, resorting to safe, industry-approved techniques.

If Panda becomes regular, this fear may harden and become permanent. Subservient marketers may feel scared to venture out of the outdated SEO frame, and the daredevils may get penalized. The regularity of the update might also make Google Panda recovery difficult, because webmasters may not get enough time to recover from a hit.

Search query percentage

All the Panda updates released so far have affected specific percentages of English-language search queries. Whenever Google officially admitted the release of a Panda version, it stated what percentage of search queries would be affected by the update.

On a technical level, the information didn’t help webmasters much. But it helped them understand the reach of a particular update. For example, many webmasters breathed a sigh of relief in July on learning that Panda 4.2 would affect only 2-3% of search queries.

If this information remains undisclosed, they’ll have a difficult time gauging which search queries are affected.

Quality guidelines

We know Panda deals with content quality. But Google doesn’t seem satisfied with Panda alone. In May 2015, Google released an update, later confirmed as a sequel to the 2013 Phantom update, focused on quality. This indicates that after making Panda part of its classified core algorithm, Google may in the future lay down quality guidance via Phantom.

Details on how Phantom assesses quality are not available, and the industry is busy speculating. One of the popular opinions is that “How-to” posts are given a better quality score than other types of posts.

My suggestion to content marketers is to keep following the content marketing tips you’re following now until they start to hurt you. There’s no harm in writing “How-to” posts. In fact, it’s recommended, because such posts are widely read and trigger scores of shares. Alongside that, work on your site’s layout and make it mobile-friendly. Gauging the Google Panda ranking factors could save you from possible hits and make Google Panda recovery measures unnecessary.

What do you think of the article? Do you agree with our analysis? Do you want to differ with us? Why? Let us know what you think in the comment section below.

Is This SEO Agency The Right Choice? Tips to Hire a Right SEO Company

Anyone who deals with search engine optimisation knows that they need help. Sorry to dent your ego, but it isn’t embarrassing to admit that you aren’t an expert. What is shameful is pretending you are and running your business into the ground.

Companies outsource marketing for a reason: it’s a specialist subject. If you’re not an expert, you shouldn’t be in charge, and it’s that simple. But, you should choose a partner that can fulfil your needs and take the strategy to the next level.

Tips For Hiring The Right SEO Firm

To make sure you pick wisely, here are the questions you must ask and what they reveal.

Can You Guarantee Results?

There isn’t a company on the planet that doesn’t want to back a sure thing for obvious reasons. But, no matter what a firm says, there is no way to guarantee anything. They can, of course, show you a plan of action which will increase the chances of a higher ROI. And, they might use the term ‘guarantee’ as a means to show off their confidence. However, you should always take it with a pinch of salt. Instead of empty words, a business like yours needs to look for a deal that provides results through their actions.

How Long For Results?

Now, this is a tricky question because you want to hear something along the lines of immediately. After all, everyone is impatient and wants results as soon as possible. The thing is that SEO is a long-term game. To win the game, you need to be patient and wait for your strategy to gain traction. Otherwise, your short-term plan will burn out and you’ll be up a particular creek without a paddle. Quality agencies know this, so they should look to the future instead of the present. It might seem like they are stalling, but they’re just pragmatic.

Does One Size Fit All?

The answer is a firm no. It doesn’t take a rocket scientist to understand that businesses have certain unique requirements. And, the agency has to meet these needs for guaranteed SEO results. Although it sounds simple, it’s amazing how many organizations use the same tactics over and over again. They can use excuses like they are going with what they know, but it’s just lazy. Any SEO company worth their salt knows that they need to tailor a strategy to the individual firm. If they can’t show evidence of this, they shouldn’t be in your considerations.

How Is SEO Different In 2017?

Nope, this isn’t a trick question. In fact, it’s a very real and important one that requires a detailed answer. Simply put SEO changes and evolves every year. So, what was successful in the past isn’t as effective any longer. An ‘expert’ should know this because it’s their job to provide a platform for success. By testing their knowledge, you can gather whether they are faking it or if they indeed know their stuff. It goes without saying that you also need to know how SEO has changed in the past decade.

These questions aren’t exhaustive, but they are four of the most important and revealing.

Everything You Need to Know About Google Possum Update

Based on analysis of dozens of ranking reports, the community of SEO experts announced another massive Google algorithm update on September 1, 2016. One of the industry experts, Phil Rozek, gave it the name “Google Possum update”.

So below we are going to briefly discuss what the Google possum update is.

A massive change was noticed on September 1: variation in the rankings of the Google 3-pack and local finder results (a.k.a. the local results or Google Maps results).

Google Possum Update

Here is what the Google Possum update actually was.

After the update, industry experts noticed a massive change in Google local business rankings.

According to Joy Hawkins (a leading name among SEO industry experts),

“Google’s aim with this update was to diversify the local results and remove duplicate and spammy business listings from Google’s local pack results.”

With the change in the algorithm, Google now uses a proximity test to diversify its local search and rank quality business listings, whether the business is outside the city or in the heart of it.

“Direct Inspections” is a perfect example: they badly wanted to rank for the keyword “home inspector Sarasota”, but because they were based outside the city, Google’s algorithm at the time kept them out of the top 10 for that keyword.

Possum Update local pack

But after the Possum update, their ranking jumped from 31 to 6. That is a striking example of the Google Possum update at work.

Google started filtering local results on the basis of address

As we saw before, the local SERPs (search engine results pages) sometimes contained several businesses with the same address but different websites and contact information. After this update, we should not see this type of spam again.

Indeed, Google updated its algorithm to list only legitimate businesses, and it now also filters on the basis of address.

It should be noted, though, that two businesses can use the same address if they are genuinely different businesses.

We can now safely say that Google is more sophisticated than before.

Now the searcher’s location also matters

This is simple to understand. Suppose you’re a business owner with several branches, and your head office is in Texas. One of your branches is in Florida and ranks 1st for the keyword “web-development company in Florida”.

After the Possum update, if you search the same keyword from your head office in Texas, you will find that your rank has dropped in the local business results.

Because of the Possum update, the local business results page shows different results for different searcher locations.

So if a searcher queries “web-development company in Florida” from within Florida, they will find the same results as before.

Results are highly sensitive to small keyword changes

Undoubtedly, that is another big effect of Google’s Possum update. If you search keyword variants with small changes, you will now find different result pages for every single change.

For instance, if someone Googles “attorney in San Diego” and “attorney in San Diego, California”, the two result pages may be slightly, or even dramatically, different.

Local listings operate independently of the organic filter

As we saw before the update, local listings didn’t contain any results linked to an organically filtered site.

After this update, the local list is no longer affected by links to organically filtered sites.

Since the update, it can easily be seen that many businesses rank high for competitive keywords even while linking to pages that are organically filtered.

Google Panda is Now Included in the Ranking Algorithm

Google Panda Ranking Algorithm

Google Panda is no longer a separate algorithm used by Google exclusively for the purpose of elevating or decreasing a site’s ranking. Reports confirm that Panda has been officially folded into Google’s ranking algorithm.

Jennifer Slegg reported this yesterday. This is what a spokesperson from Google told her:

“Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals.”

Google’s Gary Illyes corroborated this report in a tweet. Panda 4.2 was released by Google in July 2015. After analyzing it, experts concluded that it’s a continuous update, meaning it will keep affecting a site’s ranking over time.

There have been a number of Panda updates so far. Panda 4.2 appears to be the last on the list.

Relevant questions

Some questions remain unanswered. The last thing we knew about Panda 4.2 was that it’s continuous and takes several months to fully roll out.

Will Google’s ranking algorithm receive real-time updates from now on? Barry Schwartz of Search Engine Land dismissed this possibility, saying he came across a massive update and received confirmation from Google that it was a core ranking algorithm update, rolled out over the weekend. When the consolidation happened is currently unknown.

Schwartz speculated that it happened in late 2015. He initially couldn’t show any telling evidence for his view that the update is not real time, but later got a statement from Gary Illyes. This makes sense: the update’s incredibly slow rollout had made webmasters suspicious about its nature, and some had opined that it was part of the core ranking algorithm, which is why it was taking so long to roll out.

Takeaways for digital marketers

So far, Google has evaluated on-site content quality separately with Panda. From now on, it will be one of the parameters for ranking. I see this as the parameterization of content quality.

We’ll keep our eyes open and our ears sharp. If there’s any update on this, we’ll share it here with you.

Image Credit: http://www.searchengineland.com

3 Takeaways for Google Panda 4.2 Update

Google recently updated its Panda algorithm, rolling out Panda 4.2 in mid-July. Here are some takeaways for you.

panda 4.2 update

>> Panda 4.2 is a site-wide action, but because of the very slow, page-by-page rollout, different pages of a website may show different results even when the site is hit.

>> This is the slowest-ever algorithm rollout by Google, taking several months. The reason given is technical, not an attempt to confuse webmasters or website owners.

>> There is nothing new about Panda 4.2, so the recovery process is the same as Google suggested during the first Panda update.

You have to wait several months until the whole rollout completes. Alternatively, you can check your analytics right away and look at the organic traffic changes for your website. According to Google, this refresh will affect a maximum of 3% of search queries, 2% lower than the previous refresh’s maximum of 5% in September last year.

Here are a couple more resources:

  • https://blog.seoprofiler.com/2015/08/google-panda-4-2-update-slow-technical-reasons/
  • http://contently.com/strategist/2015/07/31/3-things-content-marketers-needs-to-know-about-googles-panda-4-2-update/
  • http://www.thesempost.com/google-why-the-panda-4-2-roll-out-is-so-slow/
  • http://www.inc.com/peter-roesler/why-you-should-care-about-google-s-panda-4-2-update.html
  • http://www.brafton.com/news/seo-1/panda-4-2-big-marketing-mystery

Google Structure Shakeup, Now Alphabet Inc Is the New Parent Company

Google is no longer the parent company of all Google products and ventures. They now have a new parent company, Alphabet, which holds ultimate control, with Larry Page as CEO and Sergey Brin as President. Along with the new name, many of Google’s businesses become separate entities under Alphabet, while the core internet business stays as Google Inc.

Larry Page explains that they liked the name Alphabet because it means a collection of letters that represent language, one of humanity’s most important innovations, and is the core of how Google indexes with search. They also like that it means alpha-bet (alpha is investment return above benchmark), which they strive for.

Businesses that will stay a part of Google:

  • Search
  • Apps
  • Advertising
  • YouTube
  • Maps
  • Android

Shareholders will get one Alphabet share for every Google share they previously owned, while Alphabet includes the following entities, each managed separately:

  • Google Inc.
  • Calico – biotech research
  • Nest – Nest Thermostat and other smart home products
  • Google Ventures and Google Capital
  • Google X – self-driving cars and delivery drones

Larry Page the founder of Google wrote, “Our Company is operating well today, but we think we can make it cleaner and more accountable. So we are creating a new company, called Alphabet  (http://abc.xyz). I am really excited to be running Alphabet as CEO with help from my capable partner, Sergey, as President.”

You can read the whole article here: G is for Google.

In the meantime Chennai, India-born Pichai Sundararajan, better known as Sundar Pichai has been appointed as the new CEO of Google Inc.

Larry also wrote, “I feel very fortunate to have someone as talented as he is to run the slightly slimmed down Google and this frees up time for me to continue to scale our aspirations. I have been spending quite a bit of time with Sundar, helping him and the company in any way I can, and I will of course continue to do that. Google itself is also making all sorts of new products, and I know Sundar will always be focused on innovation — continuing to stretch boundaries.”

So it is clear now that with Alphabet, Larry plans to retain control over overall communications and the future growth of the business.

3 Results Instead Of 7 – Google Updates the “Local Pack”

Yes, the 7-pack is now the 3-pack. Google has made it shorter and redesigned it to fit better with the mobile user interface. Given the ever-growing importance of mobile devices, the change in the local SEO SERP is largely about mobile responsiveness.

Here is a picture of the new local pack, displaying only three results:

google 3 pack result

Here is a screenshot showing what the local results looked like a week ago.

Google 7 Pack Removed

Ah, bad news for the websites that occupied places 4-7: they will now see fewer sales or leads from the keywords for which they used to appear in 4th position or lower.

Google explains it as “We are constantly exploring the best way to bring a better search experience to our users. This update provides people with more relevant information, including photos, reviews and prices, for searches that have multiple results for a given location.”

Mike Blumenthal has a great post here: THOUGHTS ABOUT THE NEW LOCAL STACK DISPLAY.

You can find some more valuable advice here by Jennifer Slegg: Google Local Shakeup: 3-Pack Only, 7-Pack Removed; Addresses & Phone Numbers Gone

Another great post by Brian: Google goes to Local 3 Pack

My take on the change: with this update, competition for the three spots is going to heat up local search engine optimization, and digital marketers will have to rework their local SEO strategy.

Share your comment.

Google Issued Warning on Blocking GoogleBot for JavaScript & CSS

Back in 2012, Google warned websites against blocking its access to their CSS and JavaScript files, asking that those files be kept open to Googlebot. Here is what Matt Cutts said when telling webmasters not to block CSS from Googlebot.

Although most webmasters ignored it then, Google’s recent email warning through the Webmaster Search Console could open a new discussion. This time Google is very serious, making the message loud and clear with notifications via email and the Search Console: blocking JS and CSS files prevents Googlebot from crawling your website’s CSS and JavaScript, which Google needs in order to serve its users better results now that it considers the user-friendliness of a website a ranking factor.

Though warnings about suboptimal rankings are nothing new, the Search Console notifications are important to take seriously. If you don’t want to lose your rankings, you have to let Googlebot access your CSS and JavaScript files.

Here is a picture of the notification as received:

Google Search Console Warnings Issued For Blocking JavaScript CSS

If you received this notification, follow the instructions in the email to diagnose the issue, and/or use the Fetch and Render tool within Search Console to see what Google sees and which resources are blocked by your robots.txt file.

As far as I can tell, webmasters received this warning this morning, and it is believed that the JS and CSS notification was sent out to a huge number of them. I received the same notice for 10 of my client websites where I had blocked their JS and CSS files.

But do not panic; there is not a lot to do on your website to fix it. Just fix your robots.txt file, and everything will be fine as soon as the modified file allows these assets to be crawled.

You should still act quickly. To unblock the JavaScript and CSS assets of your website, update your robots.txt file with the code below:

# Let Googlebot fetch script and stylesheet assets anywhere on the site
User-Agent: Googlebot
Allow: /*.js
Allow: /*.css

Once you have this in place, you can test your setup with the Blocked Resources feature in Search Console.
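If you want to sanity-check your rules before Google recrawls, the longest-match logic Google documents for robots.txt can be sketched in a few lines of Python. This is a simplified illustration, not Google’s actual parser: it assumes rules are given as lowercase ("allow"/"disallow", pattern) pairs and only handles the documented `*` wildcard and `$` end anchor.

```python
import re

def rule_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any run of characters, a trailing '$' anchors the end,
    # and matching always starts at the beginning of the URL path.
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.compile(body + ("$" if anchored else ""))

def is_allowed(rules, url_path):
    # rules: list of ('allow' | 'disallow', pattern) tuples.
    # Per Google's documented behavior, the most specific (longest)
    # matching rule wins, and on a tie an Allow beats a Disallow.
    matches = [(len(pattern), directive == "allow")
               for directive, pattern in rules
               if rule_to_regex(pattern).match(url_path)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    return max(matches)[1]

# A typical CMS-style Disallow alongside script/stylesheet Allow rules:
rules = [("disallow", "/includes/"), ("allow", "/*.js"), ("allow", "/*.css")]
print(is_allowed(rules, "/themes/main.css"))  # True: only the Allow matches
print(is_allowed(rules, "/includes/nav.js"))  # False: the longer Disallow wins
```

The second call shows why a blanket `Allow: /*.js` may not be enough: a longer, more specific `Disallow` still takes precedence for files nested under it.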

John Mueller of Google said this in a comment on his own Google Plus post: “We’re looking for local, embedded, blocked JS & CSS. So it would be for URLs that you can ‘allow’ in your robots.txt.”

CSS and JavaScript files will improve interaction with GoogleBots

The reason Google is so determined to crawl your website is to see it like an average user does. When you block CSS/JS access, Googlebot cannot see your website’s layout and will probably never know whether it’s user-friendly. As usual, Google wants to improve the quality of the content and services it provides to its users, and it wants to render and index websites that are user-friendly and have quality content. Blocking your CSS and JavaScript files will harm your website’s chances of better rendering and indexing.

Identify and Fix Indexing Issues on Your Website Through the Webmaster Console

Search-Console-robots.txt-Tester

Googlebot is the web crawler Google uses to access a website’s content. Many webmasters consider CSS and JavaScript files useless resources and block them from being crawled; in fact, many CMSs block their include files by default. Having Google render and index your website lets you see how Googlebot sees your pages, which is exactly how your visitors will see them. This will help you improve your website’s content for better accessibility. But this can only happen when you allow Googlebot to fully understand your website.

Googlebot Crawling Doesn’t Bog Down Your Website, It Rather Improves It

Many webmasters doubt Googlebot’s ability to process their CSS and JavaScript files and fear that crawling them will increase bandwidth consumption and bog down their website. But Google has become much better at processing these files and won’t make your website sluggish.

Another common doubt is that, since Google is not adept at processing CSS and JavaScript content, it might misinterpret that content as something malicious and block it. But Google has improved here too, and the fear of being penalized because of your CSS and JavaScript content is unfounded.

How It can Impact your Website Ranking

If you think Google has become bullish, you might need to change that perception. If Googlebot cannot access your website’s content, it cannot render or index your pages, and eventually your website will have suboptimal rankings. To rank better, Googlebot needs full access to your content so that it can judge its quality and see whether it’s user-friendly.

Google’s warning about blocking CSS and JavaScript files might open new doors for improvement. Instead of seeing it as a threat, webmasters should allow full access for better user interaction and better rankings. You can find a related discussion at Stack Overflow, and don’t forget to comment here.

Google Refreshed The Google Cache Page

Google has given its cache landing page a design upgrade. If you check a Google cache page now, it is a little cleaner, with options to view the full version of the web page, the text-only version, and the page source, all in the same browser tab.

You can check your website now. To see the Google cache page for your domain, type cache:domainname.com into Google search.
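For convenience, the cache: query can also be generated programmatically. Here is a tiny Python sketch; the search URL format is the standard Google query URL, and the domain is just an example:

```python
from urllib.parse import quote_plus

def cache_query_url(domain):
    # Build a Google search URL that runs the cache: operator for a domain;
    # opening it in a browser shows the cached copy of the page.
    return "https://www.google.com/search?q=" + quote_plus("cache:" + domain)

print(cache_query_url("example.com"))
# https://www.google.com/search?q=cache%3Aexample.com
```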

https://digitalvani.com/wp-content/uploads/2015/06/Google-Refreshed-The-Google-Cache-Page.png

Do you like the new look?

All You Need to Know about Google’s Upcoming Mobile-Friendly Update

For quite some time, Google has been known for its strict algorithm updates. Google is back with another algorithm update on April 21st: the mobile-friendly ranking update.

Zineb Ait Bahajji from Google said that the forthcoming mobile-friendly ranking algorithm, starting on April 21st, will have more of an impact on Google’s search results than the Penguin and Panda updates did.

Google’s Upcoming Mobile Friendly Update

Google will increase its use of mobile-friendliness as a ranking signal effective April 21. This change will have a major impact on search results, although it will only affect mobile searches (for the time being). Mobile users will now get search results that are optimized and relevant for their devices, and consequently websites that are optimized for mobile devices will get higher rankings.

Google stated that this change will benefit both mobile users and Google itself. It could give Google a boost against rivals such as Yahoo and Bing in the search market by providing a better experience for mobile users.

Google suggests taking the following steps to make your website mobile-friendly.

  • In order to identify any issues with your website when viewed on a mobile device, use Webmaster Tools to generate a Mobile Usability Report.
  • To see how optimized your website is for mobile viewing, take Google’s Mobile-Friendly Test. Check out exactly how Google’s own Googlebot views the pages when determining search results by testing a single page on your site or by testing several web pages.
  • Visit Google’s guide to mobile-friendly sites. Here you will get all the vital information on how to make your website mobile friendly.
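The Mobile-Friendly Test mentioned above can also be driven programmatically through Google’s Search Console URL Testing Tools API. The endpoint, request shape, and response field below follow the public API documentation as I understand it; treat them as assumptions to verify, and supply your own API key:

```python
import json
import urllib.request

# Endpoint of the Mobile-Friendly Test method in Google's URL Testing
# Tools API; verify against the current Search Console API docs.
API_ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
                "urlTestingTools/mobileFriendlyTest:run")

def build_request(page_url, api_key):
    # The API takes a POST with a JSON body naming the URL to test.
    body = json.dumps({"url": page_url}).encode("utf-8")
    return urllib.request.Request(
        API_ENDPOINT + "?key=" + api_key,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("https://example.com/", "YOUR_API_KEY")
# Sending it (requires a valid key and network access):
#   with urllib.request.urlopen(req) as resp:
#       result = json.load(resp)
#       print(result.get("mobileFriendliness"))  # verdict such as MOBILE_FRIENDLY
print(req.full_url)
```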

Also, when ranking search results, Google will factor in content from mobile apps, effective immediately. This new feature, called App Indexing, indexes app content and then displays those apps more prominently in the search results. Note that this only works when users have the website’s mobile app installed on their devices and are signed in to Google.

However, Google does not automatically index app content. For app content to be scanned and to appear on search results, webmasters will need to manually activate App Indexing. Google offers this step-by-step guide to App Indexing to help you get started.

This news is not great for firms whose websites are not optimized for mobile devices, but it is excellent news for companies with responsive websites. The algorithm change presents an opportunity for mobile-optimized firms to gain an advantage over competitors who have not yet optimized their sites for mobile.

If you want to get a step ahead of the competition, be sure your website is responsive and optimized for all devices as mobile search traffic will only continue to increase.