What is Google’s “Possum” Update?

On September 1, Google launched a massive local algorithm update that left some business owners confused. At first glance, it appeared that their Google My Business listings had vanished. This, however, was not the case. Their business listings had simply been filtered. This play-dead behavior is why many are referring to the update as “possum”: the listings are playing a classic game of possum.

After further review, it appears that this update affected ranking signals in the 3-pack and local results. It has been suggested that the update was launched to expand and diversify local results and to keep spam websites from establishing a high ranking signal.

One of the adjustments this update made was to Google’s filter for local results. Once the update launched, businesses located just outside nearby city limits saw an increase in their local ranking. Ranking for city-related keywords has typically been a challenge for businesses that fall outside the physical city limits, even when they pay for those keywords, because Google Maps can recognize when a business’ address is in fact outside the city limits.

This update also caused a lot of businesses to be filtered out of results because the address associated with the business was the same as another listing in the same category. What business owners have to keep in mind is that this filtering process is not a penalty. Google has not removed the listing(s). The filter simply picks the most relevant listing and filters out the rest that have similar information, much like an organic filter would. This type of filtering underscores that the physical location of the searcher is more important than ever.

With so much fluctuation still happening within search results, experts assume that Google is still A/B testing certain ranking signals. It is also quite possible that Google could revert to its previous behavior.

Do you want to learn more about the recent Possum algorithm update? Be sure to check out the webinar by Joy Hawkins, a local SEO expert and top Google contributor, which dives deeper into what this algorithm means for local search.

Google Updates AdWords Policy on Lending Products

David Graff, Director of Global Product Policy at Google, recently announced that an update to how AdWords handles lending products is set to go live on July 13, 2016. Certain lending product ads will soon be prohibited by Google, joining the likes of counterfeit goods, illegal drugs, guns and hacking services.

“We’re banning ads for payday loans and some related products from our ads systems,” Graff stated within his recent blog post. “We will no longer allow ads for loans where repayment is due within 60 days of the date of issue.”

What exactly is a payday loan? These loans are short-term and usually come due around the consumer’s next payday. Even though they are typically for smaller amounts, they carry an extremely high interest rate if the debt isn’t paid in full when it is due.

“A typical two-week payday loan with a $15 per $100 fee equates to an annual percentage rate of almost 400%. By comparison, APRs on credit cards can range from about 12 percent to 30 percent,” according to the Consumer Financial Protection Bureau (CFPB).
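To see roughly where that figure comes from, the fee can be annualized with simple interest (a simplified illustration, not the CFPB’s exact methodology):

```python
def payday_apr(fee: float, principal: float, term_days: int) -> float:
    """Annualize a short-term loan fee into an APR percentage."""
    period_rate = fee / principal          # e.g. $15 per $100 = 15% per term
    return period_rate * (365 / term_days) * 100

# A typical two-week payday loan: $15 fee per $100 borrowed
print(round(payday_apr(15, 100, 14), 1))  # → 391.1
```

That 391% result is the “almost 400%” APR the CFPB cites, compared with the 12–30% range typical of credit cards.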

Graff goes on to say that Google is also banning ads for loans with an annual percentage rate (APR) of 36% or higher. He said research has shown that these types of loans can result in unaffordable payment plans and high default rates for users.

“We will be updating our policies globally to reflect that,” Graff said.

Graff said the decision to end Google’s relationship with these lending products is meant to better protect users from deceptive companies offering harmful loan and credit card products.

“We’ll continue to review the effectiveness of this policy, but our hope is that fewer people will be exposed to misleading or harmful products,” Graff said.

This upcoming adjustment to the AdWords policy doesn’t mean you won’t be able to search for payday lenders. Lenders offering these types of loans will still appear in Google’s organic search results. The change simply means you will no longer see advertisements for these loan services at the top of the page.

Google Launches New Mobile-Friendly Update

On May 12, John Mueller of Google announced on Twitter that a new mobile algorithm had officially gone live in Google’s mobile search results. The discussion of this mobile-friendly update first surfaced back in March.

The objective of this recent update is said to be increasing the overall effect of the mobile-friendly ranking signal, giving mobile-friendly webpages a boost in mobile search rankings.

Even though this update has officially rolled out, the two points that Google made back in March still remain the same.

  • If you have already made your website mobile-friendly, you will not notice an impact to your website.
  • Search query intent still sends a very strong signal. So, if you have pages with solid, highly sought-after content, they can still rank well even if they are not mobile-friendly. However, embracing a mobile-friendly design is still highly encouraged.

Not sure whether Google views your webpages as mobile-friendly? Insert your website’s URL into the Google Developers Mobile-Friendly Test to see how your pages look. If a page is not mobile-friendly, the test will list the reasons it is not compatible with mobile devices. The test will also show you how Googlebot views your website and provide steps you can take toward making it mobile-friendly.

If your website is mobile-friendly, it is important to occasionally re-run the test to ensure your website is still rendering properly. Be sure to check out the AdSense Multi-screen Implementation Guide to learn how to grow your audience. You can also help your website show up in search results by reviewing the steps found in the Mobile Search Engine Optimization Guide.

Search Quality Rating Guidelines Get Updated

On March 28, Google updated its Search Quality Rating Guidelines, reducing the once 160-page PDF document to 146 pages. The original 160-page version, released in November 2015, was the first time Google had made the full document public; previously, only short snippets had leaked, and the full guidelines were available only to Search Quality Raters. The driving force behind the guidelines was to help Google’s Search Quality Raters better understand how to rate the search results they were testing.

There were changes of all sizes included in the recent Search Quality Rating Guidelines update. Some of the most talked-about changes include the following:

Supplementary Content

One of the biggest adjustments to the guidelines was the reduced emphasis on supplementary content. This change caught many by surprise, especially since older versions of the guidelines spotlighted its importance. It has been suggested that this may be because supplementary content is not used as often on mobile devices. However, that doesn’t mean you should remove it from your webpage completely: offering supplementary content makes your webpage more user-friendly.

Jennifer Slegg (Founder & Editor at The SEM Post) said it is important to create your content for users, not raters.

“The last thing you would want to do is remove your supplementary content,” Slegg stated in her post which focused on the recent guideline update. “After all, you really should be creating your site’s content for the users, not for Google and/or their raters.  But webmasters shouldn’t feel the pressure of ‘have to add tons of supplementary content’ for the sake of the raters.”

Visit-In-Person Searches
In the updated guidelines, Google removed all references to “local” searches and started calling them “Visit-In-Person” searches instead. This adjustment was made in the hope that it would make it easier, especially for raters in other countries, to understand the difference between a generic local search and a search that shows intent to visit a physical location.

Overall Maintenance

Google removed the entire section regarding maintenance from the updated Search Quality Rating guidelines.

In the previous version of the guidelines, Google stated “Webmasters are responsible for updating and maintaining sites they create. Most websites add or change content over time. Web browsers, such as Chrome, update with new versions. Webmasters need to make sure their websites function well for users as web browsers change.”

Google also encouraged raters in the previous guide to “poke around” to see if links worked, if images loaded properly, and to ensure that content was added and updated over time. By doing so, raters were able to decide if a webpage was being properly maintained.

The removal of the maintenance section from the updated guidelines suggests that Google may simply be shifting how much importance it places on whether certain types of webpages are kept up to date.

Expertise, Authoritativeness & Trustworthiness (E-A-T)

The updated guidelines gave a deeper explanation as to what Expertise, Authoritativeness and Trustworthiness (E-A-T) entails.

“The amount of E-A-T that a webpage/website has is very important. MC quality and amount, website information, and website reputation all inform the E-A-T of a website,” the new guidelines stated. (In the guidelines, MC refers to a page’s main content.)

They also added that regular updating of these types of pages is imperative. This may suggest that Google feels maintenance is only important for particular pages that rank at a certain level.

To review an extensive list of the recent adjustments, be sure to check out the rest of Slegg’s article on The SEM Post for more information. To see the changes for yourself or to brush up on the general guidelines, you can download the updated version of the Search Quality Rating Guidelines in PDF form.

Google to Update Mobile-Friendly Algorithm

On March 16, Google announced on their Webmaster Central Blog that they would be boosting the mobile-friendly experience for users in May 2016. This boost will be updating the effects of the original algorithm that was first launched in April 2015 that used mobile-friendliness as a ranking signal on mobile searches. Google said in their announcement that they strongly believe users should be able to get the answers to their searches, no matter what type of device they are using.

“This update is a continuation of what took place last year. It reinforces the reason why you need to invest in a mobile friendly website and optimize it for mobile search” — Brad Pitzl, SEO Manager at Intertwine Interactive

Google assured readers in their announcement that if you have already made your website mobile-friendly, you will not be impacted by the update. They encouraged those who have not taken that route yet or who may have additional questions about the process to take the Mobile-Friendly Test and to review the Webmaster Mobile Guide.

However, Google stated that the “intent of the search query” is still an important factor and sends a strong signal. This is especially relevant for pages that have not yet been made mobile-friendly.

“Even if a page with high quality content is not mobile-friendly, it could still rank well if it has great, relevant content,” stated Google.

Google Update on Search Effects of JavaScript Sites, PWAs

John Mueller, Google Webmaster Trends Analyst, recently published an update on Google+ that touched on how Google currently handles JavaScript (JS) sites and Progressive Web Apps (PWAs) in Google Search, including ranking and crawling.

Mueller’s update began by stressing the importance of not cloaking to Googlebot. He encouraged the use of “feature detection” and “progressive enhancement” techniques that make your online content available to everyone, and reminded readers that rel=canonical is required when serving content from a number of URLs. He also stressed the importance of avoiding redirects to unsupported-browser pages, suggesting a reliable fallback option such as a polyfill, a piece of code (or plugin) that provides functionality not already built into a web browser.
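As a quick illustration of the rel=canonical point (the URLs here are made up), each duplicate URL can declare a single preferred version in its HTML head:

```html
<!-- Served on example.com/article?page=1, example.com/article?ref=feed, etc. -->
<link rel="canonical" href="https://example.com/article" />
```

With this tag in place, Google can consolidate the duplicate URLs' signals onto the one canonical address.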

How do AJAX-Crawling scheme and AMP affect my site?

Mueller went on to say that it is important to avoid the AJAX-Crawling scheme on brand-new sites and to migrate old sites that currently use it. When you migrate, remember to remove the “meta fragment” tags. He says you don’t want to keep a meta fragment tag if the “escaped fragment” URL doesn’t serve fully rendered content.

When using AMP (Accelerated Mobile Pages) for your website, the AMP HTML must be “static as required by the spec.” However, the associated web page can be constructed using the JS/PWA techniques. Mueller reminded readers to use a sitemap file with the correct “lastmod” dates in order to signal that changes have been made on your website.
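A sitemap entry along these lines (URL and date illustrative) is what that “lastmod” signal looks like in practice:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/article</loc>
    <!-- Update lastmod whenever the page's content actually changes -->
    <lastmod>2016-05-12</lastmod>
  </url>
</urlset>
```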

Is Google still indexing my URLs?

According to Mueller, it is important to avoid using “#” in URLs because Googlebot rarely indexes them. He suggests sticking with traditional URLs made up of path/filename/query-parameters and using the History API for navigation. Search Console’s Fetch and Render tool is another great way to test how Googlebot sees your pages, though Mueller reminded readers that the tool doesn’t support “#!” or “#” URL addresses.
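A minimal sketch of the kind of migration this implies — turning legacy “#!” fragment URLs into indexable path-based URLs — might look like this (the URL scheme is hypothetical):

```python
from urllib.parse import urlsplit

def migrate_hashbang(url: str) -> str:
    """Rewrite a legacy '#!' URL into an equivalent path-based URL
    that Googlebot can index. Other URLs pass through unchanged."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url
    # '#!/products?id=7'  ->  '/products?id=7'
    return f"{parts.scheme}://{parts.netloc}{parts.fragment[1:]}"

print(migrate_hashbang("https://example.com/#!/products?id=7"))
# → https://example.com/products?id=7
```

On the client side, the same path-based URLs would then be driven by the History API (history.pushState) rather than fragment changes.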

He also touched on the importance of ensuring that all required resources, including JavaScript files, frameworks, server responses and third-party APIs, are not blocked by robots.txt. The Fetch and Render tool will show you a list of blocked resources. If resources are blocked by robots.txt files outside your control (or are temporarily unavailable), be sure that your client-side code fails gracefully.
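A robots.txt along these lines (the paths are hypothetical) keeps rendering resources crawlable while still restricting private areas:

```
User-agent: *
# Keep the resources needed to render pages crawlable
Allow: /assets/js/
Allow: /assets/css/
# Restrict only areas that genuinely shouldn't be crawled
Disallow: /admin/
```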

Mueller told readers to limit the number of embedded resources, especially the number of JavaScript files and server responses required to render your page, since a high number of required URLs can cause timeouts or pages being rendered before all embedded resources are available. Developers are encouraged to use reasonable HTTP caching directives.
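For instance, a server might send response headers like these (the values are illustrative, not a recommendation from Mueller):

```
# Versioned JS/CSS bundles: safe to cache for a long time
Cache-Control: public, max-age=31536000
# HTML documents: revalidate so updates show up quickly
Cache-Control: no-cache
```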

Will this change in the future?

Readers should keep in mind that other search engines and web services accessing your content may not support JavaScript at all, or may support only a subset of it.

“Looking at this list, none of these recommendations are completely new & limited to today, and they’ll continue to be valid for foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites,” said Mueller.