2011 Google Quality Rater Guidelines PDF
- Sites that don't look trustworthy are treated as such: things like sleazy advertisements, comment spam, broken formatting, and other sketchy-looking elements guarantee a low quality rating.
- A bad reputation is a recipe for a low quality rating: great content on a website with a terrible reputation may still receive the lowest quality score. A bad rep drags a whole site down.
- High quality content can become low quality content: webpages that aren't updated with new information will lose quality over time. This is especially true for financial and medical topics.
- Low effort articles are low quality articles: the quality of a webpage is proportional to the effort it takes to create. High quality content must add some kind of value.
- High quality articles are authored or supported by an expert: things like financial and medical advice should be written by, or include quotes from, an accredited expert.
- Every page should have a clear purpose: a high quality webpage should help the user know something, do something, or go somewhere.

Since Google first released the file, they have quietly published updated versions, but they have never before pulled the file without explanation.

They have moved the URL before, but since the blog post was not updated to point to a new location, a simple move may not be the reason this time. There were a ton of changes in the newest version of the quality rater guidelines, but nothing specific that would lead me to believe they pulled it for a particular reason. Hopefully we will either get an explanation for its removal, or see an updated version released that we can compare against to find out what Google changed.

In this chapter, you will learn about this document, the quality raters who use it, and whether it can impact your Google search rankings. As the name suggests, these are the guidelines that the raters use as they perform their function. They outline the conditions and elements raters need to consider, and how a site should be rated against them.

And wonderfully, Google places each updated version at the same URL, so you can bookmark it and always find the most current version easily. For those unaware, Google has hired many thousands of individuals from around the world to rate websites and record whether a site is good or bad across a variety of areas. Now, it is important to understand that these people have no direct impact on the rankings of the sites they rate. Instead, their rating data is made available to machine learning systems, which use it to augment the algorithms based on known signal data.

For example, if a site or group of similar sites is consistently rated High or better, the system could review all the signal data from the site(s) to look for commonality. And by signal data, I am referring to everything from structure, size of the page (and of the domain and related sections, for that matter), backlinks and backlink profile, author signals, and navigation, to likely a whole lot more.

It is also possible that Google may skip the review phase and simply push the new signal weights into the algorithms for testing, but I suspect they use their raters more often than not. And arguably, deciding how all the results on a page are positioned carries significantly more influence than looking at a single site and deciding whether it should move up or down the results.

Now that we understand what the Search Quality Raters Guidelines are, the next question we need to explore is whether they can impact your rankings. This is an area that gets too little attention, and in their October 14 update it appears the folks at Google agree, as they grew that section.

Needs Met is a fairly straightforward concept: it basically means intent. During this testing, a rater may visit a single page, or visit a search results page and rate every result. Either way, the information sent back to Google covers the site itself along with differences in results across device, demographic, and location. These ratings are then used to drive changes that improve the results, by algorithmically determining which signals or signal combinations are common to the higher-ranking results.

I suspect that in the case of Needs Met, the signals will predominantly focus on onsite factors, including but definitely not limited to content, links on the page (expanded on in the recent version), structure, and user experience.
