Google publish guidelines for webmasters – people who build websites – to help them understand what they can and can’t do to increase the chances of their content ranking.
Here follows a digest of the main points to save you having to read the whole thing.
The General Guidelines are important because Google tell us explicitly that following them will help Google find, index and rank your content.
The Quality Guidelines are designed to identify webmasters who use deceptive or manipulative practices to improve their rankings, which can lead to Google applying penalties to demote their content.
We know that the intricacies and subtleties of developing websites and content to be found on the search engines can be a little dull to anyone who works outside of this industry. But if you’re considering taking us on for a project, we’d urge you to at least skim through this article – and the accompanying Search Quality Rater Guidelines – so you understand a little about what we’re doing and why we’re doing it.
To reward you for your time, we’ve broken up the text with images of animals which are in some way related to the text they accompany. It may not always be immediately apparent how they’re related, but there you are.
Ok, here goes.
Help Google FIND your pages
Eagle image by cocoparisienne
For Googlebot to find a page – Googlebot is the name Google give to their crawler, which crawls the web looking for content – it must be linked to from another findable page. This basically means that if you don’t add a link to it from any navigation, or another page, then Google won’t know about it.
You can send Google a sitemap that provides links to all the important pages on your site. When you add new content to the site you can either resubmit the whole sitemap, or just the individual pages.
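A sitemap is just an XML file listing the URLs you want Google to know about. A minimal one might look something like this – the addresses here are placeholders, so substitute your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2018-05-15</lastmod>
  </url>
</urlset>
```

The optional lastmod date tells Google when a page last changed, which can encourage a re-crawl of updated content.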
Google use the phrase crawl budget, which is the number of pages Googlebot will crawl on any given day. If your site has lots of content and a low crawl budget, then the pages are not going to be crawled as regularly. For most webmasters with small websites, think less than 1,000 pages, crawl budget is not something to be concerned with. If you do have a large site, you need to make sure your robots.txt file instructs Googlebot to only crawl the relevant pages on your site, or you could be wasting budget.
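As a sketch, a robots.txt for a larger site might look something like this – the disallowed paths are invented examples, so swap in the sections of your own site that don’t need crawling:

```
# Keep Googlebot away from low-value pages that waste crawl budget
User-agent: Googlebot
Disallow: /search/
Disallow: /tag/

# All other crawlers
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing – more on that distinction below.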
Whilst it’s important to add links to a page, Google state that the number of links should be kept reasonable – i.e. don’t add too many, and make sure they’re relevant.
As well as submitting your site or content to Google, they suggest that other sites that should know about your pages are made aware of them. This is also known as outreach and is a common link building practice. Google recognise links from other reputable sites to yours as potentially an indication of quality content.
Help Google UNDERSTAND your pages
Dolphin image by Claudia14
Let’s suppose Google have found your pages. Next they need to understand them so they can add them to their index accurately. Until they are indexed, they won’t appear on the Search Engine Results Pages (SERPs).
Conversely if you have pages on the index that are contributing to crawl bloat (aka index or indexation bloat – a lot of pages for Google to crawl), it might be sensible to remove them from the index. Adding a rule to robots.txt will not do this, but there are other means.
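One such means is the robots noindex meta tag, added to the head of each page you want removed. Googlebot has to be able to crawl the page to see the tag, which is exactly why a robots.txt block alone won’t de-index it:

```html
<head>
  <!-- Tells search engines not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```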
You should create useful, information rich content. This tallies with what they tell their raters to look out for in their Search Quality Rater Guidelines.
You should focus on the words users would type into a search engine – the query (aka the keyword) – and ensure these words are included on the page.
You should make good use of the title tag and alt attributes to describe your content. The former is one of the more important fields to help optimise your page, and it’s generally recommended you include the most important, relevant keyword in here, preferably towards the start of the tag.
The alt attribute describes any images on the page for screen readers, so that your content is accessible to the visually impaired. Images should be assigned image titles too, and uploaded with descriptive file names (describing what’s happening in the image), which might help your images to rank on image search.
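Putting those last two points together, the markup for a product page might look something like this – the business name, keyword and file name are invented for illustration:

```html
<head>
  <!-- Most important keyword towards the start of the title tag -->
  <title>Handmade Oak Dining Tables | Example Furniture Co</title>
</head>

<!-- Descriptive file name, alt text for screen readers, plus an image title -->
<img src="six-seater-oak-dining-table.jpg"
     alt="Six-seater oak dining table set for dinner in a bright kitchen"
     title="Six-seater oak dining table">
```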
Your website should have a clear conceptual page hierarchy, which basically means that your site architecture should be well structured to help users find their way easily to the content they’re looking for. The most important pages should be closer to the homepage – in other words, the visitor shouldn’t have to click too many times to get to them.
You should use structured data markup on your pages to help Googlebot understand the content on your pages. By adding markup you stand the chance of having additional elements in your result blocks (aka snippets), rating stars for example for products, should they be returned on the SERPs. These additional elements can help your result blocks stand out on the SERPs and therefore attract clicks.
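For example, a product with reviews might carry a block of JSON-LD markup along these lines – the product and figures here are invented, and Google publish documentation covering the full set of supported types:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Six-Seater Oak Dining Table",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>
```

It’s this kind of markup that makes rating stars eligible to appear alongside your result on the SERPs.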
Help VISITORS USE your pages
Hamster image by PublicDomainPictures
We know from Google’s Search Quality Rater Guidelines that user experience is a pretty crucial ranking factor. These webmaster guidelines go on to tell us more about how to satisfy users.
You should use text rather than images to display important names, content, or links.
Try to avoid having broken links on your site – that is links from pages on your site to other pages that no longer exist for any reason. You can keep track of these by crawling your own website regularly.
Your pages should load quickly. Users don’t want to wait long for a page to load. If they click back to the SERP before the page has opened this is deemed a pretty negative user signal and as such may adversely impact the ranking of said page. The advice is to test your page load times and optimise as appropriate.
Your pages should be easy to use on all devices – desktops, tablets and smartphones. There is a test you can use to discover whether or not your site is considered mobile-friendly by Google.
Our advice, should you be considering building a new website, or rebuilding an existing one, is to consider how the site looks on mobile devices first. It’s far easier to prioritise your design for mobile, then desktop, than the other way round. You should also check how your website looks on a variety of different web browsers – think Google Chrome, Microsoft Edge and Apple Safari for starters.
Make your site secure with HTTPS. Users need to know they are on a secure site, particularly if they are entering personal data and/or payment details on a page. From July 2018, Google’s Chrome browser will flag all HTTP sites as “not secure”.
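Once you have an SSL certificate in place, you’ll want to redirect all HTTP traffic to its HTTPS equivalent. On an Apache server, for instance, a .htaccess rule along these lines is a common approach – do check it against your own hosting setup:

```
# Permanently (301) redirect any HTTP request to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The 301 status tells search engines the move is permanent, so ranking signals pass to the HTTPS version.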
Ensure your site is accessible, for example to visitors who may be using a screen reader. There are many accessibility tests that you can carry out on your site.
Dog image by moreharmony
Webmasters are advised to play by the rules for their content to be in with a chance of ranking well and to avoid a manual action (aka penalty) which could lead to a removal of a site from the search results. So it’s about upholding the principles rather than looking for loopholes to exploit.
And these are the principles:
- Write content for users, not search engines.
- Don’t deceive users.
- Avoid using tricks to improve your rankings. If what you’re doing doesn’t help users, it’s probably not a good idea.
- And lastly, we’re told to think about what makes our website unique, valuable, or engaging. In other words, what makes your website and content stand out from the competition.
And the tactics we’re advised against using are listed below:
Duplicate or copied content
The message here is to write your own content, not to copy it from somewhere else on the web. They cite examples including automatically generated content (often stuffed full of keywords or gibberish), doorway pages (created specifically to rank for certain queries, with little or no unique content) and scraped content (directly copied from other sites).
Participating in link schemes
Links to your content from other authoritative, relevant websites are often seen as a positive endorsement by Google and can improve rankings. There are schemes available to webmasters whereby they can buy links, and these are to be avoided at all costs.
Serving different content to Googlebot than to users (known as cloaking), in an attempt to manipulate results
Sneaky redirects
Redirects generally are ok if they help users find content that serves the intent of their search. Deceptively redirecting a user to content that doesn’t deliver on their intent is not acceptable to Google.
Hidden text or links
Squirrel image by Jim_Combs
You can’t add text or links on your pages and hide them from the user (using white text on a white background for example) in an attempt to improve your rankings. It’s just not cricket, as some might say.
Participating in affiliate programs without adding sufficient value
Affiliate sites – i.e. sites that are similar in nature to another because they promote the same products or services – are ok, as long as they don’t completely duplicate the content of the other. They must add their own value, or differentiator, where possible.
Loading pages with irrelevant keywords
This refers to keyword stuffing – adding as many keywords to a page as possible, often to the detriment of the user experience. I’d refer you back to one of the quality principles – write for users, not bots – or you could come unstuck.
Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
If the user doesn’t see on the page what they’re expecting to see, this is bad news for said piece of content and potentially the site as a whole.
Abusing rich snippets markup
Google have published guidelines on how to use structured data to help inform Google as to what’s on a page. They’ll not tolerate attempts to manipulate these.
Finally we’re told to monitor our site for hacked content or user-generated spam.