Part site building, part content creation, this post should be of interest to both developers and marketers, who are often asked to provide technical input for SEO requests and ideation. It discusses what we can do as site builders and content creators to make our content relevant, and thereby earn better search engine placement.
The Google algorithm is the set of formulae that Google uses to determine where any particular web page will appear in the search engine results pages (the ‘SERPs’) in response to a specific user query. The precise list of factors that go into assessing where a page should appear is, of course, a closely-guarded secret known only to a select few at Mountain View, and the weighting of any one factor changes periodically. There may be as many as 150 different factors informing the algorithm, but some are weighted more heavily than others, and for the sake of brevity I will detail the ones that offer the best return on effort.
Let's get started.
When you’re looking to rank a site for a particular search query, you should have a list of keywords to hand: the keywords you want the site to rank for. Be aware of the difference between long-tail and short-tail keywords. Digital strategy to hand and good to go? You should have a methodically detailed record of the site’s keywords and phrases in a keyword list spreadsheet (or a ‘keyword universe’, as some are apt to call it). It should match each keyword to a user intent and, where known, a landing page. Cross-reference this with your site map: do the aspirations of the keyword list match the existing site structure, layout and architecture? I’ll do a separate post on this at some point, but for now, get to know how crawlability can be restricted, encouraged and sculpted to form rivers (and funnels) of SEO goodness across your site. Think primary information flows, and especially internal linking.
The number one rule when trying to rank any page for a particular search query is relevancy. Google lives and dies by relevancy. That is all that matters. Keep this in mind as you get creative, tackle issues and provide solutions on this journey.
And additional considerations? Any internal page will, at most, rank well on page one for two or three short keywords, though there are exceptions. Don’t take a scattergun approach. Each of those keywords will trigger another three or four key phrases when stemmed (‘a good day out in Vancouver’ will also rank for ‘great places to visit for a day in Vancouver’, so NLP intent is a factor). If you need more keywords in a Google top-ten position, create more content for those words rather than risk upsetting a major landing page. Be aware that 90% of users do not progress beyond Google’s page one when searching.
And things take time in SEO. You should start to see good results in 14 to 28 days. Use good software such as SEMrush or Ahrefs to track positioning for you. Finding a good rank for a keyword can be a cryptic process. Deep analysis, and some out-of-the-box thinking when reviewing incoming stats, will help you understand how Google is responding to your site, and may suggest new and innovative ways to express relevancy to the search engines.
Sidenote: why SEO? I often explain SEO to potential clients thus: think of your SEO investment as your internet ground rent. Little or no SEO spend gets you a side-street position, with little or no footfall. Decent SEO expenditure gets you a high-street position, with correspondingly greater footfall and a busy site, with all the intelligence and benefits that can bring. It really does affect the bottom line. To those clients who have PPC spend and are putting serious money into AdWords campaigns, I say: we can manage your AdWords campaigns, but we would like to start taking some of that budget and playing the longer game of organic placement. If you’re overly reliant on AdWords, something isn’t working. We can fix that.
So the tl;dr then, in no particular order …
If you are using the Metatag module, you have a second bite at the cherry to provide relevancy to the search engines. The page title is the title that appears in the browser tab, and it can be different from the H1 tag you provide within the body of the content. Don’t keyword-stuff, but do use the keywords that you want this page to rank for in the SERPs.
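As a minimal sketch (the page, title and description here are purely illustrative), the page title and meta description live in the document head and can differ from the on-page H1:

```html
<head>
  <!-- Shown in the browser tab and usually as the SERP headline;
       target keywords near the front, no stuffing -->
  <title>3-Bed Family Homes in Langley | Example Homes</title>

  <!-- Not a direct ranking factor, but it drives clickthrough from the SERP -->
  <meta name="description"
        content="Browse 3-bed family homes for sale in Langley, updated weekly.">
</head>
```

The H1 inside the body can then take a complementary, human-facing phrasing while the title tag carries the keyword focus.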
Google will reward those sites that have implemented a mobile-first responsive design. Unless you’re a news site, I suggest AMP is not relevant.
H1 Tag: Semantically-Correct Usage of H2 -> H5 Tags
The H1 tag is probably the single most important element for page ranking, as long as everything else is in place. If you think about it, this defines relevance: it’s what the page is about. There should be only one per page. The other heading tags can be used to properly define content and page sections, and you can use them as a driver for sculpted flow.
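A sketch of that semantic structure (headings are illustrative): one H1 stating what the page is about, with H2 and deeper tags defining the sections beneath it.

```html
<body>
  <h1>A Good Day Out in Vancouver</h1>      <!-- one H1: the page's subject -->

  <h2>Morning: Stanley Park</h2>            <!-- H2s mark the major sections -->
  <h3>Getting There</h3>                    <!-- deeper tags add granularity -->

  <h2>Afternoon: Granville Island</h2>
</body>
```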
Keyword Content, Positioning, Frequency
I’d look to have my chosen keywords appear very near the front of the first paragraph, variations (or stems) somewhere in the middle, and definitely in the closing paragraph.
Integrate SEO hooks into the content creation process, and apply SEO consideration to system workflows and to architecture.
Content Creators: Fresh, Original Content of 400+ Words
Some clients have no content origination of their own, and as an interesting aside, I might end up writing the content myself. Mid-way through an SEO campaign you find yourself becoming a bit of a subject matter expert: you understand your client’s business well, you’ve identified competitors, you know both the market and SEO landscapes, and you’ve thence identified opportunity. Sourcing supportive content should not be an issue. Google doesn’t rank static sites particularly well, and will reward sites that provide fresh and original content on a regular basis.
Quite simply, Google rewards high-quality sites with ‘original content, thoughtful analysis, research reports etc.’.
The most important directive is that you regularly review your stats, then apply interpretation and propose solutions in response to your analysis and the insight it yields.
Case study: across December 2018 and January 2019, we doubled the number of static content pages available on one particular site. These pages provided information on product and process. This allowed us to target more keywords (remember, each page targets three to four keywords at most, and then stems beyond that). Google Analytics shows that people are now reading roughly twice as much content as before: average pages per session rose from 2.2 to 4.3 as of February 2019.
As the site becomes ‘stickier’ and engages more users, 189 key phrases have appeared in the top 10 SERPs that were not there before - courtesy of the new articles. We have engaged the user better and now have more opportunity to provide a memorable brand or product experience, leading to further engagement and goal completions.
Interpreting this particular stat, and knowing that the growth went to product pages and not just to generic blog news, we can see that there is unfulfilled user demand for more information about the product/service our client supplies. From that, we can identify new opportunities and content requests that will deliver significant and targeted audience growth.
Keywords appearing in the URL are still important, and I see this every week. Sites should support a friendly URL structure, so no ?query strings. Site navigation should follow a logical and easily-navigated structure, i.e. homes/ then homes/langley/ then homes/langley/3-bed-family etc., and be self-discoverable in a fairly intuitive way.
And a health check: get oversight of the 301 redirects on your site. Get access to Google Search Console and look for errors, fetch as Google, and check sitemap.xml and robots.txt. What’s .htaccess doing?
Site Architecture, Structure, Authority
Site structure can fall anywhere between flat, or linear, with little depth (e.g. a news site), and a hierarchical multi-depth site (a detailed product/eCommerce site with location segmentation). Which suits best depends on the amount of content across subject matter, while retaining relevance. For more linear sites, inner index pages (or ‘rivers of news’) can provide keyword-rich landing pages and create a more three-dimensional array of connecting nodes, centred around one particular subject or location, cascading clickthroughs with granularity.
Site authority reflects how well Google regards your site within its subject-vertical ecosystem. Authority recognition is aided by the acquisition of backlinks from multiple sources, including other authority sites. It’s an inner circle, yet almost fractal in nature, varying between concept, entity and field or industry. It is one factor in the Google algorithm, and for most sites, recognition is a long-term game.
Internal Links, Tree Root Foundation
PageRank (another factor in the Google algorithm) can be sculpted by strengthening an internal page: link to it internally from other pages, especially with a keyword in the link’s title attribute. I call this the Tree Root Foundation, because strong internal linking builds a good mesh of keyword pages with strong PageRank, which is then passed on to child pages for other, associated keywords. When building a site from scratch, consider how, as the developer, you might do the heavy lifting while making it easy for an editor to provide strong and relevant inner-site linking on new and existing content.
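A sketch of such an internal link (the path and keyword are illustrative), with the target keyword carried in both the anchor text and the title attribute:

```html
<!-- Internal link strengthening a child landing page for its keyword;
     keyword appears in the anchor text and the title attribute -->
<a href="/homes/langley/3-bed-family"
   title="3-bed family homes in Langley">
  3-bed family homes in Langley
</a>
```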
In recent years Google has turned to Social Signals - backlinks from Facebook, Twitter et al - as an indicator of how relevant your content is to a particular keyword. As previously stated, backlinks from authority sites are also weighted. Who links to your content, for what keywords, where from, by what title attribute, and how many - are all good indicators to Google of relevance for a particular search keyword or phrase. Authority backlinks cannot be garnered overnight - but keep on providing good, strong and unique content and promote it well across social media, and the links will come. There is some psychology in play here - what makes people link to a particular page? How useful are your posts?
Charting the Knowledge Graph - Structured Data and the Rise of Zero-Click Search
2018 saw the significant emergence of ‘position zero’ content: the content that appears at the top of the page as quick answers. Nearly 35% of desktop searches on Google now result in no clickthrough, as Google scrapes the answers to frequently asked queries and displays them directly. Mobile is even higher, at ~62%. Search engine, or information engine? It’s a valuable position if you can get it. But how can you get listed for that page positioning?
For every 100 searches on Google mobile in September 2018, there were:
- 38.5 clicks on an organic result
- 3.4 clicks on a paid result
- 61.5 no-click searches
For every 100 searches on Google desktop in September 2018, there were:
- 65.6 clicks on an organic result
- 3.7 clicks on a paid result
- 34.3 no-click searches
This position zero content is provided from Google’s Knowledge Graph, an AI-powered, contextually-aware (search for Mary Shelley and it knows you might also be interested in Bram Stoker), Google-owned sub-search engine that feeds into Google’s main search engine. Getting into the Knowledge Graph is obviously attractive; add in the fact that the Knowledge Graph powers voice search, and that within the next couple of years 40% of search will be initiated by voice (‘Hello Google!’), and you get the picture.
The Knowledge Graph essentially links entities with nodes (or nuggets) of information about them. Mining analogies are never far away in search engine land.
It is updated constantly. So I can get relevant <information> about a <place or thing>, where the <information> is the node and the <place or thing> is the entity, the entity being the more immutable of the two. What is of particular interest is that we are seeing more and more Knowledge Graph entries returned as regular search results, and not just as part of the usual ‘position zero’ heraldry.
You can help yourself get shown by the Knowledge Graph by implementing structured data in each of your web pages. Structured data is JSON-LD: descriptive metadata added to your HTML using the schema.org standard, informing the search engines of the content and context of your web page. Schema.org is backed by the major search engines, including Google, Bing, Yahoo and Yandex. For bonus points, parts of the DOM can be marked as speakable, indicating to the search engines which parts of your page can provide a well-structured (and good-experience) summary as a spoken response.
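A minimal sketch of such a block (the article, author, date and CSS selectors are all illustrative): a JSON-LD script using the schema.org vocabulary, with a SpeakableSpecification pointing at the parts of the page suitable for a spoken summary.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A Good Day Out in Vancouver",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2019-02-01",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".article-headline", ".article-summary"]
  }
}
</script>
```

The speakable property identifies sections by CSS selector (or XPath), so the selectors must match elements that actually exist in your markup.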
There is a Drupal module that can aid in this implementation.
Structured data can help shape how your search result appears within Google, which obviously helps you sculpt clickthrough rates. There are early indications that structured data can help organic placement as well as position zero placement, although as of February 2019 Google says the opposite: that it does not affect organic placement. Either way, the chance to grab a position zero spot, the possibility of a voice search result, and the ability to shape your listing (for example, having a small search box appear in your result so users can search your site, or getting a right-hand-side detail placement for your company) are strong signals that implementing structured data on your site would provide a very good return on investment at this still-early-adoption stage, at the beginning of 2019.