As part of your retained SEO support we conduct development and DevOps tasks to ensure your website is optimised for search. Maintaining your technical SEO and keeping it in line with best practice is key to ensuring organic growth for any website. We will triage any technical SEO issues and address the most pressing needs first. Many technical SEO issues are systemic in nature, so where possible we will create a permanent fix by applying SEO best practices at a systems level.
Google and other search engines have operating costs associated with crawling, indexing and algorithmically evaluating your website, so each time Googlebot crawls your site there is an efficiency to the process – slower websites take longer to crawl than fast ones. A website that links to its XML sitemap from its robots.txt file is easier to crawl than one whose pages must be discovered by following the entire hyperlink structure. The XML sitemap can also indicate how often content changes or is updated, meaning the search engine crawler can schedule fewer visits while keeping its index up to date. These factors and others fall under what Google terms “crawl budget”. Our job is to make your website cheap to crawl – if crawling your site is too expensive, Google may not crawl certain pages at all or may take a long time to index them. We will analyse your site and ensure it can be crawled efficiently.
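To illustrate the mechanism described above, here is a minimal sketch of generating an XML sitemap and referencing it from robots.txt. The URLs and change frequencies are hypothetical placeholders; a real sitemap would be generated from the CMS.

```python
from xml.etree import ElementTree as ET

# Hypothetical pages, paired with how often each is expected to change.
pages = [
    ("https://www.example.com/", "daily"),
    ("https://www.example.com/products", "weekly"),
    ("https://www.example.com/about", "monthly"),
]

# Build the <urlset> document defined by the sitemaps.org protocol.
ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=ns)
for loc, changefreq in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq

sitemap_xml = ET.tostring(urlset, encoding="unicode")

# robots.txt then advertises the sitemap so crawlers can find it
# directly instead of discovering every URL by following links:
robots_txt = (
    "User-agent: *\n"
    "Allow: /\n"
    "\n"
    "Sitemap: https://www.example.com/sitemap.xml\n"
)
```

The `changefreq` hints are exactly what lets a crawler schedule fewer visits while keeping its index fresh.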
Dead links, errors and server response issues can all produce crawl errors when a search engine spider is trying to access and crawl the website. We will monitor these errors and iron out any issues the search engine spiders encounter.
Google algorithmically favours a mobile-first philosophy and actively promotes AMP and PWA technologies to help improve mobile UX and speed. This means, at the very least, you must have a responsive or mobile version of your website. Google will also analyse speed and page rendering / layout to determine whether the UI and UX suit mobile users. We will ensure your website’s codebase, templates and architecture are optimised for mobile users and meet Google’s criteria.
Google has algorithmic rules regarding content duplication. The most common form of duplication is usually unintended and comes from the same page resolving at slightly different, unintended URL variations. Common examples of duplicate resolving URLs are:

- http:// versus https://
- www versus non-www hostnames
- trailing slash versus no trailing slash
- uppercase versus lowercase characters in the path
- tracking or session parameters appended to the query string
To a search engine these page variations are technically different pages. We will ensure that these technical duplicates don’t occur by systemically addressing them at the server URL rewrite rule level and template level.
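A minimal sketch of how such variants can be collapsed to a single canonical URL. The conventions chosen here (https, lowercase host, no trailing slash, query string dropped) are illustrative assumptions – the real rules would live in the server’s URL rewrite configuration.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url: str) -> str:
    """Collapse common technical duplicates to one canonical URL.

    The conventions here (force https, lowercase the host, strip the
    trailing slash, drop the query string) are illustrative assumptions.
    """
    parts = urlsplit(url)
    host = parts.netloc.lower()
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, "", ""))

# To a search engine these are three different pages; after
# canonicalisation they all resolve to the same URL:
variants = [
    "http://WWW.example.com/shoes/",
    "https://www.example.com/shoes?utm_source=newsletter",
    "https://www.example.com/shoes",
]
```

In practice the same policy is enforced with 301 redirects and `rel="canonical"` tags rather than in application code.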
As well as server and template based technical duplication, we will monitor content introduced via the CMS to ensure it does not produce duplication penalties. For example, if you have a lot of product pages containing the same “delivery” blurb, we will assess how much unique content is required to offset a potential duplicate content trigger. We will review pages and content to ensure that the organic day-to-day management of the site does not introduce these issues.
Page speed and overall site performance have a huge impact on UX – roughly 40% of users abandon a page that takes more than 3 seconds to load. Google understands this and has factored it into how they gauge and assess relevance in their results. Making sure your site performs in the top percentile of similar sites is key to maintaining a solid search presence. We’ll work with you to ensure your website is optimised for speed and can deliver the content users are looking for as fast as possible. Google’s targets are a time-to-first-byte (TTFB) of 200 milliseconds and a full page load within 3 seconds.
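The two targets above can be checked directly. The sketch below measures TTFB with the standard library and compares measurements against Google’s suggested thresholds; the URL and threshold names are our own illustrative choices.

```python
import time
import urllib.request

TTFB_TARGET_S = 0.200  # Google's suggested time-to-first-byte
LOAD_TARGET_S = 3.0    # full page load target

def measure_ttfb(url: str) -> float:
    """Return the seconds elapsed until the first response byte arrives."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # block until the first byte is received
    return time.perf_counter() - start

def within_targets(ttfb_s: float, load_s: float) -> bool:
    """True when both measurements meet Google's targets."""
    return ttfb_s <= TTFB_TARGET_S and load_s <= LOAD_TARGET_S
```

For example, `within_targets(measure_ttfb("https://www.example.com/"), 2.5)` would flag whether the home page is on budget. Note TTFB measured this way includes DNS lookup and connection setup, which is also what a crawler experiences.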
What’s more, your website’s information architecture and nomenclature have a huge impact on SEO performance and relevance. We will ensure these conform to your commercial goals and support your keyword strategy. We will therefore liaise with your team and recommend the best naming format and taxonomy for new categories or content.
Your page structure and internal linking determine where link authority is pooled and how accessible pages are to spiders crawling your website. Internal link anchor text also defines the topics and subjects of pages just as much as their headings and page titles – so the nomenclature of your information architecture needs to be optimised towards your keyword strategy too. We’ll work with you to ensure pages are structured and link authority is sculpted across the site for maximum impact.
Google uses schema.org semantic structured data to expose “rich snippets” on the search engine results page. These cover a wide range of data points, from business location, telephone number and opening hours through to product prices, reviews and ratings, or article authorship. We will use Google’s preferred format (JSON-LD) to embed this data in your website’s code and ensure all content and data fully leverage this technology.
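As a sketch of what JSON-LD markup looks like, the snippet below builds a schema.org LocalBusiness record and wraps it in the script tag that carries it in the page. All business details are hypothetical placeholders.

```python
import json

# Hypothetical local-business details; real values come from the client.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Shop",
    "telephone": "+44 20 7946 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
    },
    "openingHours": "Mo-Fr 09:00-17:30",
}

# JSON-LD is embedded in the page head inside a script tag, invisible
# to users but readable by search engines:
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(business)
    + "</script>"
)
```

Because JSON-LD sits in one self-contained block rather than being woven into the visible HTML (as microdata is), it can be generated from CMS data without touching templates – one reason Google recommends it.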
Like structured data, social data is behind-the-scenes code that lets us mark up and present page data in the formats social networks prescribe (such as Open Graph and Twitter Cards), so that when your pages are linked to on social media they are presented in an aesthetically pleasing manner. This ensures you are presenting your content in an expected and predictable way.
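A minimal sketch of rendering Open Graph meta tags, the most widely supported of these prescribed formats. The helper name and example values are our own illustrations.

```python
from html import escape

def og_tags(title: str, description: str, image_url: str, page_url: str) -> str:
    """Render the Open Graph meta tags a social network reads when a
    page is shared. Values are escaped for safe use in attributes."""
    props = {
        "og:title": title,
        "og:description": description,
        "og:image": image_url,
        "og:url": page_url,
        "og:type": "website",
    }
    return "\n".join(
        f'<meta property="{prop}" content="{escape(value, quote=True)}" />'
        for prop, value in props.items()
    )

# Hypothetical page details:
tags = og_tags(
    "Example Shop",
    "Hand-made shoes, delivered next day.",
    "https://www.example.com/img/share.png",
    "https://www.example.com/shoes",
)
```

These tags control the title, blurb and image shown in the link preview card, which is why omitting them leaves the social network to guess – often badly.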
Our team can conduct a lightning audit of your site and give you feedback and recommendations for better technical SEO scoring.