
Technical SEO Audit

This is a full-scale example of the Miklagárd SEO Team's Technical SEO Audit document from last year (2018), covering the most common issues and general recommendations, now available to the wider public, i.e. you! Accompanying this audit document, we include an Audit Action Items sheet containing all the relevant tabs mentioned throughout the audit. This sheet normally includes a ‘Summary’ tab that Miklagárd recommends your development team use as a checklist for implementing our recommendations, that is, if Miklagárd is not assigned and handling the full-scale SEO audit and implementation of your website.

The Technical SEO Audit document provides insight and recommendations regarding the technical onsite web ‘elements’ and configuration of your website.

Miklagárd provides recommendations to improve the technical performance and subsequent visibility of your website in relation to search engine optimisation (SEO).

Miklagárd uses a range of leading SEO tools to research a wide variety of technical website ‘elements’. These elements include on-page as well as off-page factors that influence how a website is evaluated by a search engine and subsequently performs in terms of site visibility and ranking in search engine results pages (SERPs).

Each element is reviewed and then assigned an evaluation grade based upon its priority and degree of importance:

  • High priority. Elements categorised as ‘Red’ should be rectified first.
  • Medium priority. Important to fix, but ‘Red’ elements take precedence.
  • Low priority. Elements classified as ‘Green’ have few or no issues to be rectified.

Miklagárd will, when necessary, provide links to reports in Google Analytics and Google Search Console, so it is a good idea to log in to these in your default browser. If you have a personal or other professional Gmail (Google Apps) login, you may want to log out of it in your default browser and use a secondary browser instead.

Content

Since the beginning of the Internet, search engines have sought to serve up web pages that searchers like and engage with, without immediately returning to the search results page to try again. Taken in aggregate over millions of queries a day, this gives search engines a large pool of data with which to judge the quality of their results. Google, in particular, uses data from Chrome, its toolbar, and Android devices to measure these engagement metrics. It is therefore essential that each web page on the site includes relevant, unique, and fresh content that is strategically optimized to rank for the site's targeted keywords.

Common content problems and how to deal with them:

Issue

Site contains thin web pages:

Problem

These web pages have fewer than ~450 words within the body content of the page. Google needs enough content per web page to understand the page's content and meaning and to rank the web page for relevant queries.

Recommendations

Step 1: Identify top web pages with thin content.

Step 2: “Thicken” these web pages to allow Google to better understand their meaning and context.

Issue

Under-optimised web pages:

Problem

These web pages can be under-optimized for the keyword(s) you want them to rank for. One unintended consequence of this is keyword cannibalisation. This occurs when you have multiple web pages optimized for the same keyword, and the search engines can’t determine which web page they should send searchers to for that keyword.

Recommendations

Step 1: Identify potential query cannibalisation.

Step 2: Optimise these web pages by reviewing the title and description, adding more content and consolidating user intent, which will give the web page more value to both users and Google.

This will resolve the cannibalisation and ensure users land on, and engage with, the page that best answers their search query.

Issue

Web pages with missing H1, H2, H3 header tags:

Problem

Search engines use the h1, h2 and h3 header tags to interpret what a web page is about, much the same way readers of a newspaper use article titles to get an idea of what an article is about. When web pages are missing header tags, it is harder for both visitors and search engines to decipher what the web page is about.

Recommendations

Step 1: Identify web pages missing h1, h2 or h3 header tags.

Step 2: Add an h1 heading (and supporting h2/h3 headings where relevant) to each web page missing them.

The header tag should contain a judicious use of the keyword(s) you are targeting the web page for. Be careful to keep the language natural and not keyword stuffed. If you have to decide between user experience and SEO, go with user experience.

Issue

Web pages contain multiple H1 header tags:

Problem

Following the newspaper analogy in the missing header tags issue above, you wouldn't have two titles on a newspaper or magazine article.

Recommendations

Step 1: Identify the most important heading within web pages with multiple h1s and make that heading the h1.

Step 2: Tag the less important headings as h2 or h3 headers.

A web page should only contain one h1 tag – it tells the reader what your web page is about.
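For reference, a minimal sketch of a sensible heading structure for a hypothetical service page, with a single h1 followed by supporting h2/h3 headings:

  <h1>Technical SEO Audit Services</h1>
  <h2>What a technical audit covers</h2>
  <h3>Crawl budget</h3>
  <h3>XML sitemaps</h3>
  <h2>How to get started</h2>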

Crawl

Search engines frequently crawl websites in order to determine which ones should be indexed in their massive databases. Search engine crawlers – also known as robots, bots, or spiders – collect web pages they deem high quality.

Over time, search engines will continue to request previously downloaded web pages to check for content updates. A certain amount of bandwidth is then allocated for the periodic reviews based on the web pages’ calculated relevancy. Every time a web page is downloaded, bandwidth is used, and once a website’s allocated bandwidth limit is reached, no more web pages will be crawled until the next review. This is referred to as a site’s crawl budget, and it works just like any other budget. But in the case of a search engine, when your site’s crawl budget has been exhausted, the bot moves on.

Since there is a limited amount of allotted bandwidth, it is crucial to direct the crawlers to the content you most want to be included in the search engine’s index of web content and eliminate any unnecessary or duplicate content, so as to avoid search engine crawlers dropping a crawl too early, a process known as ​crawl abandonment​.

Google uses a variety of metrics, including PageRank, to determine a website’s most important web pages and then crawls them more frequently.

To use your website’s crawl budget effectively, it is important to investigate errors reported in Google’s Search Console (formerly Google Webmaster Tools) to make sure that those web pages render properly. If a web page can’t be fixed, due diligence must be applied to make sure the web page doesn’t negatively impact the rest of the site. This can be done several ways:

  • 301 Redirect. Redirect these web pages to their new URLs (see the sketch after this list). Use a 302 if the redirect is truly temporary (e.g., a product that's out of stock).
  • Missing web pages. If you have a lot of missing web pages, allow them to return a 4xx (e.g., 404, 410) status code but remove these web pages from your XML sitemap(s).
  • Error web page links. Remove any internal links pointing to these error web pages.
  • Redirect web traffic. If any of your 4xx web pages have received significant traffic or links, you need to make sure you 301 redirect these web pages.
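How a 301 redirect is implemented depends on your server. As an illustration only, a minimal sketch assuming an Apache server with .htaccess enabled (the URLs are hypothetical; nginx and IIS have equivalent directives):

  # Permanently redirect a removed page to its closest relevant replacement
  Redirect 301 /old-page/ https://www.example.com/new-page/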

Common crawl problems and how to deal with them:

Issue

Dips in crawl rate:

Problem

Googlebot needs to spend time downloading (or crawling) a web page. The more time Googlebot spends crawling a page, the more crawl budget is used up. Dips in crawl rate are seen when the time spent crawling a web page exceeds the average, which indicates an inefficient crawl.

Recommendations

Step 1: Identify pages with slow page load time.

Step 2: If the pages are not important, consider removing them to save crawl budget; otherwise, optimise the pages' load times.

Excessively slow pages are using up crawl budget.

Tip! In Google Analytics, go to Behaviour > Site Speed > Page Timings and select Avg. Page Load Time to compare each page's load time to the website's average.

Issue

Index bloat:

Problem

Certain web pages on your website receive organic traffic while others receive it rarely, or not at all. All indexed web pages need to be crawled, so it is worth identifying which ones received, for example, only ten visits in the past month or one visit in the past six months. Keeping pages indexed that do not generate web traffic has a direct impact on the crawl budget.

Recommendations

Step 1: Identify “dead-weight” web pages that don’t receive any traffic

Step 2: Remove these pages from Google's index to improve crawl efficiency for the web pages that do matter.
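One common way to take a dead-weight page out of the index, assuming you want to keep the page live for users, is a robots meta tag in the page's <head> (a sketch only; check what your CMS supports):

  <meta name="robots" content="noindex, follow">

Note that the page must remain crawlable (i.e. not blocked in robots.txt) for Google to see the tag and drop the page from its index.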

Issue

Broken web pages:

Problem

Search engines remove important pages from their indices because the web pages are broken. Examples of important pages are those that have attracted quality links, garnered traffic, or driven conversions.

Recommendations

Step 1: Identify web pages receiving links and traffic that are serving 404 errors.

Step 2: Redirect these pages with the correct redirect code to their corresponding relevant page.

Note. Not all 404 pages are bad. Sometimes pages are removed, and there really aren't good pages to redirect them to. Google understands that and will eventually remove pages returning a 404 from its indices.

Issue

Ineffective 404 web page:

Problem

Visitors land on a default 404 error web page without knowing why the website did not return what they expected or how to proceed from the error page.

Recommendations

Step 1: Improve the 404 web page template for a better user experience. Consider:

– Add links to your most popular articles or posts, as well as a link to your website’s homepage.

– Have a site search box somewhere on your custom 404 page (sophisticated visitors tend to search; less advanced visitors tend to browse).

Tip. A good custom 404 web page should help keep visitors on your site and help them find the information they’re looking for.

Issue

Site contains cloaked URLs:

Problem

Cloaking is a server technique that some sites utilize to try to fool the search engines into awarding rankings a website doesn’t deserve. A website shows one version of a web page to users and a different version – many times overly optimized – to search engines. This is a stealth method that Google and the other search engines consider deceptive since it attempts to bias the spiders into ranking the web page for terms that should be out of reach. If detected, cloaking could cause your site to be penalised.

Recommendations

Step 1: Identify any cloaked URLs.

Step 2: Remove cloaked URLs from the website.

Issue

Category pages provide minimal value:

Problem

Category pages can create duplicate content for a site and provide minimal value in comparison to individual content pages.

Recommendations

Step 1: Expand content on these web pages (preferred to applying a noindex tag).

In cases where category pages are very similar, or are targeting the same keyphrase(s), a canonical HTML tag can be added pointing to the preferred page URL, instead of a noindex tag, so as to pass any link equity to the preferred page that you wish search engines to display in the SERPs.

Note. Category pages can be made useful to both searchers and search engines by providing a paragraph or so of content pertinent to that particular category. By adding some unique content, you can keep these pages open to search engines. If you don't take that extra step, however, you should in many cases block them from search engines by adding a noindex,follow robots tag or a canonical tag pointing to a similar, preferred page URL.
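A minimal sketch of a canonical tag placed in the <head> of a near-duplicate category page, pointing to the preferred URL (the URLs are hypothetical):

  <link rel="canonical" href="https://www.example.com/category/trail-running-shoes/">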

Issue

Googlebot crawls non-existent or blank web pages in pagination series:

Problem

Non-existent pages in paginated series can be crawled and produce 404 errors in Google Search Console reports. Even blank web pages, although they serve a 200 HTTP response, have no content listed. The pagination HTML markup references links to these extra blank web pages which will then be crawled by Google.

Recommendations

Step 1: Identify non-existent and blank pages in pagination.

Step 2: 301 redirect web pages that have been removed, or serve a 410 response code.

Step 3: Remove blank paginated pages (remove references to these additional, empty pages from the series' HTML markup and from the rel="next"/"prev" markup).
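For reference, the pagination markup referred to above looks like this in the <head> of page 2 of a hypothetical series; the links should only reference pages that actually exist and contain content:

  <link rel="prev" href="https://www.example.com/category/page/1/">
  <link rel="next" href="https://www.example.com/category/page/3/">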

XML sitemap

XML sitemaps are designed to help search engines discover and index your website’s most valuable pages. The sitemap consists of an XML file that lists all the URLs of the website that you want to compete in organic search. This XML file also contains ancillary information, such as when a web page has been updated, its update frequency, and relative importance.

Google recommends creating separate sitemaps for different types of content: images, videos, and text, for example. It is important to update the sitemaps each time new content is published, whether you use a multi-format XML sitemap or separate image and video XML sitemaps.

It’s important to update the sitemap only with the original versions of the URLs. They should not redirect or return errors. In other words, every web page in the sitemap should return a 200 status code. Anything other than a 200 status code is regarded by the search engines as “dirt” in a sitemap.

Finally, you should submit these sitemaps in Google Search Console. Google will tell you how many URLs have been submitted and how many of those have been indexed.

Common XML sitemap problems and how to deal with them:

Issue

Missing specialty sitemaps:

Problem

Missing speciality sitemaps like:

  • Image sitemap
  • Video sitemap
  • Google News sitemap

Recommendations

Consider adding video, image and Google News sitemaps to your website. Adding speciality sitemaps will help Google discover your content more effectively.
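A minimal sketch of a multi-format sitemap entry using Google's image extension (the URLs are hypothetical; a video extension works the same way using the sitemap-video namespace):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
    <url>
      <loc>https://www.example.com/product-page/</loc>
      <image:image>
        <image:loc>https://www.example.com/images/product-photo.jpg</image:loc>
      </image:image>
    </url>
  </urlset>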

Issue

Blocked web pages in sitemap:

Problem

Pages with noindex tags were found in the sitemap. Including blocked pages is wasteful, because you only want web pages in your sitemaps that you want indexed by search engines.

Recommendations

Step 1: Identify the web pages you don’t need in the sitemap.

Step 2: Remove the web pages that don’t need to be in the sitemap.

Robots.txt

The robots.txt file is an important text file placed in a web domain's root directory. This file provides search engines with directives regarding the crawling of the site, from blocking the crawling of specific web pages or directories to blocking the entire site.

It’s not recommended to use robots.txt to prevent the indexing of website content that has quality inbound links pointing to it because, if the search engines can’t access a web page, they can’t credit you for those links. Similarly, you shouldn’t use robots.txt to block a web page that contains valuable links to other web pages on your site. If the bots can’t access the web page, they can’t follow links on that blocked web page, which means authority will not be passed to those linked web pages.

Even if you choose not to use the robots.txt file to prevent the crawling of any of the website's pages, it's still recommended to use robots.txt to specify the location of the website's sitemap.xml file. If the site uses multiple sitemaps, the link should point to a sitemap index that includes links to all the website's sitemaps.
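A minimal robots.txt sketch that blocks nothing and simply declares the sitemap location (the sitemap URL is hypothetical; point it to your sitemap index if you use multiple sitemaps):

  User-agent: *
  Disallow:

  Sitemap: https://www.example.com/sitemap_index.xml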

Common robots.txt problems and how to deal with them:

Issue

Line items missing from robots.txt:

Problem

Specific CMS files and folders are not disallowed from being crawled.

Recommendations

Websites set up in a CMS, e.g. WordPress, should disallow the following directories in the robots.txt file (the directives must sit under a User-agent group):

  User-agent: *
  Disallow: /wp-includes/
  Disallow: /wp-content/plugins/
  Disallow: /wp-content/themes/

Architecture

A good website URL structure should at least:

  • Group topics together into category and sub-category web pages to help both visitors and search engines to maintain topical relevance.
  • Use only lowercase to avoid confusion with users and search engines.
  • Use dashes (-) to separate keywords in the URL

To increase crawl efficiency and ensure that authority flows naturally from category web pages down through the rest of the site and back up to the category web pages, a few measures should be taken:

  • Link directly to the canonical URL. By linking to non-canonical URLs, you are wasting your precious crawl budget by sending search engine spiders on a scavenger hunt to discover content you want indexed by search engines.
  • Use relevant, descriptive anchor text in the links without being unnecessarily repetitive or spammy.
  • Only use the nofollow link attribute when linking to pages that you don’t really want to bequeath authority to. Beyond that, keep the use of nofollow for internal links to a minimum.

Common architecture problems and how to deal with them:

Issue

Broken links throughout the website:

Problem

Broken external and internal links​ within your website.

Recommendations

Step 1: Identify internal and external broken links and where the content has been moved.

Step 2: Redirect broken internal links to the page where the content now lives. For broken external links, update the link to the content's new location, contact the target website, or remove the link.

Note. Broken external links provide a poor user experience and can also affect SEO performance by sending poor quality signals to Google.

Issue

Web pages on the website pointing to 404 pages:

Problem

Your website has source pages containing broken internal links.

Recommendations

Step 1: Identify broken links.

Step 2: Relink all internal links pointing to 404 pages to their new, updated, relevant pages and categories.

Issue

Web pages with too many links:

Problem

Web pages that contain an excessive number of internal links.

Recommendations

Step 1: Identify web pages with too many internal and external links.

Step 2: Review pages flagged with too many internal links. If the link frequency feels spammy, consider reducing the number of internal/external links.

Note. Google doesn’t penalize for having more than 100 links on a page anymore, but it can still be a signal that the page is spammy.

Issue

Website does not link to category pages enough:

Problem

Sometimes the most qualified page to rank for some keywords – particularly highly competitive head terms – is a category or tag page.

Recommendations

Consider internally linking to category pages and linking top blog pages to category pages where relevant.

Tip. One way to bolster your category pages is to make sure every URL in that category links to its corresponding category page. This can be accomplished with editorial links (i.e. links in the content itself), breadcrumbs, and/or links in template.

Issue

Redirect-chain URLs:

Problem

A redirect chain occurs when there is more than one 301/302 redirect response before the page responds with 200 OK. Redirect chains eat up crawl budget and, in excess, can slow down the website.

Recommendations

Step 1: Identify any redirect-chain on your website.

Step 2: Resolve redirect chains for optimal crawl efficiency.

Note. Clean up web pages that have been identified as having redirect chains by ensuring that each URL in the chain redirects through a single 301 redirect to the final destination page URL.

Semantic Markup

Using ​schema​ with microdata or JSON-LD (JavaScript Object Notation for Linked Data) in a website’s HTML to declare specific elements allows search engines to better understand the meaning of its content and provide it with higher visibility in the search results through rich snippets. A rich snippet is essentially a bite-sized summary of the content that a user can expect to see on a page.

Snippets come in different forms: an image of a video screenshot, information about a concert, star ratings, cooking time for a recipe, etc. But all serve the purpose of providing a sneak peek of the content on your website.

These snippets are highly coveted because they improve click-through rates. They can also help reduce your website's bounce rate because users will know whether they are interested in your website before they click.

In general, it’s recommended to start using the following microdata when specifying this type of content in HTML:

  • Address
  • Blog
  • Authors
  • Breadcrumbs
  • Events
  • Medical
  • Product reviews
  • Product information
  • Ratings​
  • Videos

This content can be validated through the ​Structured Data Testing Tool​ inside Search Console. Google now also supports JSON-LD markup.

Common semantic markup problems and how to deal with them:

Issue

Content worthy of rich snippets:

Problem

Specific content is worthy of having rich snippets; however, some content is not being optimised with structured data to help influence rich snippets.

Recommendations

Step 1: Identify the top web pages (category, product category, product pages, etc.) and map out the elements of each web page that could benefit from structured data.

Step 2: Add structured data to the selected top web pages to get rich snippets to show up in Google results.

Tip. Product pages are an excellent place to start because rich snippets such as review stars ​increase Click-through rate (CTR)​.
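A minimal JSON-LD sketch for a product page with review stars (the product name and values are hypothetical; validate the markup in the Structured Data Testing Tool before deploying):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.6",
      "reviewCount": "87"
    }
  }
  </script>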

Images

Images offer a great opportunity to earn visibility and additional organic traffic with image and universal search results.

The use of images on the website is also helpful to make web pages more attractive, illustrate an idea, and earn visitors’ attention.

Below are a few best practices in optimizing images for search:

  • Include alt description tags using specifically relevant keywords for each image, without keyword stuffing them. These are primarily for the benefit of visitors with disabilities using screen readers (see the sketch after this list).
  • Optimize image file names so they include relevant keywords.
  • Include images in your XML sitemap following the multi-format sitemap specification or, better, create an image sitemap and submit it to Google.
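A sketch combining the first two points, with a hypothetical descriptive file name and alt text:

  <img src="/images/blue-trail-running-shoes.jpg" alt="Blue lightweight trail running shoes, side view">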

Common image problems and how to deal with them:

Issue

Large images:

Problem

Website load time is a ranking factor. One factor that can bloat page load times is large images. In the rush to publish, it's easy to overlook image size.

Recommendations

Step 1: Identify which images are over 100KB in size.

Step 2: Optimize all/top pages where web page load time is an issue.

Note. Images can often be compressed without noticeably affecting image quality.

Issue

Images with missing or irrelevant ALT text:

Problem

Images on the website are missing their alt text description tag or have irrelevant alt text, meaning that the alt text does not accurately describe the image.

Recommendations

Step 1: Identify and review images with no or poor alt text.

Step 2: Add alt text to each image that describes the image, using keywords where possible, to help search engines better understand the web page's content.

Videos

Videos offer a great opportunity to earn visibility and additional organic traffic with video or universal search results.

Below are some video best practices to keep in mind to maximize impact and reach in the search engines:

  • Optimize all the video-related elements: titles, descriptions, categories, keywords, etc. Add your target keywords, not only on your own site but also on popular video platforms, such as YouTube or Vimeo. Just keep in mind that putting your videos on YouTube can divert traffic away from your own site. However, YouTube is the second largest search engine, so oftentimes it's worth losing out on the traffic to your site.
  • Use structured markup to mark up your videos (see the sketch after this list).
  • Add videos to your XML sitemap following the ​multi-format​ sitemap specification or create a new specific ​video sitemap​ and submit it to the search engines.
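A minimal sketch of VideoObject markup for a video embedded on your own site (all values are hypothetical):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to set up an XML sitemap",
    "description": "A short walkthrough of creating and submitting an XML sitemap.",
    "thumbnailUrl": "https://www.example.com/thumbs/sitemap-video.jpg",
    "uploadDate": "2018-06-01",
    "contentUrl": "https://www.example.com/videos/sitemap-video.mp4"
  }
  </script>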

Common video problems and how to deal with them:

Issue

Commercial and promotional videos on YouTube:

Problem

YouTube is a great platform for videos that are more viral in nature, as well as tutorials. However, videos with a commercial or promotional purpose tend not to do well on that platform.

Recommendations

Note. Going with a self-hosting platform can be more beneficial for sites creating their own videos and wishing to drive engagement to their own website. This is because you're not competing against others for views. This should only be done with caution and careful measurement and testing to ensure it does not harm page rankings.

Issue

Missing (social) call to action (CTA) on the videos:

Problem

Some videos on YouTube lack strong calls to action in the video encouraging viewers to share it on social networks.

Recommendations

Note. YouTube makes it easy to add annotations that encourage viewers to like videos and subscribe to your channel. You should add annotations that prompt viewers to comment on your videos as well as click through to conversion pages. It's important to vote up and engage with people who comment on your videos. Video responses show YouTube that your video is popular and relevant. Replying demonstrates you're approachable.

Issue

Missing links to other YouTube videos in annotations:

Problem

There’s anecdotal evidence that YouTube favors videos that link to other YouTube assets within annotations.

Recommendations

These links can be added using annotations with links to your other videos.

Note. Use annotations to encourage viewers to take actions such as commenting, sharing the video and clicking through to e.g. your conversion pages.

Issue

Videos tend to be short/long:

Problem

Video length varies considerably from what is typical for comparable content, which can hold videos back since YouTube uses watch time as a ranking signal.

Recommendations

Note. Depending on your current practice, consider creating shorter or longer videos if your video length varies too much from the average for your topic, to stand a better chance of ranking highly within YouTube. YouTube uses watch time as a ranking signal.

HTML Titles

HTML title tags (technically, title elements; also referred to as web page titles) define the title of a document and are required for all (X)HTML documents. The title is the single most important on-page SEO element (behind overall content) and appears in three key places: the title bar of a browser, search results pages, and social websites.

As a ranking factor in organic search and a highly visible element to users in the search engine results pages (SERPs), it's important to make sure that a site's titles meet certain criteria (a short example follows the list):

  • They’re relevant, descriptive, and unique for each web page.
  • They read effortlessly. HTML titles with keywords separated by pipe characters, for example, aren’t good for social sharing or click throughs from search results.
  • They’re different from the meta description and offer complementary information.
  • They include the most important keywords for each specific web page, preferably toward the beginning of the title (but aren’t keyword stuffed).
  • They include the name of the site at the end of the title to strengthen its branding.
  • They're concise, since search engines only show 512 pixels (approximately the first 65-70 characters).
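A short sketch of a title element that follows these criteria, placed in the page's <head> (the page topic and brand are hypothetical):

  <title>Waterproof Trail Running Shoes for Men | Example Store</title>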

Common HTML title problems and how to deal with them:

Issue

Pages with long HTML titles:

Problem

Web pages with HTML titles over 65 characters. HTML titles should be kept under 65 characters (or 512 pixels) to be safe.

Recommendations

Ensure that HTML titles are not too long; titles over 70 characters will be truncated in search results, which may affect click-through rate (CTR).

Note. If you don’t, your HTML titles could be truncated in the SERPs, which has been correlated with lower click through rates.

Issue

Pages with short HTML titles:

Problem

Web pages with HTML titles that are shorter than 30 characters. Excessively short HTML titles can fall short of adequately communicating what content a viewer can expect to see when she clicks on the search result.

Recommendations

Ensure that all HTML titles utilise the whole 50-65 characters for optimal HTML title presentation in search results.

Issue

Duplicate HTML titles:

Problem

Web pages with duplicate HTML titles. Pages with duplicate titles are often good indicators of duplicate content caused by pagination or URL parameters.

Recommendations

Step 1: Review all duplicate HTML titles.

Step 2: Remove all duplicated URLs with query parameters.

Issue

Web pages with non-descriptive titles:

Problem

Writing titles that are both optimized and compelling is critical to both ranking and click throughs.

Recommendations

Step 1: Review all/top web page titles.

Step 2: Optimize web page titles to be descriptive of the content.

Meta Descriptions

Meta descriptions are not a ranking factor, but they are highly visible to users in the SERPs and can affect click through rates. Google highlights not only search query terms but also synonyms in meta descriptions, making these keywords pop off the page for searchers.

You should keep a few things in mind when writing meta descriptions (an example follows the list):

  • They should have a compelling call to action or lure the searcher to click through.
  • They should be unique and describe the content of the page well.
  • They should be no more than 156 characters long since search engines truncate descriptions beyond this point (and sometimes before this point since Google uses pixel width instead of characters now).
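A sketch of a meta description with a clear call to action, placed in the <head> (the copy is hypothetical and roughly 140 characters):

  <meta name="description" content="Compare our range of waterproof trail running shoes, with free delivery and a 30-day return policy. Find your size and order online today.">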

Common meta description problems and how to deal with them:

Issue

Pages with missing meta descriptions:

Problem

Missing meta descriptions. Any page that is intended to be public facing should have a meta description.

Recommendations

Step 1: Identify and review meta descriptions that are missing.

Step 2: Add meta description.

Issue

Pages with too short or too long meta descriptions:

Problem

Long descriptions will be truncated, reducing their readability and click-through rates. Descriptions that are too short may fail to persuade searchers to click, which has an impact on website traffic.

Recommendations

Step 1: Identify and review meta descriptions that are either too long or too short.

Step 2: Rewrite meta descriptions that are too long or too short, taking care to make them descriptive and compelling.

Issue

Descriptions missing powerful calls to action:

Problem

The page description is not compelling or is written in a passive voice that doesn't encourage users to click.

Recommendations

Step 1: Review top pages on your website, such as category and product pages.

Step 2: Amend and write more compelling meta descriptions to encourage organic CTR.

Site Speed

Web page load speed is part of the Google search ranking algorithm​. For usability reasons, best practices dictate that a web page should load within 1-2 seconds on a typical connection. However, according to Search Console data, a load time of 1.4 seconds is the threshold between a fast web page and a slow web page. That means, ideally, that every web page on your website should load in 1.4 seconds or less, to receive the maximum SEO benefit for fast-loading web pages.

Besides the ranking benefits, there are crawling benefits for faster sites. As discussed in the ​crawl section above, each site is assigned a crawl budget. When search spiders have exhausted that budget, they move on. There is a strong correlation between improving web page speed and the search engines’ ability to crawl more web pages.

Google gathers web page load time data through actual user experience data collected with the Google Toolbar, Chrome, and Android and may also be combining that data with data collected as Google crawls a website. As such, web page load speed in terms of the ranking algorithm is being measured using the total load time for a web page, exactly as a user would experience it.

Common site speed problems and how to deal with them:

Issue

High page load times on top landing web pages:

Problem

Top landing web pages have high page load times.

Recommendations

Optimize the page load time for your website's top 25 landing web pages.

Issue

Homepage load time lags in comparison to competitors:

Problem

Competitors' websites load faster than your website, which can affect your ranking on Google.

Recommendations

To be competitive and rank higher, keep page load times low, especially for mobile, for example by keeping the number of HTTP requests per page under 20.

Issue

Server Response (Time to First Byte):

Problem

Slow server response, especially Time to First Byte.

Recommendations

Review server response time and aim for a time to first byte of 500ms or less.

Issue

Data Transfer Protocols:

Problem

Some website assets are transferred over the H2 (HTTP/2.0) protocol, while other assets sourced from IIS, Apache and nginx servers still use the HTTP/1.1 protocol. HTTP/1.1 can be prone to delays due to sequenced fetching of page resources.

Recommendations

Use the most up-to-date transfer protocol to ensure fast site speed.

Note. Work with external providers to see if they can transfer assets to your site more efficiently by upgrading their transfer protocols.

Issue

Page and Asset Caching:

Problem

There are assets that are not being cached. These assets must be downloaded again on each page request by both bots and users, causing prolonged page loads.

Recommendations

Step 1: Review HTML caching rules.

Step 2: Improve caching rules for HTML for optimal user-experience.

Note. The performance of websites and applications can be significantly improved by re-using previously fetched resources through HTTP caching.
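Caching behaviour is controlled through HTTP response headers; how you set them depends on your server or CDN, so treat the following as an illustrative sketch only. Response headers for a long-lived static asset might include:

  Cache-Control: public, max-age=31536000
  ETag: "5f3e2a1b"

Versioned static assets such as images, CSS and JavaScript can safely use a long max-age, while HTML usually needs a much shorter cache lifetime so content updates are picked up quickly.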

Mobile

Search engines require that your website is mobile ready. There are a few key facts marketers and site owners should know about mobile-friendly websites:

  • Mobile friendliness is assessed on a page-by-page basis in real time.
  • Google released a ​Mobile Friendly Tool​ to help marketers assess if their key landing pages are mobile friendly.
  • Google released a ​Mobile-Friendly Test API​, which lets you test URLs using automated tools. For example, you could use it to monitor the performance of important pages in your website.

Completely apart from Google’s Mobile Friendly Update, sites should strive to serve up pages that are friendly to mobile users. If mobile visitors have a difficult time digesting your content, they will oftentimes grow frustrated by the poor user experience and leave your site. 

This will not only result in a reduction of traffic (as search engines could send less mobile traffic to your site) but also fewer social shares. All of these consequences have the potential to hit an organization's bottom line. Therefore, it is in the best interest of most site owners to bring their sites into compliance.

Common mobile website problems and how to deal with them:

Issue

Website isn’t using Google AMP pages:

Problem

Mobile visitors are becoming increasingly intolerant of slow websites. Lag time can cost a website in lost revenue and trust.

Recommendations

Consider AMP for the following types of content: News articles, Blog posts, Recipes, Product listings, Travel guides and the like.

Consider implementing an AMP strategy for the homepage, category pages, product listing pages and blog posts.

Note. Although we typically think of AMP for publisher sites, it can also be successfully implemented on ecommerce sites with some caveats.

Caution. AMP is not without implementation issues.

International

Google differentiates between multilingual and multi-regional sites. A multilingual website is a website that offers content in more than one language, whereas a multi-regional website is one that explicitly targets users in different countries. Some sites are both multi-regional and multilingual (for example, a site with different versions for the US and Canada, and both French and English versions of the Canadian content).

Expanding a website to cover multiple countries and/or languages can be fraught with challenges. Because you have multiple versions of your site, any issues will be magnified.

Common internationalization problems and how to deal with them:

Issue

Website uses multiple languages:

Problem

If your website is multilingual, search engines use the visible content of your page to determine its language, not lang attributes. Your website has mixed-translation issues or has only translated the boilerplate text of your web pages, while keeping the main content in a single language. This can create a bad user experience if the same content appears multiple times in search results with boilerplate in various languages.

Recommendations

Ensure that each multilingual version of your website is translated fully to its specified language.

Issue

Sites are missing return hreflang links:

Problem

Web pages are missing hreflang return links.

Recommendations

Step 1: Identify and review hreflang tag errors report.

Step 2: Ensure all returned hreflang links match the canonical link of the page.
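A sketch of reciprocal hreflang annotations for hypothetical English and Danish versions of the same page. The same block appears in the <head> of both versions, so each page references itself and its counterpart, and the URLs must match the pages' canonical URLs:

  <link rel="alternate" hreflang="en" href="https://www.example.com/en/page/">
  <link rel="alternate" hreflang="da" href="https://www.example.com/da/page/">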

Note. hreflang is fragile at the best of times; keeping an eye on the hreflang tags with errors report on a daily basis is recommended.

Technical SEO Audit document 2018, Miklagárd SEO Team


Miklagárd can support your website with the implementation of audits; we offer in-house training, consulting and hand-holding where necessary.

CONTACT NOW

Do you have a question or do you want to hear more about
our partnership opportunities?

Call us on +45 31 32 60 86

Our phone support is open Monday to Friday from 9.00-16.00
