7 Fundamental Technical SEO Questions To Answer With Log Analytics (And How To Do It Easily)

Posted: 2020-11-18

Log analysis for SEO

Log analysis has evolved into a fundamental part of technical SEO audits. Server logs allow us to understand how search engine crawlers interact with our website, and analyzing your server logs can lead to actionable SEO insights that you might not otherwise have gleaned.

First: choose your tools

There are many tools available to help you analyze server logs, and the right one for you will depend on your technical knowledge and resources. There are three types of log file analysis tools you'll want to consider (unless you're doing it from the command line, which I wouldn't recommend).
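That said, if you want to get a quick feel for your logs before committing to a tool, a few lines of scripting are enough to surface which URLs search bots request most often. The sketch below assumes a standard Apache/Nginx "combined" log format and a local file named access.log; both the regex and the file name are assumptions you would adapt to your own server setup.

```python
# A minimal sketch for counting search-bot requests per URL, assuming the
# Apache/Nginx "combined" log format and a local access.log file (both are
# assumptions; adjust the regex and path to match your own server).
import re
from collections import Counter

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

bot_hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        match = LOG_LINE.match(line)
        if not match:
            continue  # skip lines that don't match the expected format
        agent = match.group("agent")
        # Note: user-agent strings can be spoofed; verifying Googlebot via
        # reverse DNS is out of scope for this sketch.
        if "Googlebot" in agent or "bingbot" in agent:
            bot_hits[match.group("path")] += 1

# Most-crawled URLs by search bots
for path, hits in bot_hits.most_common(20):
    print(hits, path)
```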
Check if the pages and directories you blocked via the robots.txt file are actually being crawled.

Pages that should not be crawled

You can also search for pages that aren't blocked via the robots.txt file but shouldn't be prioritized from a crawling perspective: this includes pages that are non-indexable, canonicalized, or redirected to other pages. For this, you can perform a list analysis of the exported URLs with your favorite SEO crawler (e.g. Screaming Frog or OnPage.org) to add information about their meta robots noindex and canonicalization status, in addition to the HTTP status you will already have from the logs.

6. What is your Googlebot crawl rate over time, and how does it correlate with response times and error pages?

Unfortunately, the data available through Google Search Console's 'Crawl Stats' report is too generic (and not necessarily accurate enough) to act on. So, by analyzing your own logs to identify Googlebot's crawl rate over time, you can validate that information and segment it to make it actionable. With Loggly, you can choose to show Googlebot activity over the desired time range in a line chart.
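If you don't have Loggly at hand, a rough approximation of that chart can be built straight from the raw logs, for example by bucketing Googlebot requests per day and counting how many of them were answered with a 5xx. The sketch below again assumes the combined log format and an access.log file; if your log format also records response times, the same loop could average them per day.

```python
# A rough sketch of a "Googlebot crawl rate over time" view built from raw logs,
# assuming the Apache/Nginx "combined" log format and a local access.log file
# (both are assumptions, not a fixed tool or API).
import re
from collections import defaultdict
from datetime import datetime

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] "[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

daily_hits = defaultdict(int)
daily_errors = defaultdict(int)

with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        match = LOG_LINE.match(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue
        # e.g. "18/Nov/2020:13:05:26 +0000" -> one bucket per day
        day = datetime.strptime(
            match.group("time").split()[0], "%d/%b/%Y:%H:%M:%S"
        ).date()
        daily_hits[day] += 1
        if match.group("status").startswith("5"):
            daily_errors[day] += 1

for day in sorted(daily_hits):
    print(day, "crawled:", daily_hits[day], "5xx served:", daily_errors[day])
```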
Find bots and pageviews

What's important here is not just that search bots come to your site, but that they actually spend their time crawling the right pages. Which pages are they crawling? What is the HTTP status of those pages? Do the different search bots crawl the same pages or different ones? You can select each of the search user agents you want to check and export the data to compare them using pivot tables in Excel:

HTTP status by user agent

Based on this initial information, we'll start digging deeper to check not just how these bots differ in their crawling behavior, but whether they're really crawling where they should be.

3. Which pages are not serving properly? Look for pages with HTTP 3xx, 4xx, and 5xx statuses.

By searching for the desired search bot (in this case, Googlebot) and then choosing the "status" filter, you can select the HTTP values of the pages you want to analyze.
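If you'd rather not round-trip through Excel, the same pivot can be sketched in a few lines: count HTTP statuses per search user agent, then list the Googlebot requests that returned 3xx, 4xx, or 5xx. The bot list, file name, and log format below are assumptions to adapt to your own data.

```python
# A small sketch of the "HTTP status by user agent" pivot and a 3xx/4xx/5xx
# filter, assuming the Apache/Nginx "combined" log format; the bot list and
# access.log file name are assumptions.
import re
from collections import Counter, defaultdict

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "\S+ (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)
BOTS = ["Googlebot", "bingbot", "YandexBot"]  # the user agents you want to compare

status_by_bot = defaultdict(Counter)
error_pages = Counter()  # Googlebot requests answered with 3xx/4xx/5xx

with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        match = LOG_LINE.match(line)
        if not match:
            continue
        agent, status = match.group("agent"), match.group("status")
        for bot in BOTS:
            if bot in agent:
                status_by_bot[bot][status] += 1
                if bot == "Googlebot" and status[0] in "345":
                    error_pages[(status, match.group("path"))] += 1

# Pivot: HTTP status counts per bot
for bot, statuses in status_by_bot.items():
    print(bot, dict(statuses))

# Googlebot's most-requested non-200 pages
for (status, path), hits in error_pages.most_common(20):
    print(status, hits, path)
```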