Screaming Frog: Get the most out of this SEO tool

SEO becomes a much easier job when you have the help of accessible tools like Screaming Frog.🐾

Today you can find hundreds of tools that offer basic SEO help, and others with more advanced features. Even so, Screaming Frog is one of the best options you will come across.

Even with so many tools competing against it, Screaming Frog gives you access to a wealth of features that make it ideal for SEO. It is also a good fit for those with a smaller budget who still want efficient functionality.

The best part is that it is a widely recognized tool today, so you can be confident that it works as expected.

If you want to learn more about what Screaming Frog is, keep reading this post, where we explain its main characteristics.

What is Screaming Frog🐾 and what is it for?

Before getting started, we need to know what Screaming Frog is and what it is used for.

It is a very practical option for the SEO and optimization of a website, which is why it is essential to know its functions.

As an excellent complement, Screaming Frog provides a large number of functions that help improve rankings. It works as a crawler: it analyzes an entire website according to the aspects you are interested in, and then presents you with a report on the status of those aspects of your site.

In addition, it lets you crawl all the URLs belonging to a website and find any on-site error so you can fix it as soon as possible. Thanks to this, you will be able to optimize your website properly and be in a better position to rank higher.

Not to mention how much easier Screaming Frog makes this kind of analysis.

Clearly, with the help of a tool like this you can find 404 errors, duplicate H1s, and other details that affect optimization. Correcting them in time can make the difference between staying at your current level or improving it and gaining a better position in the SERPs. The status codes it can report include the following (a small script after the list illustrates how these checks work):

  • 404: Pages within the site that return an error because they no longer exist. With a simple filter, this SEO tool can export a detailed list of all the pages on your site that return this error. This is useful for detecting pages that no longer exist on your site but are still being linked to.
  • 500: Server response errors. Screaming Frog also lets you export the list of pages for which your server returns an error message.
  • 301 and 302: Pages that have redirects, whether permanent or temporary. Do you want to know which of your pages use temporary or permanent redirects? With a single click you can export a CSV list of all the redirects used on your website.
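
To make the idea concrete, here is a minimal sketch, not Screaming Frog's own code, of how a crawler can classify URLs by status code. It uses Python with the requests library, and the URL list is a made-up example:

```python
import requests

# Hypothetical example URLs; in practice these would come from crawling the site.
urls = [
    "https://example.com/",
    "https://example.com/old-page",      # might return 404
    "https://example.com/temp-offer",    # might return 301/302
]

report = {}
for url in urls:
    try:
        # allow_redirects=False so 301/302 responses are reported as such
        response = requests.get(url, allow_redirects=False, timeout=20)
        status = response.status_code
    except requests.RequestException:
        status = "connection error"
    report.setdefault(status, []).append(url)

# Group URLs by status code, similar to the filtered exports described above.
for status, grouped in sorted(report.items(), key=lambda item: str(item[0])):
    print(status)
    for url in grouped:
        print("  ", url)
```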

And it has other functions such as:

  • Generate a sitemap of your website (see the sketch below).
  • View the canonical content.
  • See the domains hosted on the analyzed website's IP.
  • View a version of the site from the past, if it was stored by the Wayback Machine.
  • Visualize a site's backlinks.
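
As a simple illustration of the first item, here is a minimal Python sketch (an assumption for demonstration only, not how Screaming Frog does it) that downloads a sitemap.xml and lists the URLs it declares:

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical sitemap location; replace with your own domain.
sitemap_url = "https://example.com/sitemap.xml"

response = requests.get(sitemap_url, timeout=20)
root = ET.fromstring(response.content)

# Standard sitemap namespace defined by sitemaps.org
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in root.findall(".//sm:loc", ns):
    print(loc.text)
```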

How to configure Screaming Frog?

Now that you understand how this wonderful tool works, we can move on to the best way to configure it. Simply follow these steps and, by the end of the post, you will have everything you need to make Screaming Frog your best ally.

The first thing to do is download it, since it runs as a desktop application on Windows, Mac, and Linux. Once downloaded, you can proceed with the configuration that will make the tool genuinely useful and help it work properly.

  • Screaming Frog SEO Spider Settings

When you open the application, the first step is to enter the URL of the website you want to analyze. Simply type it in, click the “Start” button, and let it do its job crawling every aspect of the site.

But to take advantage of each of the optimization features it offers, you need to configure it correctly, so that it reports only the information that is truly relevant to your website. Just open the Configuration menu and start this process.

1.1 Spider

Spider is the first section you will find in that menu, and it lets you define the types of files the application will crawl. It also lets you choose how that information is reported.

What to track?

To select what the application is going to crawl, use the “Basic” section. There you will find the file formats to be crawled, and you can include or exclude those that are not of interest to you, such as images, CSS, or other file types.

Likewise, you can choose whether to follow internal or external links marked as nofollow. If you select the “Crawl All Subdomains” option, you are telling the application to treat subdomains as part of the crawl.

You can also start the crawl from a URL that is not the root of your website; for this, use the “Crawl Outside of Start Folder” option.

If you have a large website, it is best to stop it from crawling certain resources, such as images, CSS, or hreflang annotations if the site is multilingual. That way, it can crawl the information you actually want much faster. A small sketch of this kind of filtering follows below.
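
Here is a minimal sketch, purely illustrative and with made-up setting names, of how a crawler might decide whether to fetch a URL based on the file-type and subdomain options described above:

```python
from urllib.parse import urlparse

# Illustrative settings mirroring the options described above (names are made up).
crawl_images = False
crawl_css = False
crawl_all_subdomains = True
start_domain = "example.com"

SKIPPED_EXTENSIONS = {".jpg", ".png", ".gif", ".css"}

def should_crawl(url: str) -> bool:
    parsed = urlparse(url)
    # Skip resource types we chose not to crawl.
    for ext in SKIPPED_EXTENSIONS:
        if parsed.path.lower().endswith(ext):
            if ext == ".css" and not crawl_css:
                return False
            if ext != ".css" and not crawl_images:
                return False
    # Only follow subdomains when "Crawl All Subdomains" is enabled.
    host = parsed.netloc.lower()
    if host != start_domain and host.endswith("." + start_domain):
        return crawl_all_subdomains
    return host == start_domain

print(should_crawl("https://example.com/page"))        # True
print(should_crawl("https://example.com/logo.png"))    # False (images excluded)
print(should_crawl("https://blog.example.com/post"))   # True (subdomains allowed)
```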

What to extract?

In this section, you can choose which data Screaming Frog will extract from your website. It is best not to modify it too much, because by default it already provides the necessary information, which may be essential. And if you want to carry out an SEO audit, it can be of great help.

Limits

In this section you can set certain limits that reduce the size of the crawl so it finishes faster. It is divided as follows (a small sketch after the list illustrates these limits):

  • Limit Search Total: Defines the maximum number of URLs to be crawled, although by default it is better to leave it empty and crawl the entire site.
  • Limit Search Depth: Defines how deep the crawl can go. The higher the level, the deeper, meaning the more clicks away from the home page a URL can be.
  • Limit Max URI Length to Crawl: Lets you set a character limit for the URLs to be crawled. Any URL whose length exceeds the number of characters you set will not be crawled.
  • Limit Max Folder Depth: Here, instead of limiting the crawl depth, the directory depth is limited. You select how many directory levels Screaming Frog will crawl; the higher the level, the more URLs with deeper directory paths are included. Example: if you choose level 1, URLs up to 2 directory levels are crawled.
  • Limit Number of Query Strings: Limits the number of URL parameters of the form ?x=. This is a very interesting option for those who run an online store.
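
As a rough illustration of these limits, here is a minimal Python sketch (an assumption for demonstration, with made-up threshold values) that filters URLs by length, folder depth, and number of query-string parameters:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative limit values; in Screaming Frog these are set in the Limits tab.
MAX_URI_LENGTH = 115
MAX_FOLDER_DEPTH = 3
MAX_QUERY_STRINGS = 2

def within_limits(url: str) -> bool:
    parsed = urlparse(url)
    if len(url) > MAX_URI_LENGTH:
        return False  # Limit Max URI Length to Crawl
    # Count directory levels in the path (ignoring empty segments).
    folder_depth = len([segment for segment in parsed.path.split("/") if segment])
    if folder_depth > MAX_FOLDER_DEPTH:
        return False  # Limit Max Folder Depth
    if len(parse_qs(parsed.query)) > MAX_QUERY_STRINGS:
        return False  # Limit Number of Query Strings
    return True

print(within_limits("https://shop.example.com/category/shoes?color=red"))  # True
print(within_limits("https://shop.example.com/a/b/c/d/e?x=1&y=2&z=3"))     # False
```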

Rendering

In the next section you can choose how Screaming Frog will process the content and whether that includes JavaScript. By default it emulates the old scheme Google used to crawl AJAX content. As obsolete as that may seem, it can still be the option closest to the current process.

It is best to keep this option selected, but if your website has JavaScript elements that get in the way of proper crawling, there is a solution: simply choose “Text Only” and the application will crawl the HTML code while ignoring those JavaScript elements.

In this same section there is a third option in which the JavaScript on the crawled page is actually executed. It also provides screenshots of how the page renders, although it is worth noting that this is one of the heaviest functions of the tool, so use it only when you have enough time to take advantage of it.

You can also set the parameters under which the code is executed, such as the following (the sketch after the list shows equivalent settings in a headless browser):

  • Amount of time in seconds until the capture is made.
  • Screen size to capture.
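
The sketch below is only an illustration of those two parameters using Playwright in Python; it is an assumption for demonstration, since Screaming Frog uses its own embedded rendering rather than this library:

```python
from playwright.sync_api import sync_playwright

url = "https://example.com/"  # hypothetical page to render

with sync_playwright() as p:
    browser = p.chromium.launch()
    # "Screen size to capture" ~ the viewport dimensions
    page = browser.new_page(viewport={"width": 1366, "height": 768})
    page.goto(url)
    # "Amount of time in seconds until the capture" ~ wait before screenshotting
    page.wait_for_timeout(5_000)  # milliseconds
    page.screenshot(path="rendered.png", full_page=True)
    html_after_js = page.content()  # the DOM after JavaScript has run
    browser.close()

print(len(html_after_js), "characters of rendered HTML")
```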

Advanced

In addition to all the features mentioned above, you can access advanced options to improve your site's SEO. The first option in this section is for websites that require cookies to be accepted in order to work properly; if yours is like that, you should check it. There are also other functions such as the following (a small sketch after the list shows equivalent settings in a plain HTTP client):

  • Pause on High Memory Usage: Checked by default, this helps with very large websites: Screaming Frog can pause the crawl when the application reaches its memory limit, so you just save the project and then continue.
  • Always Follow Redirects: Lets you detect 301 redirects and keep following them when they form a chain of further redirects. After detecting them, you can segment the results using “Respect noindex”, “Respect Next/Prev”, and “Respect Canonical”; if you uncheck any of these, the corresponding URLs will not appear in the report.
  • Extract Images from Img srcset Attribute: A fairly simple function: when selected, it extracts all the images declared in a srcset attribute.
  • Response Timeout: If you have a website with a high load time, you can tell Screaming Frog how long to wait for each URL. By default this is set to 20 seconds; when the time runs out, the report marks the link with a “Connection Timeout” status.
  • 5XX Response Retries: Sets the number of times the application will retry any URL that returns such a code.
  • Max Redirects to Follow: Sets the maximum number of redirects the tool will follow.
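
Here is a minimal sketch, not Screaming Frog's code, of the same three ideas expressed with Python's requests library: a response timeout, retries on 5XX codes, and a cap on how many redirects are followed. The URL is a made-up example:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
session.max_redirects = 5  # "Max Redirects to Follow"

retry_policy = Retry(
    total=5,                                # "5XX Response Retries"
    status_forcelist=[500, 502, 503, 504],  # retry only on server errors
    backoff_factor=1,
)
session.mount("https://", HTTPAdapter(max_retries=retry_policy))
session.mount("http://", HTTPAdapter(max_retries=retry_policy))

url = "https://example.com/slow-page"  # hypothetical URL
try:
    response = session.get(url, timeout=20)  # "Response Timeout" of 20 seconds
    print(url, response.status_code)
except requests.Timeout:
    print(url, "Connection Timeout")
except requests.TooManyRedirects:
    print(url, "Too many redirects")
```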

Preferences

Last but not least is the Preferences tab, where you can modify the default values according to your needs. For example, the default length for the title tag is between 30 and 65 characters; if a page on your website does not comply with that, it is flagged as an issue.

For that reason, you can adjust it to your own standards, along with other details such as the following (a small sketch after the list shows how such checks might look):

  • Image size (kilobytes).
  • Title and description pixels.
  • Maximum characters for URLs, H1, H2 or ALT attribute.
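
Purely as an illustration of the kind of check these preferences drive, here is a minimal Python sketch that flags pages whose title tag falls outside a configurable character range; the page data is a made-up example:

```python
MIN_TITLE_CHARS = 30   # default lower bound mentioned above
MAX_TITLE_CHARS = 65   # default upper bound mentioned above

# Hypothetical crawl results: URL -> title tag text.
pages = {
    "https://example.com/": "Home",
    "https://example.com/services": "Our services: SEO audits, crawling and reporting",
}

for url, title in pages.items():
    length = len(title)
    if length < MIN_TITLE_CHARS:
        print(f"{url}: title too short ({length} characters)")
    elif length > MAX_TITLE_CHARS:
        print(f"{url}: title too long ({length} characters)")
    else:
        print(f"{url}: title OK ({length} characters)")
```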

1.2 Robots.txt

Having finished the first part of the configuration, you can move on to the Robots.txt section, which is usually much shorter. Simply open its configuration option and make adjustments such as the ones below (a small robots.txt check is sketched after the list):

  • Ignore a website's robots.txt file.
  • Show the URLs blocked by these files in the report. It…
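
As a minimal sketch of the same idea, this example uses Python's standard urllib.robotparser to check whether URLs are blocked by robots.txt; the domain and paths are made-up examples, not Screaming Frog's own mechanism:

```python
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()

urls = [
    "https://example.com/blog/post",
    "https://example.com/wp-admin/settings",
]

for url in urls:
    if robots.can_fetch("*", url):
        print("allowed:", url)
    else:
        print("blocked by robots.txt:", url)
```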