Where is Robots.txt in WordPress
Where is robots.txt in WordPress? This is the question we are going to answer in this tutorial.
We will cover where to find it and look at the contents of the file, with an explanation of each line to help you understand what it does.
With that knowledge, you can edit the file with confidence. We will also test the changes using tools from Google to make sure there are no mistakes.
Let's dive in.
What is Robots.txt?
Bots or web crawlers use the robots.txt file when they visit your site. The most important bot is Googlebot.
Googlebot visits your site and reads the contents of your pages. This adds them to the Google index, where they can then appear in the Google search results.
This is why your robots.txt file is so important for your SEO.
If you make a mistake when you edit your robots.txt file, it can block Googlebot from crawling your site.
Don't worry, we will cover later how to test your robots.txt file so this does not happen.
Next, where is robots.txt in WordPress?
Where is Robots.txt?
Robots.txt is a text file found at the root of the website. For example, here are the robots.txt files for a few well-known companies:
- https://airbnb.com/robots.txt
- https://amazon.com/robots.txt
- https://philips.com/robots.txt
- https://pinterest.com/robots.txt
Your robots.txt file is in the same place, something like this:
https://example.com/robots.txt
Add /robots.txt after your domain name.
The default robots.txt file in WordPress has only three rules. It looks like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Each line in the file is a rule (a directive) that a bot will read and follow when it visits the site. Let's look at what each line is doing:
User-agent: *
The User-agent directive tells a bot whether the rules underneath apply to it. The colon separates the directive from its value. The value, in this case, is *. This is a wildcard and it will match all user agents.
What are these user agents? The user agent is the way a bot identifies itself when it visits your site. Here is a list of common user agents:
- Googlebot - Used for Google Search
- Bingbot - Used for Bing Search
- Slurp - Yahoo's web crawler
- DuckDuckBot - Used by the DuckDuckGo search engine
- Baiduspider - Used by Baidu, a Chinese search engine
- YandexBot - Used by Yandex, a Russian search engine
- facebot - Used by Facebook
- Pinterestbot - Used by Pinterest
- TwitterBot - Used by Twitter
You can use these user agents to create rules for specific bots. For example, you can block facebot from visiting certain parts of your website like this:
User-agent: facebot
Disallow: /users
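You can combine a bot-specific group like this with the catch-all group in the same file. A blank line separates the groups, and a bot follows the group that best matches its user agent. For example (the /users path is just an illustration):

User-agent: facebot
Disallow: /users

User-agent: *
Disallow: /wp-admin/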
WordPress by default has two rules all bots should follow.
Disallow: /wp-admin/
This directive is “Disallow”. It tells a bot that it is not allowed to visit a certain area of the website. The value is /wp-admin/, which is a folder on the website. This means that no bots are allowed to visit the admin area of the WordPress site.
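The path value controls how much of the site the rule blocks. Here are a couple of common patterns as a sketch (lines starting with # are comments in robots.txt):

# Block the entire site for all bots
User-agent: *
Disallow: /

# An empty Disallow blocks nothing, so everything may be crawled
User-agent: *
Disallow: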
Allow: /wp-admin/admin-ajax.php
The “Allow” directive gives a bot access to an area of the site. The last rule removed access to the admin area; this rule then restores access to a single file within that admin area.
You may be wondering why Google needs to read this file when it is in the admin area.
Some plugins and themes use this file to load web page assets such as CSS and JavaScript. If you disallowed it, any plugin that relies on it would not work when Googlebot visits. This could stop the page from appearing in Google search results.
Do not remove this rule unless you know what you are doing.
If you would like to learn more about the rules available then check out more examples of robots.txt files.
Now that we know where the robots.txt file is in WordPress and what the default one contains, let's look at how to edit it.
We are going to look at two popular SEO plugins, Yoast and All in One SEO. Both tools can edit the robots.txt file.
If you do not have one of these plugins then check out our guide on installing WordPress plugins.
How to edit a Robots.txt with Yoast
With Yoast installed, you can edit your robots.txt file using the Tools section. First, select SEO from the menu and then choose Tools:
Then on the tools page select “File editor” from the list of options:
Scroll to the robots.txt section and then click the button “Create robots.txt file”:
This will allow you to edit the file. Once you have made changes you can click save to update the file.
We are not finished yet; jump to the section on testing your robots.txt file.
How to edit a Robots.txt with All in One SEO
To change the robots.txt file with All in One SEO, you need to activate the feature first. To do this, select “All in One SEO” from the side menu:
Then select the “Feature Manager” link:
On the Feature Manager page, scroll down to the robots.txt section and click “Activate”:
After activating the feature you will get a new robots.txt option on the menu:
Once you select it you will see a screen where you can add new rules:
This form will allow you to add new rules to the robots.txt file. You will need to enter the user agent to target and then the rule and path.
With this plugin, you will not be able to edit the original three rules; you may only add and edit new ones.
Once you have made an edit, you will want to test it. Let's look at that next.
Testing a Robots.txt with Google
If you haven't already, make sure that you submit your sitemap to Google. This will give you access to the Google Search Console tools. One of these tools is a robots.txt file checker.
The tool will load your robots.txt file from your website. It will highlight any errors or warnings it finds.
If you are having trouble with an error then use the PageDart robots.txt checker. Copy your robots.txt file contents into the tool and click the “Test robots.txt” button.
In this example, the error is “Syntax not understood”. When you click the link, you will see the solution:
You will see this error when there is no colon on the line.
This makes it easier to understand and fix.
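If you prefer to check a rule locally, Python's standard library includes a robots.txt parser. Here is a minimal sketch using the default WordPress rules. One caveat: Python applies rules in file order, so overlapping Allow/Disallow pairs (such as the admin-ajax.php exception) may resolve differently than in Google's longest-match implementation.

```python
from urllib import robotparser

# The default WordPress robots.txt rules as a list of lines.
# set_url() plus read() could fetch a live file instead.
rules = """User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The admin area is blocked for every bot...
print(rp.can_fetch("Googlebot", "/wp-admin/options.php"))  # False
# ...but normal pages remain crawlable.
print(rp.can_fetch("Googlebot", "/blog/hello-world/"))     # True
```

This is handy for checking that an edit does not accidentally block an important page before you upload the file.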
You may get an alert from Google Search Console letting you know it found an error in your robots.txt file. There are separate articles covering the two most common errors that should help when you encounter them.
Wrapping Up, Where is Robots.txt in WordPress
We have learned where robots.txt is in WordPress. You know that this file is at the root of a website, such as:
https://example.com/robots.txt
We have covered the directives listed in the default robots.txt file. We then covered how you can use two plugins, Yoast and All in One SEO, to edit the robots.txt file.
Lastly, we tested the changes to make sure that Google is able to read the file and crawl your web pages.