In this article, I will show you how to create a best-practice WordPress robots.txt file for SEO. The robots.txt file tells search engines how to crawl your website, which makes it an extremely powerful SEO tool.
- What is the Purpose of the Robots.txt File?
- Do You Need a Robots.txt File for Your WordPress Website?
- What Does the Ideal Robots.txt File Look Like?
- How to Create a WordPress Robots.txt File?
- How to Test Your Robots.txt File?
What is the Purpose of the Robots.txt File?
Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their sites.
It is usually located in your website’s root directory, also known as the main folder. The basic format of a robots.txt file looks like this:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
User-agent: [user-agent name]
Allow: [URL string to be crawled]
Sitemap: [URL of your XML Sitemap]
You can add several sitemaps and multiple lines of instructions to allow or disallow specific URLs. If you do not disallow a URL, search engine bots assume that they are allowed to crawl it.
The following is an example of a robots.txt file:
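A sketch of such a file, with example.com standing in for your own domain:

```
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap.xml
```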
In the robots.txt example above, we’ve allowed search engines to crawl and index files in our WordPress uploads folder.
Then, we’ve disallowed search bots from crawling and indexing the plugins folder and the WordPress admin folder.
Finally, we’ve provided the URL of our XML sitemap.
Do You Need a Robots.txt File for Your WordPress Website?
If you don’t have a robots.txt file, search engines will still crawl and index your website. However, you won’t be able to tell them which pages or folders they shouldn’t crawl.
This won’t have much of an impact when you start a blog and don’t have a lot of content.
However, as your website grows and you have a lot of content, you might want to have more control over how your website is crawled and indexed.
Search bots have a crawl quota for each website.
That means they crawl a certain number of pages during a crawl session. If they don’t finish crawling all of the pages on your website, they’ll come back and resume crawling in the next session.
This can slow down the indexing of your website.
You can resolve this by disallowing search bots from crawling unnecessary pages like your WordPress admin pages, plugin files, and theme folders.
By disallowing unnecessary pages, you save your crawl quota. This allows search engines to crawl even more pages on your website and index them as quickly as possible.
Another good reason to use the WordPress robots.txt file is to stop search engines from indexing a post or page on your website.
It is not the safest way to hide content from the public, but it will help you keep that content out of search results.
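For example, to ask all bots to stay away from a particular page, you could add a rule like this (/private-page/ is a hypothetical path):

```
User-agent: *
Disallow: /private-page/
```

Keep in mind that a disallowed page can still show up in search results if other sites link to it, which is why this isn’t a reliable way to hide content.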
What Does the Ideal Robots.txt File Look Like?
A lot of popular blogs use a very simple robots.txt file. Its content may vary depending on the specific site’s needs:
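A minimal file along those lines might look like this (example.com is a placeholder domain):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked, so all bots may crawl everything.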
This robots.txt file allows all bots to index all content and provides them with a link to the website’s XML sitemap.
For WordPress sites, the following rules are recommended in the WordPress robots.txt file:
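One common set of rules along those lines, assuming affiliate links live under a /refer/ directory and example.com stands in for your domain:

```
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: https://example.com/sitemap.xml
```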
This tells search bots to index all WordPress images and files. It disallows search bots from indexing WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.
By adding your sitemap to the robots.txt file, you make it easy for Google bots to find all the pages on your website.
Now that you know what the ideal WordPress robots.txt file looks like, let’s take a look at how you can create a robots.txt file in WordPress.
How to Create a WordPress Robots.txt File?
There are two different ways to create a WordPress robots.txt file. You can choose the proper method for you.
Method 1: Editing Robots.txt with All in One SEO
All in One SEO (AIOSEO) is one of the best WordPress SEO plugins on the market, active on more than 2 million websites. It includes a robots.txt file generator that is simple to use.
Bonus Tip: this feature is also available in the free version of AIOSEO.
Once the plugin is installed and activated, you can use it to create and edit your robots.txt file directly from your WordPress admin area.
Simply go to All in One SEO » Tools to edit your robots.txt file.
First, you’ll need to turn on editing by clicking the ‘Enable Custom Robots.txt’ toggle so that it turns blue.
With this toggle on, you can create a custom robots.txt file in WordPress.
All in One SEO will show your existing robots.txt file in the Robots.txt Preview section at the bottom of your screen.
This version will show the default rules that WordPress has added.
These default rules tell search engines not to crawl your core WordPress files, allow them to index all content, and point them to your site’s XML sitemaps.
Now, to improve your robots.txt for SEO, you can add your own custom rules.
To add a rule, enter a user agent in the ‘User Agent’ field. Using a * will apply the rule to all user agents.
Next, select whether you want to ‘Allow’ or ‘Disallow’ search engines to crawl that path.
Then, enter the file name or directory path in the ‘Directory Path’ field.
The rule will be applied to your robots.txt file automatically. To add another rule, click the ‘Add Rule’ button.
We suggest adding rules until you create the ideal robots.txt format we shared above.
Once added, your custom rules will look something like this.
When you’re done, remember to press the ‘Save Changes’ button to save your changes.
Method 2: Manually Edit the Robots.txt file Using FTP
To edit the robots.txt file with this method, you’ll need an FTP client.
Using the FTP client, connect to your WordPress hosting account.
Once connected, look for the robots.txt file in your website’s root folder.
If you don’t see one, your site most likely doesn’t have a robots.txt file yet. In that case, you can simply go ahead and create one.
Since robots.txt is a plain text file, you can download it to your computer and edit it with any plain text editor, such as Notepad or TextEdit.
After saving your changes, upload the file back to your website’s root folder.
How to Test Your Robots.txt File?
After you’ve created your robots.txt file, it’s always a good idea to test it using the robots.txt tester tool.
There are a lot of robots.txt tester tools out there, but we suggest using the one inside Google Search Console.
To do that, your website needs to be linked to Google Search Console. If you haven’t done this yet, connect your website with Google Search Console first.
Then you can use the robots.txt testing tool in Google Search Console.
Simply select your property from the drop-down list.
The tool will automatically fetch your website’s robots.txt file and highlight any errors and warnings it finds.
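If you’d rather sanity-check your rules locally before uploading, Python’s standard library ships a robots.txt parser. Here’s a minimal sketch — the sample rules and example.com are placeholders, and note that Python’s parser applies rules in file order rather than Google’s longest-match rule, which is why the Allow line comes first:

```python
from urllib.robotparser import RobotFileParser

# Sample rules mirroring the WordPress recommendations discussed above.
rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check whether a crawler may fetch a given URL under these rules.
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://example.com/blog/hello-world/"))        # True
```

This only checks your rule logic; it doesn’t replace Google Search Console, which tests the live file Google actually sees.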
The main purpose of optimizing your WordPress robots.txt file is to prevent search engines from crawling pages that are not publicly useful, for example, pages in your WordPress plugins folder or in your WordPress admin folder.
Some SEO experts claim that blocking WordPress category, tag, and archive pages will improve the crawl rate and result in faster indexing and higher rankings.
This is not true, and it’s also against Google’s webmaster guidelines.
If you have any questions about this article, please comment below, and we’ll get back to you. Subscribe to our website for daily updates, informational articles, and much more.