How to Find Your robots.txt File and Ensure It's Error-Free

This brief tutorial serves as a quick sanity check to help you find your robots.txt file and make sure it isn’t stopping your site from getting crawled by search engines.

Video Transcript:

In this video I just want to cover really briefly what the robots.txt file is, where you can find it, and how to make sure it looks basically okay. I'm going to assume that you have a modest WordPress site, maybe under 500 pages, and that you don't need to block or noindex a whole bunch of pages on your site. WordPress will generate a robots.txt file for you, and as long as you've left it as is, it should be fine.
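For reference, the virtual file WordPress generates typically looks like the snippet below. (Exact contents can vary by WordPress version and plugins, so treat this as a representative example, not a guarantee of what your site serves.)

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

In plain terms: all crawlers are allowed everywhere except the admin area, with one admin endpoint carved back out because front-end features rely on it.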

You can definitely delve deeper into this topic if you want; there's a lot of information about what robots.txt is intended for versus what actually happens. The file is purely advisory, so a bot can simply ignore it. If you genuinely need pages kept out of crawling or indexing, there are other ways to do that, but I'm not going to cover them in this video. I just want to give you a quick sanity check: make sure the file exists, that it doesn't look weird, and that it isn't stopping your site from being crawled as it should be.

So if you go to your URL and add /robots.txt at the end, it'll show you the WordPress-generated robots.txt file. This happens automatically. It's virtual, meaning you can access it even though no physical file sits on the server, and there are other ways to manage it, for example through Yoast if you pay for the premium version. But again, as long as it basically looks like something like this and doesn't have anything crazy in it, you should be fine. I do see people edit it with plugins, which is probably a good idea, but I just chose not to mess with it. If it isn't broke, don't fix it. That's basically it. I hope that was a quick sanity check for you. If you have any other questions, leave them in the comments, and thanks.
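If you'd like to sanity-check the rules programmatically rather than by eye, Python's standard library ships a robots.txt parser. This is a minimal sketch: `your-site.example` is a placeholder domain, and the rules pasted in mirror the common WordPress default, which is an assumption about what your site serves. In practice you'd fetch `https://your-site.example/robots.txt` and use its real contents.

```python
from urllib.robotparser import RobotFileParser

# Assumed robots.txt body -- replace with what your own site actually
# serves at https://your-site.example/robots.txt
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The sanity check: an ordinary page should be crawlable...
print(parser.can_fetch("*", "https://your-site.example/some-page/"))  # True

# ...while the admin area stays disallowed.
print(parser.can_fetch("*", "https://your-site.example/wp-admin/"))  # False
```

If the first check prints `False` for a normal page, something in your robots.txt (for example a blanket `Disallow: /`) is blocking crawlers and is worth investigating. Note that Python's parser applies rules in file order, so it can differ from Google's longest-match behavior on overlapping Allow/Disallow rules.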
