How to Enable a Robots.txt File on Your Website


Robots.txt is the first thing a crawler visits on your website. It tells search engine bots (spiders or crawlers) which URLs should be crawled and which should not.

Robots.txt Format

Use the format below to create a robots.txt file in Notepad:

User-agent: *
Disallow: /dashboard/retrieve-password/
Disallow: /wp-admin

The “*” sign means the rules apply to all search engines and their crawlers.

We have allowed our sitemap because it contains certain URLs that we want to rank on priority. A sitemap is a list of all the URLs on your website that you want to rank.
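As a sketch, a Sitemap directive can be added to the same file so crawlers can find the sitemap directly; the URL below is only a placeholder for your own sitemap address:

```
User-agent: *
Disallow: /dashboard/retrieve-password/
Disallow: /wp-admin
Sitemap: https://example.com/sitemap.xml
```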

(To create a sitemap, follow these steps.)

We have disallowed a few links that we do not want Google to show.

Step 1 – Edit the text above in Notepad

Edit the robots.txt text above in Notepad and save the file with a .txt extension.

You can add several other URLs as well if you want to disallow them. Google’s bot will then never crawl or index them, and no one will ever find them on the SERP.
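To sanity-check your rules before uploading, Python’s standard urllib.robotparser module can simulate how a crawler reads the file. This is a minimal sketch; the /blog/ path is just a made-up URL that no rule blocks:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the sample robots.txt above.
rules = """\
User-agent: *
Disallow: /dashboard/retrieve-password/
Disallow: /wp-admin
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed path: well-behaved crawlers will skip it.
print(rp.can_fetch("*", "https://example.com/wp-admin"))  # False
# A hypothetical path that no rule blocks is allowed.
print(rp.can_fetch("*", "https://example.com/blog/"))     # True
```

If a URL you meant to block comes back as allowed, double-check the Disallow path for typos before uploading the file.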

Save as – robots.txt 

Step 2 – Go to your hosting service provider

Step 3 – Open public_html

Step 4 – Upload Your Robots.txt file

Step 5 – Check if it is correct

Go to your browser and open your domain name followed by /robots.txt (for example, yourdomain.com/robots.txt).
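The check in Step 5 can also be scripted. Here is a small sketch, assuming Python 3; the domain is a placeholder, and the live network fetch is left commented out:

```python
from urllib.parse import urlunparse
from urllib.request import urlopen

def robots_url(domain: str) -> str:
    """Build the standard robots.txt URL for a domain."""
    return urlunparse(("https", domain, "/robots.txt", "", "", ""))

url = robots_url("example.com")  # replace with your own domain
print(url)  # https://example.com/robots.txt
# print(urlopen(url).read().decode("utf-8"))  # uncomment to fetch it live
```

If the fetch returns the text you uploaded, the file is in the right place; a 404 means it did not land in the site root.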

I hope it worked. If yes, you can let me know in the comment section, and if not, you should definitely let me know in the comment section. Peace out!
