What is a robots.txt file in SEO

What is a robots.txt file in SEO & how to optimize a robots.txt file


What is a robots.txt file in SEO?

The robots.txt file gives search engine crawlers instructions about the crawling and indexing of a webpage, a file, a directory, or an entire domain.

How to check whether robots.txt is present on a website

To check, append /robots.txt to the root domain in your browser. For this site, the correct location is www.bloggertarget.com/robots.txt.

This file indicates how deep spiders can crawl into your website. We can also instruct crawlers not to crawl a particular webpage, domain, or web directory.

You can check the bloggertarget.com robots.txt file as a reference:

Blogger Target Robots.txt file Example

You can see here that I have used both Disallow and Allow in the robots.txt file.

User-agent = specifies which search bots the rules apply to; an asterisk (*) means all search bots.

Disallow = instructs search engines such as Google not to crawl the listed files or directories, so they are not indexed.

Allow = instructs search engines that they may crawl and index the listed files or directories.
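Put together, a minimal robots.txt using these directives might look like the sketch below; the directory names are only illustrations.

```
User-agent: *
Allow: /blog/
Disallow: /tmp/
```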

Many people are unsure whether they should add the Ajax rule to the robots.txt file:

Allow: /wp-admin/admin-ajax.php

Yes, you can add the Ajax rule to the robots.txt file. Many WordPress themes use JavaScript (Ajax) requests to add content to web pages, and this rule allows the Google search engine to crawl /wp-admin/admin-ajax.php even when the rest of /wp-admin/ is blocked.
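You can check locally how crawlers interpret such rules with Python's standard-library urllib.robotparser; the example.com URLs below are only placeholders.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block /wp-admin/ but keep the Ajax endpoint crawlable.
# The Allow line comes first because Python's parser applies the first
# matching rule (Google instead uses the most specific match).
rules = [
    "User-agent: *",
    "Allow: /wp-admin/admin-ajax.php",
    "Disallow: /wp-admin/",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://www.example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://www.example.com/wp-admin/options.php"))     # False
```

The same parser can also fetch a live file via set_url() and read() if you want to test the robots.txt already deployed on your domain.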

Why robots.txt is important in SEO

  • Website owners use the robots.txt file to give search engine robots instructions about which pages of their site to crawl and index.
  • It blocks low-quality pages and duplicate content from being crawled.
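The blocking use case in the second point can be written like this; the paths are hypothetical examples.

```
User-agent: *
Disallow: /duplicate-content/
Disallow: /low-quality-page.html
```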

How to create a robots.txt file for a WordPress site

If you are using a WordPress website, you can easily install the Yoast SEO WordPress plugin, and it will generate the robots.txt file for your WordPress site automatically.

WordPress plugin name: Yoast SEO

How to create a robots.txt file for static websites

If you are not creating your website on WordPress, or your website was created by a developer, then create a robots.txt file in Notepad and give it to the developer to upload to the root directory of your website; alternatively, you can upload the robots.txt file yourself using the FileZilla FTP client.

How to create a robots.txt file in Notepad

Open Notepad and write the short directives stating which files you want search engines to crawl and index and which files you do not, save the file with the name robots.txt, and then upload it to the root directory of your domain.

Step 1: Open Notepad and write the User-agent directive (with no space inside the word User-agent), followed by your Disallow and Allow rules.

Also add your website's XML sitemap URL.
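Putting the directives and the sitemap line together, the finished file might look like the sketch below; example.com stands in for your own domain and sitemap path.

```
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```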

How to write Robots txt in notepad

Step 2: To save the file, click File, select Save As, and give the file the name robots.txt.

Step 3: Give this file to the web developer to upload to the root folder of your website. If you manage the website yourself, you can upload it using FTP.

 

The robots.txt file is very important for the optimization of your website. If you have not yet generated a robots.txt file for your website, generate one now and add it. With a robots.txt file in place, search engine crawlers can easily find out which pages of your website to crawl and index in search engines.


Let us know which robots.txt WordPress plugin you are using for your website. If you have any questions regarding robots.txt, let us know via the comments.

 

Naveen Ekka is an online marketer and entrepreneur. He blogs about making money online through blogging and SEO.
