hoangvu (New member)
Joined: Jun 6, 2012 | Messages: 1,835 | Points: 0
Creating a robots.txt file is a way to speak directly to search engines when they visit your website. These spiders are simply programs written to carry out the commands they are given.
Communicating well with these spiders is therefore very useful: you may not want a spider to visit a certain page or folder on your website, or you may want to control how often spiders crawl your site.
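One way to influence crawl frequency is the non-standard Crawl-delay directive, which asks a spider to wait a number of seconds between requests. Support varies: some engines (such as Bing and Yandex) honor it, while Google ignores it, so treat it as a hint rather than a guarantee. A minimal sketch:

```
User-Agent: *
Crawl-delay: 10
```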
A simple robots.txt file looks like the one below. The first line, User-Agent: *, specifies which search engines must follow the rules that come after it; the asterisk (*) means the rules apply to all search engines. For example:
Disallow: /images/ means that you do not allow any search engine to visit your image folder, since the images there mean nothing for SEO.
Disallow: /directory/file.html means that you do not allow any search engine to read file.html.
Code:
User-Agent: *
Disallow: /images/
Disallow: /directory/file.html
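If you want to check how a crawler will interpret rules like these, Python's standard library ships urllib.robotparser, which performs the same matching a well-behaved spider does. A small sketch using the example rules from this post (example.com is just a placeholder domain):

```python
from urllib import robotparser

# The same rules as the example robots.txt above
rules = """
User-Agent: *
Disallow: /images/
Disallow: /directory/file.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /images/ is blocked for every crawler; the homepage is not
print(rp.can_fetch("*", "https://example.com/images/logo.png"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))       # True
```

This is a quick way to test a robots.txt draft before uploading it, instead of waiting to see what the search engines actually crawl.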
After creating the robots.txt file, upload it to the root directory of your website.