What can be specified in the robots.txt file? How do you write a Robots.txt? To write a Robots.txt correctly, it is important to understand its syntax. Let's take Semrush's Robots.txt as an example. Blank lines separate the groups of rules, so each user-agent receives its own set of instructions. Let's see what each element of the Robots.
txt means: User-agent: Specifies the robot to which the instructions are directed (e.g. Googlebot). In the Semrush example, we see LinkedInBot, Twitterbot and finally * (a wildcard for rules that apply to all robots). Then we have three key directives: Disallow: Indicates the URLs or directories that the robot should not access.
Allow: Indicates the URLs or directories that the robot can access, even inside an otherwise disallowed directory. Crawl-delay: Although it is not shown in Semrush's Robots.txt, it is a directive that tells the bot to pause between successive requests while crawling the site, giving the server a break. This directive is ideal to avoid overloading the server and causing problems that slow down the website.
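To make these directives concrete, here is a small sketch using Python's standard-library robots.txt parser. The robots.txt content, the domain example.com and the paths in it are hypothetical, chosen only to illustrate how User-agent, Disallow, Allow and Crawl-delay interact:

```python
from urllib import robotparser

# Hypothetical robots.txt illustrating the directives described above.
# The Allow line is listed before the broader Disallow so the exception
# to the blocked directory is honored.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The Disallow rule blocks the /private/ directory...
print(rp.can_fetch("Googlebot", "https://example.com/private/secret.html"))
# ...but the Allow rule carves out an exception for one page inside it.
print(rp.can_fetch("Googlebot", "https://example.com/private/public-page.html"))
# Crawl-delay asks the bot to wait 10 seconds between requests.
print(rp.crawl_delay("Googlebot"))
```

Because the group is declared under `User-agent: *`, the same rules apply to any bot name you pass in; a group declared under a specific user-agent (e.g. Googlebot) would apply only to that robot.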