What can be specified in the robots.txt file, and how do you write one? To write a Robots.txt correctly, it is important to understand the Robots.txt syntax. Let's take Semrush's Robots.txt as an example. Source:

We can see that blank lines between one group of rules and the next User-agent line separate the specific directives for each of those user-agents. Let's look at what each element of the Robots.txt means:

User-agent: Specifies the robot to which the instructions are directed (e.g. Googlebot). In the Semrush example we see LinkedInBot, Twitterbot and finally * (a wildcard group whose rules apply to all remaining robots). Then come three key directives:

Disallow: Indicates the URLs or directories that the robot should not access.

Allow: Indicates the URLs or directories that the robot can access, even within a path blocked by Disallow.

Crawl-delay: Although it does not appear in Semrush's Robots.txt, this directive gives the server a break by telling the bot to pause briefly between requests while crawling the site. It is ideal for avoiding server overload and the slowdowns that come with it.
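To make the group structure concrete, here is a minimal sketch of a robots.txt (a hypothetical file written for illustration, not Semrush's actual rules): each group starts with a User-agent line, and the blank line that follows closes the group before the next one begins.

```text
# Rules aimed at one specific crawler
User-agent: LinkedInBot
Disallow: /analytics/

# The blank line above ends the LinkedInBot group;
# this group applies to every other robot
User-agent: *
Disallow: /admin/
```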
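You can also check how these directives interact programmatically. Below is a minimal sketch using Python's standard urllib.robotparser against a hypothetical robots.txt (the file contents and the example.com URLs are assumptions for illustration, not Semrush's real rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for illustration. The Allow line is placed first
# because urllib.robotparser applies the first rule that matches, in file order.
robots_txt = """\
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Disallow blocks everything under /private/ ...
print(parser.can_fetch("*", "https://example.com/private/secret.html"))
# ... except the exact path the Allow rule carves out.
print(parser.can_fetch("*", "https://example.com/private/public-report.html"))
# Crawl-delay is exposed as the number of seconds to pause between requests.
print(parser.crawl_delay("*"))
```

Note that precedence varies between crawlers: Google's matcher prefers the most specific (longest) matching rule rather than the first one, and Googlebot ignores Crawl-delay entirely, so treat this as a syntax check rather than a prediction of how every bot will behave.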