What is robots.txt?
robots.txt is a plain-text file served from your site root (https://example.com/robots.txt) that tells search engine crawlers which paths they may or may not request. It uses User-agent lines to target specific bots, Allow and Disallow directives to permit or restrict paths, and an optional Sitemap directive pointing to your XML sitemap. Note that robots.txt does not block access; it is purely advisory, and only well-behaved crawlers honor it. Use server-side protection, such as authentication or access control, for sensitive content.
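A minimal sketch of the directives described above; the paths and sitemap URL here are placeholders, not specifics from any real site:

```text
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Rules for one specific bot
User-agent: Googlebot
Disallow: /tmp/

# Optional pointer to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Matching is prefix-based, so `Disallow: /admin/` covers every URL whose path starts with `/admin/`, while the more specific `Allow` carves out an exception within it. Remember that a disallowed path is still fully reachable by anyone who requests it directly.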