# ✅ Allow Googlebot - Google's crawler
User-agent: Googlebot
Allow: /ads.txt
Disallow: /includes/
Disallow: /assets/scripts/
Disallow: /*?*

# ✅ Allow Bingbot - Microsoft's crawler
User-agent: Bingbot
Allow: /ads.txt
Disallow: /includes/
Disallow: /assets/scripts/
Disallow: /*?*

# ❌ Block known bad bots and scrapers
# Note: robots.txt is advisory only; generic HTTP clients (curl, Wget, etc.)
# typically ignore it, so these rules are best-effort.
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: Python-requests
Disallow: /

User-agent: Go-http-client
Disallow: /

User-agent: okhttp
Disallow: /

User-agent: curl
Disallow: /

User-agent: Wget
Disallow: /

User-agent: libwww-perl
Disallow: /

# ✅ Default rules for all other bots (ads.txt stays explicitly crawlable)
User-agent: *
Allow: /ads.txt
Disallow: /includes/
Disallow: /assets/scripts/
Disallow: /*?*

# ✅ Sitemap location for all search engines
Sitemap: https://solutionviews.com/sitemap.xml