# AutoDataHub.net Robots.txt - SEO Optimized for Maximum Crawling
# Updated: 2025-07-12

# Allow all major search engine bots
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Slurp
Allow: /

User-agent: DuckDuckBot
Allow: /

User-agent: Baiduspider
Allow: /

User-agent: YandexBot
Allow: /

User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: WhatsApp
Allow: /

User-agent: Applebot
Allow: /

# Allow all other bots
User-agent: *
Allow: /

# High-priority sections, listed explicitly for emphasis
# (redundant with "Allow: /" above; these rules belong to the "User-agent: *" group)
Allow: /gas-price-pages/
Allow: /vehicle-parts-pages/
Allow: /safety-recalls/
Allow: /vehicle-specifications/
Allow: /category-pages/

# Time-sensitive content - note that robots.txt cannot request more frequent
# crawling; freshness is signaled through the gas_prices and recalls sitemaps below
Allow: /gas-price-pages/*
Allow: /safety-recalls/*

# No Crawl-delay directive - crawlers may fetch at their default rate
# ("Crawl-delay: 0" is non-standard and ignored by Googlebot)
# Crawl-delay: 0

# Optional: Block resource-heavy files if needed
# Disallow: /assets/large-files/

# Sitemaps - all section sitemaps for maximum discoverability
Sitemap: https://autodatahub.net/sitemap.xml
Sitemap: https://autodatahub.net/sitemap_main.xml
Sitemap: https://autodatahub.net/sitemap_gas_prices.xml
Sitemap: https://autodatahub.net/sitemap_vehicle_parts.xml
Sitemap: https://autodatahub.net/sitemap_recalls.xml
Sitemap: https://autodatahub.net/sitemap_specs.xml
Sitemap: https://autodatahub.net/sitemap_categories.xml
Sitemap: https://autodatahub.net/sitemap_maintenance.xml
Sitemap: https://autodatahub.net/sitemap_vin.xml
Sitemap: https://autodatahub.net/sitemap_fuel_economy.xml
Sitemap: https://autodatahub.net/sitemap_vehicle_values.xml

# Request-rate is a non-standard extension (one page per second); most major
# crawlers ignore it
Request-rate: 1/1s

# Host is a Yandex-specific directive (now deprecated); other crawlers ignore it
Host: autodatahub.net
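
# Optional sketch (commented out, not active): if a specific crawler ever needs
# throttling or blocking, a dedicated group like the one below can be added.
# "ExampleBot" is a placeholder name, not a real crawler; Crawl-delay is
# non-standard - Bingbot honors it, Googlebot does not.
# User-agent: ExampleBot
# Crawl-delay: 10
# Disallow: /assets/large-files/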