Medium · SEO
Missing robots.txt or sitemap.xml
Checks for essential SEO files that help search engines crawl your site.
Why This Is Bad
These files are the map and the invitation for search engine bots: robots.txt tells crawlers which paths they may visit, and sitemap.xml lists the URLs you want indexed. Without them, Google may crawl your site inefficiently or miss pages entirely.
How To Fix
Create these files in your project:
**public/robots.txt:**

```txt
User-agent: *
Allow: /
Disallow: /api/
Disallow: /admin/
Sitemap: https://yoursite.com/sitemap.xml
```
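If you're on Next.js 13+ with the App Router, you can also generate robots.txt from code instead of keeping a static file. A minimal sketch, assuming your site lives at https://yoursite.com and uses the same rules as above:

```typescript
import { MetadataRoute } from 'next';

// app/robots.ts — Next.js serves the returned object as /robots.txt
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: ['/api/', '/admin/'],
    },
    sitemap: 'https://yoursite.com/sitemap.xml',
  };
}
```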
**For Next.js 13+, create app/sitemap.ts:**

```typescript
import { MetadataRoute } from 'next';

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: 'https://yoursite.com',
      lastModified: new Date(),
      changeFrequency: 'yearly',
      priority: 1,
    },
    {
      url: 'https://yoursite.com/about',
      lastModified: new Date(),
      changeFrequency: 'monthly',
      priority: 0.8,
    },
  ];
}
```
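If your site has content-driven pages (blog posts, docs, products), the sitemap function can also be async and build entries from your data. A rough sketch, where fetchPosts and the /blog/ URL structure are placeholders for however your project actually loads and routes its content:

```typescript
import { MetadataRoute } from 'next';

// Hypothetical helper — replace with your CMS, database, or filesystem lookup.
async function fetchPosts(): Promise<{ slug: string; updatedAt: Date }[]> {
  return [];
}

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await fetchPosts();

  return [
    { url: 'https://yoursite.com', lastModified: new Date(), priority: 1 },
    // One entry per post, keyed by its slug and last update time.
    ...posts.map((post) => ({
      url: `https://yoursite.com/blog/${post.slug}`,
      lastModified: post.updatedAt,
      priority: 0.6,
    })),
  ];
}
```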
When You Pass This Check
Your robots.txt and sitemap are properly configured!
Check If Your Repo Has This Issue
Our free scanner will detect this and 17 other common issues in your codebase.