Per-check failure index
Missing or Misconfigured robots.txt
A misconfigured robots.txt can block search engines from your entire site without you noticing.
Audited: 80
Failing: 5
Passing: 75
What this check looks for
robots.txt lives at /robots.txt and tells crawlers which paths they can fetch. The common own-goal: a `Disallow: /` left over from staging that blocks the whole site. The other common miss: no Sitemap directive, so Google has to discover the sitemap on its own. At a minimum: allow what you want indexed, disallow admin/api paths, and add `Sitemap: https://yourdomain.com/sitemap.xml` at the bottom.
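The two failures described above can be caught with a simple lint pass. This is a minimal sketch, not the scoreboard's actual engine; `audit_robots` and its rule names are illustrative, and the example domain is a placeholder.

```python
# Minimal sketch: flag the two common robots.txt failures described above.
# Assumes the robots.txt body has already been fetched as a string.

def audit_robots(robots_txt: str) -> list[str]:
    """Return a list of problems found in a robots.txt body."""
    problems = []
    # Strip comments and surrounding whitespace, line by line.
    lines = [line.split("#", 1)[0].strip() for line in robots_txt.splitlines()]
    # A bare `Disallow: /` blocks every path for the agent group it applies to.
    if any(line.lower().replace(" ", "") == "disallow:/" for line in lines):
        problems.append("site-wide Disallow: / found")
    # Without a Sitemap directive, crawlers must discover the sitemap themselves.
    if not any(line.lower().startswith("sitemap:") for line in lines):
        problems.append("no Sitemap directive")
    return problems

bad = "User-agent: *\nDisallow: /"
good = (
    "User-agent: *\n"
    "Disallow: /admin/\n"
    "Sitemap: https://example.com/sitemap.xml"
)
print(audit_robots(bad))   # → ['site-wide Disallow: / found', 'no Sitemap directive']
print(audit_robots(good))  # → []
```

Note that `Disallow: /admin/` passes the check: only the exact site-wide `Disallow: /` is flagged, since path-scoped disallows for admin or API routes are exactly what the check recommends.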
Failing (5)
sorted by overall audit score (worst first)
| Brand | Overall score | Audit |
|---|---|---|
| Buffer (buffer.com) | 36 | View audit → |
| Zendesk (zendesk.com) | 36 | View audit → |
| Box (box.com) | 71 | View audit → |
| Netlify (netlify.com) | 75 | View audit → |
| Zoom (zoom.us) | 81 | View audit → |
Passing (75)
sorted by overall audit score (best first)
Other SEO checks in the gallery audit
- Missing Meta Description
- Missing or Weak Title Tag
- Slow Page Speed
- Not Mobile Friendly
- No HTTPS / Missing SSL
- Missing Open Graph Tags
- Missing Canonical URL
- Missing XML Sitemap
- Missing Structured Data
- No H1 or Multiple H1s
- Missing Image Alt Tags
See the full breakdown across every site on the SaaS SEO Scoreboard.
Audit your own site for this check
Free, no account, no credit card. Same 12-check engine that scored every site on this page.
Run a free audit