Docs. Read the rules.
Every check Scannica runs, grouped by category and keyed by stable rule codes. Click a code to see what the rule detects, why it matters, and how to fix it.
Every rule, every code.
74 checks across eight categories. Every finding carries the same code in the app, in exports, in tickets, and on this page. Click any code for detail.
- Total: 74
- Errors: 17
- Warnings: 35
- Notices: 22
- SEO: 14 rules
- Crawl: 7 rules
- Content: 7 rules
- Performance: 9 rules
- Security: 15 rules
- GDPR: 6 rules
- GEO: 9 rules
- Best Practices: 7 rules

The checks we nerd out on.
A sampler of what actually sets Scannica apart once a crawl lands.
Measure the page users actually see.
A headless browser collects real LCP, CLS, and INP per URL — then Scannica diffs the rendered DOM against the static HTML. Pages that only exist after hydration? Caught.
- → SPA and hydrated sites surface rendered-only links and content.
- → Render-blocking scripts flagged with byte counts, not vibes.
- → DOM size and depth checks catch builder-exported monsters.
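The rendered-vs-static diff described above boils down to a set difference over what each version of the page contains. A minimal stdlib sketch (not Scannica's actual engine; the two HTML strings are made-up inputs standing in for the static response and the post-hydration DOM):

```python
# Illustrative sketch: find links that exist only in the rendered DOM,
# i.e. content added client-side after hydration.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href seen on an <a> tag."""
    def __init__(self):
        super().__init__()
        self.hrefs = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.add(href)

def links(html: str) -> set:
    parser = LinkCollector()
    parser.feed(html)
    return parser.hrefs

# Stand-ins: the raw server response vs. the DOM after JS ran.
static_html = '<main><a href="/about">About</a></main>'
rendered_html = ('<main><a href="/about">About</a>'
                 '<a href="/plans/enterprise">Plans</a></main>')

rendered_only = links(rendered_html) - links(static_html)
print(sorted(rendered_only))  # ['/plans/enterprise']
```

A real crawler would do the same set difference per URL, with the rendered side coming from a headless browser.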
+ <article data-ssr="false">
+   <h1>Plans built for scale</h1>
+   <a href="/plans/enterprise">...</a>
  <section id="faq">
    <h2>Frequently asked</h2>
  </section>

Every hop, named.
Scannica walks every redirect to its terminus and surfaces loops before they surface in Search Console. Chains longer than two, mixed 301/302, HTTP→HTTPS hand-offs — all flagged with the URL that started the mess.
REDIRECT_CHAIN · REDIRECT_LOOP · HTTP_TO_HTTPS_REDIRECT
- 301 http://example.com/products → https
- 301 https://example.com/products → www
- 301 https://www.example.com/products → /store
- 302 https://www.example.com/store → locale prefix
- 200 https://www.example.com/en/store (final)
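The chain-walking logic is simple to sketch. This is a hedged illustration over a hard-coded redirect map, not Scannica's implementation, which follows live responses; the URLs mirror the example chain above:

```python
# Walk a redirect chain to its terminus, detecting loops and
# flagging chains longer than two hops.
REDIRECTS = {
    "http://example.com/products":      (301, "https://example.com/products"),
    "https://example.com/products":     (301, "https://www.example.com/products"),
    "https://www.example.com/products": (301, "https://www.example.com/store"),
    "https://www.example.com/store":    (302, "https://www.example.com/en/store"),
}

def walk(start):
    """Return (hops, finding): each hop is (status, from_url, to_url)."""
    hops, seen, url = [], set(), start
    while url in REDIRECTS:
        if url in seen:
            return hops, "REDIRECT_LOOP"
        seen.add(url)
        status, nxt = REDIRECTS[url]
        hops.append((status, url, nxt))
        url = nxt
    finding = "REDIRECT_CHAIN" if len(hops) > 2 else None
    return hops, finding

hops, finding = walk("http://example.com/products")
print(len(hops), finding)  # 4 REDIRECT_CHAIN
```

Loop detection falls out of the visited set: revisiting any URL terminates the walk with REDIRECT_LOOP instead of spinning forever.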
Findings your security team will actually read.
HTTPS enforcement, CSP directives, HSTS, cookie flags, mixed content — each flag carries the rule code AND a CWE reference so your engineers can triage without a translator.
- CWE refs: ✓ linked on every security rule
- Scope: per-URL, not just top-level headers
| Header | Rule code | Value |
| --- | --- | --- |
| Strict-Transport-Security | — | max-age=15552000; includeSubDomains |
| Content-Security-Policy | SEC_CSP_UNSAFE_EVAL | script-src 'self' 'unsafe-eval' *.vendor.com |
| X-Frame-Options | SEC_PERMISSIONS_POLICY_MISSING | missing |
| Set-Cookie: session | SEC_COOKIE_NO_HTTPONLY | HttpOnly ✗ · SameSite ✗ |
| Mixed content | SEC_MIXED_CONTENT_PASSIVE | 3 http:// resources on https page |
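The shape of these checks is a mapping from header values to rule codes. A minimal sketch, assuming a simple header dict; the two triggered codes are taken from the table above, while SEC_HSTS_MISSING is a hypothetical code invented for this example:

```python
# Hedged sketch of header auditing, not Scannica's rule engine.
def audit(headers):
    """Return the list of rule codes a response's headers trigger."""
    findings = []
    csp = headers.get("Content-Security-Policy", "")
    if "'unsafe-eval'" in csp:
        findings.append("SEC_CSP_UNSAFE_EVAL")        # code from the table
    cookie = headers.get("Set-Cookie", "")
    if cookie and "HttpOnly" not in cookie:
        findings.append("SEC_COOKIE_NO_HTTPONLY")     # code from the table
    if "Strict-Transport-Security" not in headers:
        findings.append("SEC_HSTS_MISSING")           # hypothetical code
    return findings

demo = {
    "Content-Security-Policy": "script-src 'self' 'unsafe-eval' *.vendor.com",
    "Set-Cookie": "session=abc123; Secure",
    "Strict-Transport-Security": "max-age=15552000; includeSubDomains",
}
print(audit(demo))  # ['SEC_CSP_UNSAFE_EVAL', 'SEC_COOKIE_NO_HTTPONLY']
```

Attaching a CWE reference is then a static lookup per rule code (e.g. cookie-flag findings map to the relevant CWE entry).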
Orphans, found.
Feed in one or many sitemap.xml files. Scannica diffs them against the live crawl: pages listed but unreachable, pages crawled but absent from the sitemap, stale entries pointing at 404s. Migration QA without the spreadsheet.
Sitemap:
- /
- /about
- /blog/launch
- /blog/roadmap
- /legal/old-tos
- + 122 more

Crawl:
- /
- /about
- /blog/launch
- /blog/v2-notes
- /internal/staging
- + 209 more
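Both directions of the orphan check are set differences between sitemap entries and crawled URLs. A minimal sketch using the stdlib XML parser on a made-up sitemap, with a hard-coded crawl set standing in for live results:

```python
# Sketch of the sitemap-vs-crawl diff: parse <loc> entries, then
# compute both set differences.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/legal/old-tos</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
listed = {loc.text for loc in ET.fromstring(SITEMAP).findall(".//sm:loc", ns)}
crawled = {"https://example.com/", "https://example.com/about",
           "https://example.com/internal/staging"}

orphaned = listed - crawled    # in the sitemap, never reached by the crawl
unlisted = crawled - listed    # reachable, but missing from the sitemap
print(sorted(orphaned))  # ['https://example.com/legal/old-tos']
print(sorted(unlisted))  # ['https://example.com/internal/staging']
```

Stale-entry detection (sitemap URLs that resolve to 404) would add one more pass over the orphaned set, checking each URL's crawl status.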
Short questions. Honest answers.
If yours isn't here, the team is a real email away — not a ticketing portal.
Does Scannica send my crawl data anywhere?
No. The crawl engine runs inside the desktop app. The only outbound network traffic is the HTTP requests your crawl makes to the site you're auditing. There is no Scannica analytics backend, no telemetry, no cloud sync: audit a staging host from a VPN-isolated laptop and nothing leaks.

Which platforms are supported?
macOS (Apple Silicon and Intel), Windows 10+, and Linux (.deb, .rpm, and AppImage). SHA-256 checksums are published with every release.

How big a site can it crawl?
The URL frontier uses xxh3 hashing for deduplication, and the session database runs SQLite in WAL mode. The practical ceiling is your disk, not the app; it is designed for multi-million-URL sites.

Can I pause a crawl and resume later?
Session state and the frontier persist to the project file; full pause-and-resume is on the near-term roadmap and lands in v1.1.

Does it render JavaScript?
Yes. Scannica can route each URL through a headless browser, collect Core Web Vitals, and diff the rendered DOM against the static response to surface content that only appears after hydration.

What file format are the projects in?
A .scannica bundle is a ZIP containing the session SQLite database and the crawl configuration. Portable, archivable, diff-able, and small enough to email.

Is it open source?
No. The crawl engine and rule set are proprietary. Full license terms ship with the installer.

Is there a CLI?
Not yet. A command-line runner is on the roadmap for headless CI use (e.g. run a crawl per deploy). For now, crawls kick off from the desktop app.