Scannica

Docs. Read the rules.

Every check Scannica runs, grouped by category and keyed by stable rule codes. Click a code to see what the rule detects, why it matters, and how to fix it.

01 — Metrics catalog

Every rule, every code.

74 checks across eight categories. Every finding carries the same code in the app, in exports, in tickets, and on this page. Click any code for detail.

Total
74
Errors
17
Warnings
35
Notices
22

SEO

14 rules
5 err 7 warn 2 notice

Crawl

7 rules
1 err 2 warn 4 notice

Content

7 rules
1 err 5 warn 1 notice

Performance

9 rules
2 err 6 warn 1 notice

Security

15 rules
7 err 6 warn 2 notice

GDPR

6 rules
1 err 4 warn 1 notice

GEO

9 rules
1 warn 8 notice

Best Practices

7 rules
4 warn 3 notice

06 — Deep dives

The checks we nerd out on.

A sampler of what actually sets Scannica apart once a crawl lands.

Rendered DOM · Core Web Vitals

Measure the page users actually see.

A headless browser collects real LCP, CLS, and INP per URL — then Scannica diffs the rendered DOM against the static HTML. Pages that only exist after hydration? Caught.

  • SPA and hydrated sites surface rendered-only links and content.
  • Render-blocking scripts flagged with byte counts, not vibes.
  • DOM size and depth checks catch builder-exported monsters.
/pricing · audited quality 74 / 100
LCP
1.8s
CLS
0.04
INP
312ms
TTFB
520ms
Rendered ↔ Static diff 18 nodes only in rendered
+ <article data-ssr="false">
+   <h1>Plans built for scale</h1>
+   <a href="/plans/enterprise">...</a>
  <section id="faq">
    <h2>Frequently asked</h2>
  </section>
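Conceptually, the rendered↔static comparison boils down to collecting identifying nodes from both documents and reporting the ones only the rendered side has. A minimal sketch using the stdlib `html.parser` (illustrative only — not Scannica's implementation, which works on the full browser DOM):

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Collect (tag, identifying-attrs) pairs from an HTML string."""
    def __init__(self):
        super().__init__()
        self.nodes = []
    def handle_starttag(self, tag, attrs):
        # Keep only attributes that identify a node across renders.
        ident = tuple(sorted((k, v) for k, v in attrs if k in ("id", "href", "data-ssr")))
        self.nodes.append((tag, ident))

def rendered_only(static_html: str, rendered_html: str):
    """Return nodes present in the rendered DOM but absent from the static HTML."""
    static, rendered = TagCollector(), TagCollector()
    static.feed(static_html)
    rendered.feed(rendered_html)
    seen = set(static.nodes)
    return [n for n in rendered.nodes if n not in seen]

static_html = '<section id="faq"><h2>Frequently asked</h2></section>'
rendered_html = ('<article data-ssr="false"><h1>Plans built for scale</h1>'
                 '<a href="/plans/enterprise">Enterprise</a></article>'
                 '<section id="faq"><h2>Frequently asked</h2></section>')

for node in rendered_only(static_html, rendered_html):
    print(node)  # the <article>, <h1>, and <a> exist only after hydration
```

The real check also has to account for attribute noise a hydration pass injects, which is why the sketch whitelists identifying attributes instead of comparing raw markup.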
Redirect topology

Every hop, named.

Scannica walks every redirect to its terminus and surfaces loops before they surface in Search Console. Chains longer than two, mixed 301/302, HTTP→HTTPS hand-offs — all flagged with the URL that started the mess.

REDIRECT_CHAIN REDIRECT_LOOP HTTP_TO_HTTPS_REDIRECT
  1. 301 http://example.com/products
    → https
  2. 301 https://example.com/products
    → www
  3. 301 https://www.example.com/products
    → /store
  4. 302 https://www.example.com/store
    → locale prefix
  5. 200 https://www.example.com/en/store
    final
4 hops · flagged as REDIRECT_CHAIN
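Walking a chain to its terminus is a bounded loop with cycle detection. A hypothetical sketch, with a dict standing in for live HTTP responses (the URLs mirror the chain above; the function and thresholds are illustrative, not Scannica's code):

```python
def walk_redirects(start_url, responses, max_hops=10):
    """Follow a redirect map to its terminus; flag chains and loops.

    `responses` maps URL -> (status, location-or-None), standing in for live HTTP.
    """
    hops, seen = [], set()
    url = start_url
    while True:
        if url in seen:
            return hops, "REDIRECT_LOOP"
        seen.add(url)
        status, location = responses.get(url, (200, None))
        hops.append((status, url))
        if status in (301, 302, 307, 308) and location and len(hops) <= max_hops:
            url = location
        else:
            # More than two redirects before the final response -> chain finding.
            return hops, "REDIRECT_CHAIN" if len(hops) > 3 else None

responses = {
    "http://example.com/products":      (301, "https://example.com/products"),
    "https://example.com/products":     (301, "https://www.example.com/products"),
    "https://www.example.com/products": (301, "https://www.example.com/store"),
    "https://www.example.com/store":    (302, "https://www.example.com/en/store"),
    "https://www.example.com/en/store": (200, None),
}
hops, finding = walk_redirects("http://example.com/products", responses)
print(f"{len(hops) - 1} hops · flagged as {finding}")
# → 4 hops · flagged as REDIRECT_CHAIN
```

The `seen` set is what catches loops before they ever hit Search Console: revisiting any URL terminates the walk immediately instead of burning the hop budget.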
Security · CWE-tagged

Findings your security team will actually read.

HTTPS enforcement, CSP directives, HSTS, cookie flags, mixed content — each flag carries the rule code AND a CWE reference so your engineers can triage without a translator.

CWE refs
linked on every security rule
Scope
per-URL
not just top-level headers
Response · https://example.com/checkout 4 issues
Strict-Transport-Security
max-age=15552000; includeSubDomains
Content-Security-Policy
SEC_CSP_UNSAFE_EVAL
script-src 'self' 'unsafe-eval' *.vendor.com
Permissions-Policy
SEC_PERMISSIONS_POLICY_MISSING
missing
Set-Cookie: session
SEC_COOKIE_NO_HTTPONLY
HttpOnly ✗ · SameSite ✗
Mixed content
SEC_MIXED_CONTENT_PASSIVE
3 http:// resources on https page
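The rule codes above come straight from the catalog; the matching logic below is an illustrative sketch of what such header checks might look like, not Scannica's actual rule set:

```python
def audit_headers(headers):
    """Map a response's headers to a subset of illustrative rule codes."""
    findings = []
    csp = headers.get("Content-Security-Policy", "")
    if "'unsafe-eval'" in csp:
        findings.append("SEC_CSP_UNSAFE_EVAL")     # CWE-95 territory
    if "Permissions-Policy" not in headers:
        findings.append("SEC_PERMISSIONS_POLICY_MISSING")
    cookie = headers.get("Set-Cookie", "")
    if cookie and "httponly" not in cookie.lower():
        findings.append("SEC_COOKIE_NO_HTTPONLY")  # cookie readable from JS
    return findings

headers = {
    "Strict-Transport-Security": "max-age=15552000; includeSubDomains",
    "Content-Security-Policy": "script-src 'self' 'unsafe-eval' *.vendor.com",
    "Set-Cookie": "session=abc123; Secure",
}
print(audit_headers(headers))
# → ['SEC_CSP_UNSAFE_EVAL', 'SEC_PERMISSIONS_POLICY_MISSING', 'SEC_COOKIE_NO_HTTPONLY']
```

Because the checks run per URL, a `/checkout` page can fail on cookie flags even when the homepage's headers look clean.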
Sitemap diff

Orphans, found.

Feed in one or many sitemap.xml files. Scannica diffs them against the live crawl: pages listed but unreachable, pages crawled but absent from the sitemap, stale entries pointing at 404s. Migration QA without the spreadsheet.

sitemap.xml (127)
  • /
  • /about
  • /blog/launch
  • /blog/roadmap
  • /legal/old-tos
  • + 122 more
crawl (214)
  • /
  • /about
  • /blog/launch
  • /blog/v2-notes
  • /internal/staging
  • + 209 more
4 missing
12 orphan
123 matched
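The diff itself is plain set arithmetic over the two URL inventories. A toy example using the handful of URLs shown above (a hypothetical five-URL slice, not the full 127/214):

```python
sitemap_urls = {"/", "/about", "/blog/launch", "/blog/roadmap", "/legal/old-tos"}
crawled_urls = {"/", "/about", "/blog/launch", "/blog/v2-notes", "/internal/staging"}

matched = sitemap_urls & crawled_urls  # in both
missing = sitemap_urls - crawled_urls  # listed but never reached by the crawl
orphans = crawled_urls - sitemap_urls  # crawled but absent from the sitemap

print(f"{len(missing)} missing · {len(orphans)} orphan · {len(matched)} matched")
# → 2 missing · 2 orphan · 3 matched
```

Stale entries (sitemap URLs that resolve to 404) fall out of the `missing` bucket once each is re-requested, which is the step that replaces the migration-QA spreadsheet.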
09 — FAQ

Short questions. Honest answers.

If yours isn't here, the team is a real email away — not a ticketing portal.

  • Does Scannica send my crawl data anywhere?

    No. The crawl engine runs inside the desktop app. The only outbound network traffic is the HTTP requests your crawl makes to the site you're auditing. There's no Scannica analytics backend, no telemetry, no cloud sync — audit a staging host from a VPN-isolated laptop and nothing leaks.
  • Which platforms are supported?

    macOS (Apple Silicon and Intel), Windows 10+, and Linux (.deb, .rpm, and AppImage). SHA-256 checksums are published with every release.
  • How big a site can it crawl?

    The URL frontier uses xxh3 hashing for deduplication, and the session database runs SQLite in WAL mode — the practical ceiling is your disk, not the app. Designed for multi-million-URL sites.
  • Can I pause a crawl and resume later?

    Not yet. Session state and the URL frontier already persist to the project file; full pause-and-resume lands in v1.1.
  • Does it render JavaScript?

    Yes — Scannica can route each URL through a headless browser, collect Core Web Vitals, and diff the rendered DOM against the static response to surface content that only appears after hydration.
  • What file format are the projects in?

    A .scannica bundle is a ZIP containing the session SQLite database and the crawl configuration. Portable, archivable, diff-able, and small enough to email.
  • Is it open source?

    No. The crawl engine and rule set are proprietary. Full license terms ship with the installer.
  • Is there a CLI?

    A command-line runner is on the roadmap for headless CI use (e.g. run a crawl per deploy). For now, crawls kick off from the desktop app.
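The frontier design the FAQ describes — hash-based dedup over a SQLite session store — can be sketched as follows. `hashlib.blake2b` stands in for xxh3 (which is not in the Python stdlib), and an in-memory database stands in for the on-disk session file; none of this is Scannica's actual schema:

```python
import hashlib
import sqlite3

# In-memory stand-in for the session database; WAL applies to on-disk files.
db = sqlite3.connect(":memory:")
db.execute("PRAGMA journal_mode=WAL")  # effective for file-backed DBs
db.execute("CREATE TABLE frontier (url_hash BLOB PRIMARY KEY, url TEXT)")

def enqueue(url: str) -> bool:
    """Add a URL to the frontier; return False if its hash was already seen."""
    # blake2b stands in for xxh3; an 8-byte digest keeps the index compact.
    h = hashlib.blake2b(url.encode(), digest_size=8).digest()
    try:
        db.execute("INSERT INTO frontier VALUES (?, ?)", (h, url))
        return True
    except sqlite3.IntegrityError:
        return False

print(enqueue("https://example.com/"))       # True  — new URL
print(enqueue("https://example.com/about"))  # True  — new URL
print(enqueue("https://example.com/"))       # False — duplicate, dropped
```

Keying the table on the fixed-width hash rather than the URL string is what keeps the index small at multi-million-URL scale, while WAL mode lets the crawler write new frontier entries without blocking reads.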
Still exploring

Ready to run your first crawl?