About the SEO Analyzer
Overview
The SEO Analyzer runs on-page checks and, optionally, a limited crawl. It evaluates a single URL or a small set of same-host pages for title, meta description, headings, images, links, canonical URL, robots directives, and more. The report includes a score, checklist items with a status (good, recommended, or critical), and an optional mobile snapshot. Use it to spot technical and content issues before search engines crawl your site.
Results cover basic SEO (title length, meta description, H1, image alt text), advanced signals (canonical URL, noindex, Open Graph, Schema.org), performance hints (caching, minification), security (HTTPS, HSTS), and mobile readiness (viewport). Combine with the Sitemap Generator and Robots.txt tool for a complete technical setup.
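The basic on-page checks above can be sketched with the Python standard library alone. This is an illustration of the kind of checks the analyzer performs, not its actual implementation; the length thresholds are common guidelines, not the tool's exact scoring rules.

```python
from html.parser import HTMLParser

class BasicSEOParser(HTMLParser):
    """Collects title text, meta description, h1 count, and images without alt."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check(html):
    """Return a list of basic SEO issues found in the given HTML string."""
    p = BasicSEOParser()
    p.feed(html)
    issues = []
    if not (10 <= len(p.title) <= 60):  # illustrative bounds, not the tool's
        issues.append("title length outside 10-60 chars")
    if p.meta_description is None:
        issues.append("missing meta description")
    if p.h1_count != 1:
        issues.append(f"expected exactly one h1, found {p.h1_count}")
    if p.images_missing_alt:
        issues.append(f"{p.images_missing_alt} image(s) missing alt text")
    return issues
```

A page with a too-short title, no meta description, no h1, and an image without alt text would surface all four issues.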
When to use it
Run the analyzer when launching a new page or site, after major design or URL changes, or periodically to catch regressions. Use single-URL mode for landing pages, key product pages, or blog posts. Use limited crawl mode when you want to see aggregate issues across several same-host pages (e.g. duplicate titles, missing descriptions). The Schema.org generator helps you add or fix structured data that the analyzer can then detect.
How to use it
1. Enter the full URL (including https://) in the input field and click Analyze.
2. Wait for the report to load; the tool fetches the page and runs its checks.
3. Review the overall score and the checklist: expand items to read "What it means" and "How to fix."
4. Use the sidebar to jump between sections (Basic SEO, Advanced SEO, Performance, Security, Mobile).
5. For crawl mode, enter a start URL and an optional page limit; the tool follows same-host links and reports issues per page and in aggregate.
6. Export the report as JSON if you need to share or archive it.
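Crawl mode's described behavior, following same-host links up to a page limit, can be sketched as a breadth-first walk. The `fetch` callable is injected here so the sketch stays self-contained; a real crawler would wrap urllib.request and consult robots.txt before each fetch. This is an illustration under those assumptions, not the tool's implementation.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, fetch, limit=10):
    """Breadth-first crawl of same-host pages; returns {url: html}."""
    host = urlparse(start_url).netloc
    seen, queue, pages = set(), deque([start_url]), {}
    while queue and len(pages) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:
            continue
        pages[url] = html
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == host:  # stay on the same host
                queue.append(absolute.split("#")[0])  # drop fragments
    return pages
```

With a fake `fetch` backed by a dict, off-host links are skipped and each same-host page is fetched once.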
Best practices
Fix critical issues first: missing or duplicate titles, missing meta descriptions, a broken canonical URL, or a noindex directive on pages you want indexed. Ensure each page has a unique, descriptive title and meta description (with lengths within common guidelines). Use structured data (e.g. via the Schema.org tool) where it fits your content. Check both desktop and mobile views; some sites block the mobile snapshot with X-Frame-Options, so use "Open in new tab" to inspect those. Submit your sitemap in Search Console and make sure robots.txt allows crawling of important paths.
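You can verify that a robots.txt file allows your important paths with the Python standard library before relying on any tool. The rules below are an illustrative example, not a recommendation for your site.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; substitute your site's actual file.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A public product page is crawlable; an admin page is not.
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True
print(rp.can_fetch("*", "https://example.com/admin/login"))      # False
```

In a real check you would call `rp.set_url(...)` with your live robots.txt URL and `rp.read()` instead of parsing a string.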
Common issues
Duplicate titles or meta descriptions across pages hurt SEO; make each page's unique. Blocking resources or entire pages via robots.txt or noindex when you want them indexed is a frequent mistake. Very long or very short titles and descriptions can trigger warnings. Missing or empty image alt text hurts accessibility and image search. If the mobile snapshot does not load, the target site likely sends an X-Frame-Options or Content-Security-Policy header that prevents embedding; the analyzer still runs all other checks. Crawl mode is limited to same-host URLs and respects robots.txt for safety.
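The duplicate-title check that crawl mode reports in aggregate amounts to grouping pages by title. A minimal sketch, with illustrative data rather than real analyzer output:

```python
from collections import Counter

def duplicate_titles(titles_by_url):
    """Return the subset of pages whose title appears on more than one URL."""
    counts = Counter(titles_by_url.values())
    return {url: t for url, t in titles_by_url.items() if counts[t] > 1}

# Hypothetical crawl result: two pages share a title.
pages = {
    "/": "Acme Widgets",
    "/blue": "Acme Widgets",  # duplicate: should be unique per page
    "/about": "About Acme",
}
# duplicate_titles(pages) flags "/" and "/blue"
```

The same grouping works for meta descriptions; any group larger than one is a page that needs a unique value.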