If your pages are not indexed, traffic will not follow.
This often comes down to basics: a missing sitemap.xml, a robots.txt that blocks discovery, no HTTPS, or misconfigured canonical tags. Google does not index what it cannot find or trust.
Baseline checks public indexing signals that commonly explain why sites get ignored or only partially indexed. This is not a deep crawl. It is a surface sanity check that catches obvious blockers.
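As a rough illustration of what such a surface check involves, here is a minimal sketch using only the Python standard library. It assumes the site exposes robots.txt and sitemap.xml at the conventional root paths; the function and field names (`check_site`, `https_reachable`, and so on) are illustrative, not part of any tool's API.

```python
# Surface-level indexing sanity check: HTTPS reachability, robots.txt rules,
# sitemap.xml presence, and the homepage's declared canonical URL.
# Standard library only; all names here are illustrative.
import re
import urllib.request
import urllib.robotparser

def fetch(url: str, timeout: int = 10):
    """Return (status_code, body) or (None, '') if the request fails."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, resp.read().decode("utf-8", errors="replace")
    except Exception:
        return None, ""

def check_site(domain: str) -> dict:
    https_root = f"https://{domain}/"
    results = {}

    # HTTPS: the root page should be reachable over TLS.
    status, html = fetch(https_root)
    results["https_reachable"] = status == 200

    # robots.txt: present and not blocking the homepage for Googlebot.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"https://{domain}/robots.txt")
    try:
        rp.read()
        results["robots_allows_homepage"] = rp.can_fetch("Googlebot", https_root)
    except Exception:
        results["robots_allows_homepage"] = None  # robots.txt unreadable

    # sitemap.xml: the conventional location returns something.
    status, _ = fetch(f"https://{domain}/sitemap.xml")
    results["sitemap_found"] = status == 200

    # canonical: extract the homepage's declared canonical URL, if any.
    # Rough regex check; assumes rel appears before href in the link tag.
    m = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I
    )
    results["canonical"] = m.group(1) if m else None

    return results

if __name__ == "__main__":
    print(check_site("example.com"))
```

Running it against a domain prints a small dictionary of pass/fail signals, which is the spirit of the check: not a full crawl, just enough to surface the obvious blockers listed above.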