Verifying three custom domains in Google Search Console with Cloudflare DNS

Before Google Search Console is useful for anything (indexing requests, coverage errors, query data), you have to prove you own the domain. For three new custom domains on Astro sites, I did this in one afternoon. The fastest path was DNS TXT verification through Cloudflare, and there were a few non-obvious details worth writing down. This is a short notes post, not a full tutorial; Google and Cloudflare both have step-by-step documentation. What follows are the things that tripped me up and what the console actually shows on day one.

DNS TXT vs HTML file for Astro SSG

Search Console offers four verification methods. For Astro sites deployed to Cloudflare Pages:

  • HTML file: Download a file, add it to public/, commit, push, wait for the deploy.
  • HTML meta tag: Modify the <head> layout component, commit, push, wait for the deploy.
  • DNS TXT record: Add a record in Cloudflare DNS. No deploy needed.
  • Google Analytics or Search Console tag: Requires GA4 already wired up.

The DNS TXT method is clearly faster for static sites. You don’t trigger a build, you don’t wait for Cloudflare Pages CI, and you don’t have to modify any source file. The TXT record propagates in minutes, and Search Console confirms it within a couple of minutes of Cloudflare showing the record as “Active.”

One thing that confused me initially: DNS TXT verification proves ownership of the domain root, not of a specific path. A root domain verification covers all subdomains automatically, including www. That’s broader coverage than the HTML file approach, which only verifies the URL prefix where the file was placed.

Adding the record in Cloudflare

In Cloudflare’s DNS dashboard: dash.cloudflare.com → [your domain] → DNS → Records → Add record. Record type is TXT, name is @ (the root), content is the string Search Console provides — something like google-site-verification=<long-string>. Cloudflare defaults to TTL “Auto,” which is 300 seconds. The record shows as “Active” in the Cloudflare dashboard almost immediately after saving.
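If you are doing this for several zones, the same record can be created through the Cloudflare API instead of the dashboard. A minimal Python sketch, assuming placeholder zone ID, API token, and verification string; in the Cloudflare API, ttl=1 means “Auto”:

```python
# Sketch: create the Search Console TXT record via the Cloudflare API.
# ZONE_ID, API token, and the verification string are placeholders.
import json
import urllib.request

CF_API = "https://api.cloudflare.com/client/v4"

def txt_record_body(verification_token: str) -> dict:
    """Build the DNS record payload Cloudflare expects.

    name "@" means the zone apex; ttl 1 is the API encoding of "Auto".
    """
    return {
        "type": "TXT",
        "name": "@",
        "content": verification_token,
        "ttl": 1,  # 1 = "Auto" in the Cloudflare API
    }

def create_record_request(zone_id: str, api_token: str, body: dict) -> urllib.request.Request:
    """Prepare the authenticated POST; the caller runs urlopen() on it."""
    return urllib.request.Request(
        f"{CF_API}/zones/{zone_id}/dns_records",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

body = txt_record_body("google-site-verification=abc123")
req = create_record_request("zone123", "cf-token-placeholder", body)
```

Nothing here is Astro-specific; it is just the standard Cloudflare DNS records endpoint with a bearer token.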

I verified each record was live with dig TXT yourdomain.com before clicking Verify in Search Console. Not strictly required, but it prevents wasting a verification attempt on propagation lag. This is a per-domain operation. Each of the three sites needed its own Search Console property and its own TXT record. There’s no bulk flow; you do it once per domain and move on.
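That dig pre-check can be scripted when you are doing several domains in a row. A small sketch that parses `dig +short TXT` output for the Search Console token; the domain and token below are placeholders:

```python
# Sketch: pre-flight check that the TXT record resolves before spending
# a verification attempt. The sample output and token are placeholders.
import subprocess

def txt_values(dig_short_output: str) -> list:
    """`dig +short TXT` prints one quoted string per TXT record."""
    return [line.strip().strip('"')
            for line in dig_short_output.splitlines()
            if line.strip()]

def has_verification_token(dig_short_output: str, token: str) -> bool:
    return token in txt_values(dig_short_output)

def check_live(domain: str, token: str) -> bool:
    """Run dig and look for the token (needs dig and network access)."""
    out = subprocess.run(
        ["dig", "+short", "TXT", domain],
        capture_output=True, text=True, check=True,
    ).stdout
    return has_verification_token(out, token)

# Offline example against canned dig output:
sample = '"google-site-verification=abc123"\n"v=spf1 -all"\n'
```

Usage would be `check_live("yourdomain.com", "google-site-verification=abc123")` once per domain.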

Submitting sitemaps for @astrojs/sitemap output

After verification, submit the sitemap. For Astro sites using @astrojs/sitemap, the output at small site sizes is a /sitemap-0.xml file plus a /sitemap-index.xml that references it. Submit the index file, not the shard: https://yourdomain.com/sitemap-index.xml. Search Console follows the index to discover the shards. Submitting a shard URL directly works, but you’d need to manually add each new shard if the site grows past the single-file limit (50,000 URLs). Submitting the index handles that automatically.
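To sanity-check what Search Console will discover behind the index, you can enumerate the shard URLs locally. A standard-library sketch; the sample XML mirrors @astrojs/sitemap’s small-site output:

```python
# Sketch: list the shard URLs a sitemap index references, using only
# the standard library. The sample index mimics @astrojs/sitemap output.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def shard_urls(index_xml: str) -> list:
    """Return every <loc> value in a <sitemapindex> document."""
    root = ET.fromstring(index_xml)
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

index = """<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://yourdomain.com/sitemap-0.xml</loc></sitemap>
</sitemapindex>"""
```

Fetching the live index URL and feeding it through `shard_urls` tells you exactly which files Googlebot will be pointed at.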

If Search Console shows “Couldn’t fetch” on the sitemap, check two things: that your custom domain deployment is actually live (not still serving the *.pages.dev URL), and that robots.txt isn’t accidentally blocking the /sitemap* path.
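The robots.txt half of that check can be done offline with Python’s built-in parser, before Googlebot ever fetches it. The rule strings below are stand-ins for a deployed robots.txt:

```python
# Sketch: verify locally that robots.txt rules don't block the sitemap
# path. The rule strings are placeholders for a deployed robots.txt.
from urllib.robotparser import RobotFileParser

def sitemap_fetchable(robots_txt: str, sitemap_url: str) -> bool:
    """Return True if the given rules allow Googlebot to fetch the URL."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", sitemap_url)

ok_rules = "User-agent: *\nAllow: /\n"
# A Disallow prefix of /sitemap also blocks /sitemap-index.xml:
bad_rules = "User-agent: *\nDisallow: /sitemap\n"
```

Running this against the file in `public/` catches the misconfiguration before a deploy, not after a failed fetch.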

What the coverage report shows on day one

Almost nothing, which is expected. Zero indexed pages is normal for a brand-new domain. Googlebot hasn’t had time to crawl anything meaningfully: the custom domain records are fresh, and the sitemap was just submitted. All URLs will show as “Discovered – currently not indexed,” which means Google knows the pages exist from the sitemap but hasn’t crawled them yet.

The URL Inspection tool works immediately. You can paste any URL, click “Request Indexing,” and Google queues a crawl for that specific page at elevated priority. I did this for the home page and the top two or three category pages on each site. It’s not the same as indexing; it just moves those specific URLs to the front of the crawl queue. Actual indexed status takes days to weeks.

Worth noting: IndexNow sends signals to Bing’s index, not Google’s. For Google, the URL Inspection request-indexing button is the only manual acceleration available. Running both is not redundant; they’re independent crawlers. The coverage report becomes genuinely useful after roughly a week, when errors start accumulating. The errors section is where you find 404s, redirect chain issues, canonical mismatches, and noindex tags that got applied by mistake. Those are all invisible until Googlebot actually visits.
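For the Bing side, an IndexNow ping is a single JSON POST. A sketch of the request construction with placeholder host, key, and URLs; IndexNow also expects the key to be served as a text file on the domain:

```python
# Sketch: an IndexNow submission (reaches Bing and other IndexNow
# engines, not Google). Host, key, and URLs are placeholders; the key
# must also be hosted at https://<host>/<key>.txt for verification.
import json
import urllib.request

def indexnow_request(host: str, key: str, urls: list) -> urllib.request.Request:
    """Build the POST for api.indexnow.org; the caller runs urlopen() on it."""
    payload = {"host": host, "key": key, "urlList": urls}
    return urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

req = indexnow_request(
    "yourdomain.com",
    "0123456789abcdef",  # placeholder key
    ["https://yourdomain.com/", "https://yourdomain.com/tools/"],
)
```

One ping per domain after each deploy is enough; resubmitting unchanged URLs adds nothing.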

I’m planning to check each property weekly for the first month. If you’re running multiple sites, keep the Search Console properties separate rather than trying to look across them in one view. Error patterns for an AI tools directory and an indie games directory look completely different and would obscure each other if combined.

Part of an ongoing 6-month experiment running three AI-curated directory sites. The technical claims here are real; this article was AI-assisted.