Cheatcode: Autopinging the Google Indexing API with NextJS and GraphCMS

Right. Straight into this one because it's a beast. If you're relying on your sitemap to get pages found, and your internal linking structure isn't A* and then some, chances are Google doesn't give your site enough attention.

Programmatic web apps with hundreds or thousands of regularly updated pages don't play well with sitemaps. Well, not as well as other options, like automatically pinging the Google Indexing API every time a page is updated. Relying on a sitemap crawl feels kinda dated now, so here's how to tell Google every single time a page has been updated, with Next.js and the headless CMS of your choosing.


  • yarn add googleapis
  • Create new Google Cloud Platform Project
  • Add Google Indexing API
  • Create Service Account
  • Generate new key
  • Add Service Account to Google Search Console
  • Create Next.js API endpoint
  • Point CMS webhook at that endpoint
  • Instant indexing power

Ready? Let's begin.

Prereqs & setting up dependencies

There's not a lot here prerequisites-wise, but make sure you can tick off the following to avoid disappointment.

  • A version of Next.js that can do the whole API route handler malarkey (9+)
  • A CMS that can fire webhooks on content change; I'm using GraphCMS to handle around 3000 brand pages for vouchernaut, but you can also look into DatoCMS or any other headless CMS worth the time of day.
  • A Google Cloud Platform project
  • yarn add googleapis, an easy-to-use wrapper around all of the Google APIs.

Setting up Google Cloud Platform (GCP) for the Google Indexing API

Creating a new GCP project

  • Head over to the Google Cloud Platform console
  • Create a new project (if you don't already have one)

Enabling the Indexing API

  • From the Dashboard, hit APIs and Services
  • ...then, at the top, Enable APIs and Services
  • Search for Indexing API
  • ...then enable it.

Next up, creating a service account

  • On the APIs and Services dashboard, hit Credentials, then in the Service Accounts area, Manage service accounts
  • Fill in the account name, account ID, and a description (optional)
  • Skip steps 2 and 3 by hitting the DONE button below

Create your JSON key

  • Head into the account, then the Keys tab, then Add Key > Create new key with Key Type JSON. This is important because we'll be importing it for use with googleapis.
  • Did a JSON file just download? Good! If not, you'll need to make a new key - GCP is secure like that.
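For reference, the downloaded key file follows Google's standard service-account key format and looks roughly like this (values truncated, and the project and account names here are placeholders):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "…"
}
```

The two fields we actually care about later are private_key and client_email.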

Adding Service Account to Google Search Console

Your Service Account is going to need access to Google Search Console if you're to start pinging the Indexing service with any hope. Here's how to set that up.

  • Verify Google Search Console with your site if you haven't already. It's super easy and here's the ever helpful Google Support link to do so. How to verify ownership of your Google Search Console property
  • Google Search Console > Settings > Add User > Paste in your Service Account email address. You can grab it from Google Cloud Platform, or the JSON file you downloaded previously under client_email.

After all this admin is out the way, we're finally able to dive into Next.js and make some indexing magic happen.

Create the indexing endpoint in Next.js

  • Create a new file along the lines of src/pages/api/webhooks/indexer.js (Next.js API routes need to live under pages/api)
  • Add the JSON service key you downloaded earlier to your project (or pull out the private_key and client_email and add them as env variables).
  • Add the following code, and be sure to change the URL to your own domain. You'll need to include a url in the request body, so make sure this makes sense based on the payload your CMS webhook sends you.

I'm using GraphCMS in this example, where each page being updated and created has a slug parameter that I can append to my domain to push a URL.
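That slug-to-URL step boils down to something like this sketch — the domain and payload shape here are placeholders, so swap in your own site and whatever your CMS actually sends:

```javascript
// Hypothetical example: building the full URL to notify, from a webhook payload.
const SITE_URL = 'https://www.example.com' // assumption: replace with your domain

function buildIndexingUrl(payload) {
  // GraphCMS-style webhooks nest the changed entry under `data`
  const { slug } = payload.data
  return `${SITE_URL}/${slug}`
}

console.log(buildIndexingUrl({ data: { slug: 'some-brand' } }))
// → https://www.example.com/some-brand
```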

import { google } from 'googleapis'
import services from './services.json' // This is the Service Account JSON you downloaded

const handler = async (req, res) => {
  if (req.method !== 'POST') return res.status(405).end()
  const { data } = req.body

  try {
    // Create a new auth object, pass it the client email and private key,
    // and ask for permission to use the indexing service.
    const auth = new google.auth.JWT(
      services.client_email,
      null,
      services.private_key,
      ['https://www.googleapis.com/auth/indexing']
    )

    // Create an Indexing API client authenticated with the JWT above.
    const indexer = google.indexing({
      version: 'v3',
      auth: auth,
    })

    // Tell Google the page at this URL has been updated.
    // Swap the domain for your own; `data.slug` comes from the CMS webhook payload.
    const indexRequest = await indexer.urlNotifications.publish({
      requestBody: {
        type: 'URL_UPDATED',
        url: `https://www.yourdomain.com/${data.slug}`,
      },
    })

    return res.status(200).json(indexRequest.data)
  } catch (error) {
    console.log('error :>> ', error)
    return res.status(500).json({ error: error.message })
  }
}

export default handler

There's only one step left after this, and that's to set up your CMS webhook so that it pings this endpoint. What you do next is really down to how your CMS is structured, but I'm sure you'll be able to work it out from here.
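As a rough guide, a GraphCMS content webhook sends a payload along these lines — the exact shape varies by model and CMS, so check your own webhook logs rather than taking this verbatim:

```json
{
  "operation": "publish",
  "data": {
    "__typename": "Page",
    "id": "…",
    "slug": "some-brand",
    "stage": "PUBLISHED"
  }
}
```

While developing, you can simulate the webhook locally with something like `curl -X POST http://localhost:3000/api/webhooks/indexer -H "Content-Type: application/json" -d '{"data":{"slug":"some-brand"}}'` and watch the endpoint's logs.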

Apologies for the abrupt ending but you've got this. Teaching fishing and all that.


So there you have it. Now you can stop relying solely on sitemaps and start pinging the Google Indexing API like a pro in, what, half an hour of clicking about. By actively telling Google when your site content has updated, you'll focus your site's crawl budget on the pages that matter and speed up the rate at which your content makes it into the SERPs (allegedly).

It can be a little tricky to navigate your way around GCP, Search Console, etc., so I hope I've covered everything you'll need to get going. It's been a pleasure. Let me know how you get on!
