Technical SEO Tips for The Events Calendar WordPress Plugin

July 8, 2025
by Bart Platteeuw
in SEO

I have been using The Events Calendar Pro (a WordPress plugin) for just over a year.

While it’s a great plugin – and I continue to use it – it does come with a few technical SEO challenges.

In this post, I’ll share the most important technical SEO tips, tricks, and configurations I’ve discovered along the way. They’ll help you avoid the common pitfalls I ran into while optimizing an event website for search engines.

Most importantly, I’ll show you how to prevent thousands of unnecessary URLs from being generated and indexed by Google – a common issue with The Events Calendar that can seriously bloat your website.

✅ Key Takeaways

  • The Events Calendar can create thousands of unnecessary URLs.
  • These cause crawl issues and index bloat.
  • The culprits are a combination of parameterized URLs and URL subfolders.
  • Fix: block in robots.txt and add noindex tags.
  • Improve SEO by keeping your index clean.

My WordPress Setup Using The Events Calendar

Let’s start with some context around my setup.

The site I’m working with currently lists around 200 events – a number that’s expected to grow steadily over the coming years.

Importantly, the issues I’m about to describe can occur even if you’re listing just one event. But as the number of events grows, the problems compound – making it critical to address them early.

I’m using The Events Calendar Pro alongside the Divi Builder, the Divi Events Calendar extension, and a custom Divi Child Theme.

Beyond that, I keep the WordPress environment lean – minimizing plugins to avoid conflicts and reduce bloat.

The site also runs Rank Math SEO and a few other plugins, but I’ve ruled them out as contributing factors to the issues discussed here.

While the problems I’ll cover likely apply to the free version of The Events Calendar, some may be specific to the Pro version and/or the Divi integration I’m using.

URLs Generated by The Events Calendar

Let’s start with the standard URLs generated by The Events Calendar. These are clean, logical, and SEO-friendly:

Page Type                 URL Slug
Event Page                /event/event-name/
Event Organizer Page      /organizer/organizer-name/
Event Venue Page          /venue/venue-name/

So far, so good – this is the kind of clean URL structure any SEO would appreciate.

However, the plugin also generates a large number of additional URLs, often with URL parameters attached. These can easily balloon into thousands of low-value pages, leading to a range of SEO and performance problems.

Here are some examples of how they look:

[Screenshot: parameterized URLs reported in Google Search Console]
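To give a sense of their shape, here are a few hypothetical examples built from the parameters The Events Calendar appends (your domain, paths, and dates will differ):

https://example.com/events/?post_type=tribe_events&eventDisplay=list
https://example.com/events/?eventDate=2025-07-08&paged=2
https://example.com/event/sample-event/?ical=1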

These parameterized URLs create:

  • Crawl inefficiencies
  • Thin or duplicate content issues
  • Index bloat

Worse still, if you don’t take any action (as I initially didn’t), Google will happily crawl them all – and index a growing number of them.

For example, my site – with about 200 events – should realistically have 300 to 400 indexable pages (including events, static pages, blog posts, and a few organizer and venue pages).

But as you can see in the screenshot below, over 1,400 pages are currently indexed – and that number keeps climbing.

You’ll also see 6,570 pages not indexed.

[Screenshot: Google Search Console indexing report showing over 1,400 indexed pages and 6,570 pages not indexed]

All of these are parameterized URLs generated by the plugin.

While Google initially chooses not to index most of them (and it shouldn’t), more and more will slip through the cracks if you don’t take action – until you end up with serious index bloat.

As an SEO, this kind of index bloat is something I want to avoid at all costs. My goal is always a clean, tightly controlled index – with zero unnecessary URLs if I can help it.

To their credit, The Events Calendar team does acknowledge this issue in their own knowledge base. They even offer part of the solution, which I’ll share below – along with a few additional steps I recommend.

Disallow URL Parameters in Robots.txt

I don’t have an issue with these parameterized URLs existing – they can serve internal purposes – but there’s rarely a good reason for them to appear in Google’s index by default.

That said, some sites might have legitimate use cases for specific parameters (e.g., event filters or calendars used for logged-in users). So before you make any changes, evaluate your own setup and make an informed decision.

If you’re confident these URLs shouldn’t be indexed, read on.

The first step is to block unnecessary parameterized URLs from being crawled by search engines. The Events Calendar team provides a recommended list of URL parameters to disallow in your robots.txt file:

Disallow: *post_type=tribe_events*
Disallow: *hide_subsequent_recurrences=*
Disallow: *tribe-bar-date=*
Disallow: *tribe-venue=*
Disallow: *eventDisplay=*
Disallow: *eventDate=*
Disallow: *paged=*
Disallow: *pagename=*
Disallow: *shortcode=*
Disallow: *ical=*
Disallow: *outlook-ical=*
Disallow: *related_series=*
Disallow: *tribe_geofence=*
Disallow: *tribe_organizer=*

You can copy and paste this list to easily add it to your robots.txt file.

In my experience, this list is comprehensive – I haven’t found any additional parameters that need blocking beyond what’s listed here.

Here’s what a sample robots.txt file might look like with these rules in place:

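This is a minimal sketch assuming a standard WordPress setup – the default wp-admin rules plus the disallow list above. Swap example.com and the sitemap path for your own:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: *post_type=tribe_events*
Disallow: *hide_subsequent_recurrences=*
Disallow: *tribe-bar-date=*
Disallow: *tribe-venue=*
Disallow: *eventDisplay=*
Disallow: *eventDate=*
Disallow: *paged=*
Disallow: *pagename=*
Disallow: *shortcode=*
Disallow: *ical=*
Disallow: *outlook-ical=*
Disallow: *related_series=*
Disallow: *tribe_geofence=*
Disallow: *tribe_organizer=*

Sitemap: https://example.com/sitemap_index.xml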

How to Edit robots.txt in Rank Math or Yoast

Both Rank Math and Yoast SEO allow you to edit your robots.txt file directly from the WordPress dashboard:

  • In Rank Math: Go to Rank Math > General Settings > Edit robots.txt.
  • In Yoast SEO: Go to Yoast > Tools > File Editor, then scroll to the robots.txt section.

A Quick Reminder About robots.txt

Blocking URLs in robots.txt prevents crawling, not indexing. If Google already knows about the URLs (e.g. from past crawls, backlinks, or sitemaps), they may still appear in search results with a note in Google Search Console like “indexed, though blocked by robots.txt.”

So if you’re applying this fix to an existing website, you may need to follow up with additional actions like using the URL Removal Tool in Google Search Console or adding noindex directives where possible (more on those steps below).

But if you’re launching a new site using The Events Calendar plugin, this is a quick win:

Add these disallow rules before publishing any pages, and you can avoid the entire issue from the start.

This is a great start toward fixing these issues, but there are a few additional considerations and steps I recommend.

The Events Calendar Also Generates URL Subfolders

In addition to the parameterized URLs, The Events Calendar also generates a set of URL subfolders – at least in my case.

It’s not entirely clear whether these come from The Events Calendar itself or are a result of using the Divi Events Calendar extension, but the impact is the same – and worth addressing.

Here are some of the most common The Events Calendar subfolders I’ve seen:

  • /list/
  • /summary/
  • /week/
  • /month/
  • /photo/

Each of these corresponds to a different calendar view – like a list of upcoming events, a week-at-a-glance, or a photo-style grid. They’re automatically generated by the plugin to display various filtered layouts of your events.

On top of that, pagination is added to each view, multiplying the number of URLs even further.
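As a hypothetical illustration (the exact URLs depend on your permalink and events-archive settings), a single view can fan out like this:

https://example.com/events/list/
https://example.com/events/list/page/2/
https://example.com/events/photo/
https://example.com/events/photo/page/2/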

Here’s how a random weekly view looks on my site:

[Screenshot: a weekly calendar view generated by The Events Calendar]

Why I Don’t Want These Pages Indexed

In my specific setup, these pages are not helpful for users or search engines. Here’s why:

  1. They often show empty schedules – especially if no events fall within that view.
  2. They’re thin content – low-value pages with little or no unique information.
  3. They duplicate the core content – users can already access a full schedule via the homepage or primary event pages.

If you’ve decided you want to get rid of them, your approach should be twofold:

  • Block them from being crawled using robots.txt (covered earlier).
  • Prevent them from being indexed, even if Google discovers them.

Read on to learn how to fully prevent these pages from being indexed by Google.

How to Noindex Pages with Unnecessary URL Paths & Parameters

Blocking pages via robots.txt prevents future crawling, but it doesn’t remove URLs that are already indexed.

To clean up your index, you’ll want to add a noindex meta robots tag to the URLs you don’t want appearing in search results.
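Concretely, that means getting this tag into the <head> of every page you want dropped:

<meta name="robots" content="noindex">

(The code example further down uses "noindex, nofollow"; plain noindex is the minimum needed to keep a page out of the index.)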

Noindexing Using Plugins Like Rank Math or Yoast

SEO plugins like Rank Math and Yoast make it easy to noindex pages – but only when those pages are tied to post types, taxonomies, or archive templates that WordPress recognizes.

Unfortunately, the problem URLs generated by The Events Calendar (like /week/, /photo/, and URLs with parameters like eventDate=) don’t fall into these categories.

As a result, they won’t appear in your SEO plugin’s settings, and you can’t easily noindex them through the UI.

The Solution: Add a noindex tag via functions.php

The best workaround I’ve found is adding custom logic to your theme’s functions.php file.

This lets you dynamically insert a noindex tag based on the presence of specific subfolders or URL parameters.

While it’s possible to do this using Rank Math or Yoast, I opted to edit my functions.php file directly (I’ll sketch a Rank Math-based alternative a bit further down).

Before you continue: important precautions

  1. Use a Child Theme – Always add custom code to a child theme, not your main theme. This protects your changes from being overwritten during theme updates.
  2. Back Up Your functions.php File – If you make a mistake, a backup gives you a quick way to recover.

Here’s an example of code you can add to your functions.php:

function custom_noindex_multiple_paths() {
    // Full request path plus query string, e.g. /events/week/?eventDate=2025-07-08.
    $request_uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';

    // Match any of the unwanted calendar views or URL parameters.
    if (
        strpos($request_uri, '/week/') !== false ||
        strpos($request_uri, '/month/') !== false ||
        strpos($request_uri, '/list/') !== false ||
        strpos($request_uri, '/photo/') !== false ||
        strpos($request_uri, '/summary/') !== false ||
        strpos($request_uri, 'eventDate=') !== false
    ) {
        // Tell search engines not to index this page or follow its links.
        echo '<meta name="robots" content="noindex, nofollow">' . "\n";
    }
}
// wp_head fires inside <head> on every front-end page load.
add_action('wp_head', 'custom_noindex_multiple_paths');

This code checks the current page’s URL. If it matches any of the unwanted patterns – like /week/ or a parameter like eventDate= – it inserts a <meta name="robots" content="noindex, nofollow"> tag into the page’s <head>.

Credit to SeoBatter for providing this example code in their excellent article, which covers issues similar to the ones discussed here.
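If you’d rather route the tag through Rank Math than echo it yourself, its rank_math/frontend/robots filter can apply the same logic. Here’s a minimal sketch, assuming you run Rank Math – verify the filter name and array keys against their current documentation:

add_filter('rank_math/frontend/robots', function ($robots) {
    // Same URL matching as above, but let Rank Math render the meta tag.
    $request_uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
    if (
        strpos($request_uri, '/week/') !== false ||
        strpos($request_uri, 'eventDate=') !== false
    ) {
        // Override whatever Rank Math would otherwise output for this URL.
        $robots['index']  = 'noindex';
        $robots['follow'] = 'nofollow';
    }
    return $robots;
});

This approach also avoids ending up with two conflicting robots meta tags on the same page (yours plus the one your SEO plugin prints).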

Notes & Cautions

  • The above code is just an example and is not exhaustive – to block every unnecessary URL path and parameter The Events Calendar generates, you’ll need to add all the variations.
  • You can expand the list by adding more strpos() lines – just remember to omit the || from the last condition in the list (or use the array-based variant after these notes, which sidesteps that entirely).
  • This method applies globally to any URL matching the defined strings. Be careful not to block URLs you actually want indexed!
  • After deploying this change, give Google some time to re-crawl affected URLs and update the index.
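On that second note, a slightly refactored version makes the pattern list easier to extend and avoids the trailing-|| pitfall altogether. It’s just a tidier sketch of the same logic – if you use it, replace the earlier snippet rather than adding both:

function custom_noindex_unwanted_urls() {
    // Add or remove patterns here; each is matched against the full request URI.
    $patterns = array(
        '/week/',
        '/month/',
        '/list/',
        '/photo/',
        '/summary/',
        'eventDate=',
        'eventDisplay=',
    );

    $request_uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';

    foreach ($patterns as $pattern) {
        if (strpos($request_uri, $pattern) !== false) {
            echo '<meta name="robots" content="noindex, nofollow">' . "\n";
            return; // One tag is enough; stop at the first match.
        }
    }
}
add_action('wp_head', 'custom_noindex_unwanted_urls');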

Remove URLs from the Index Using Google Search Console

If you’ve followed the previous steps – adding robots.txt disallow rules and noindex meta tags – Google will eventually drop the unwanted URLs from its index.

However, this can take weeks or even months, especially if a large number of URLs are involved.

To accelerate the process, you can use the Removals tool in Google Search Console.

How the Removals Tool Works

When you submit a removal request, Google will temporarily hide the URL(s) from search results for approximately 6 months.

But don’t worry – if you’ve correctly implemented noindex and/or blocked crawling via robots.txt, those URLs won’t come back once the temporary removal expires.

This makes the Removals tool a great way to clean things up quickly while your long-term fixes take effect.

[Screenshot: the Removals tool in Google Search Console]

When to Use It

  • You’ve implemented noindex/robots.txt changes but don’t want to wait for natural de-indexing.
  • You see significant numbers of thin or duplicate pages still appearing in Google search results.
  • You’re preparing for a site launch, relaunch, or cleanup and want to get ahead of potential indexing issues.

Bonus Tip: Clean Up Existing Internal Links

If you’ve added noindex meta tags to unwanted URLs, those pages shouldn’t remain in Google’s index – even if other pages on your site still link to them.

So technically, you don’t have to remove internal links pointing to these URLs.

But…

If you’re like me and prefer keeping your site’s architecture and crawl paths as clean as possible, it’s worth going the extra mile.

Why This Matters

  • Internal links are signals of importance – and you don’t want to waste that equity on URLs you’ve actively noindexed.
  • Internal links may cause bots to crawl those URLs more frequently, despite the noindex.
  • It helps prevent confusion or unwanted behavior from plugins, analytics tools, or internal site search engines.

How to Find and Fix These Links

Use a crawler like Screaming Frog SEO Spider to scan your site and find internal links that point to:

  • URLs with The Events Calendar parameters (like ?eventDate=, eventDisplay=, etc.)
  • Subfolder paths generated by The Events Calendar like /week/, /month/, /photo/, etc.

Then, update those links to point to the cleanest, most useful version of the page – or remove them entirely if they aren’t adding value.

Further Reading: Common Internal Linking Inefficiencies & How to Fix Them.

Keep Your Index Clean & Unlock the Full SEO Potential of The Events Calendar

The Events Calendar is a powerful WordPress plugin for managing and listing events – but like many plugins, it comes with some technical SEO baggage.

I’m a strong believer in maintaining a clean, focused index. That means eliminating unnecessary URLs, so that the pages you do want indexed get the attention (and ranking power) they deserve.

Cleaning up your site structure helps avoid duplicate content, thin content issues, and crawling inefficiencies – all of which can quietly drag down your SEO performance over time.

Contact me if you need help configuring your events website using The Events Calendar plugin to make sure your site is set up and optimized for long-term SEO success.

Hi, I'm Bart.

I'm an SEO enthusiast, strategist, consultant – you name it. I help businesses get in front of their ideal customers in search engines.

With 9+ years of SEO experience working with a wide variety of clients, agencies, business types, and verticals, I know a thing or two about how to rank.

In my free time, I like to travel, explore new places, meet interesting people, and try good food.

Contact me to have a chat.