How To: De-Index Time Sensitive Content Using Google Tag Manager
Whilst doing some Google Tag Manager consultancy recently, I was working on a client’s website which listed a large number of events going back several years. The vast majority of these had since passed, so the content was effectively redundant.
The client wanted to keep the events listed to show that their venue was a vibrant hub of activity (and because no one could be bothered taking the time to remove them all and redirect the old URLs).
From my point of view, there was no reason to keep these events indexed: they were no longer relevant, and de-indexing them would keep the site’s indexed content “fresh” and minimal, rather than having search engines crawl and index unnecessary pages.
It was therefore decided to noindex any event that had passed. Going through each event in the CMS and noindexing them individually would have been an arduous task, and no doubt one that would soon be forgotten about, leaving more redundant content indexed.
So, I came up with a solution to use Google Tag Manager to detect the date of the event, and if it had passed, then add a noindex tag to the page.
The first step was to grab the event date and time from each page. This was done by getting the datetime attribute from the relevant div on the page – to do this, we set up a variable which pulled in the DOM element using a CSS selector, as follows:
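The variable configuration itself is done in the GTM interface (Variable Type: DOM Element, Selection Method: CSS Selector, with the Attribute Name set to datetime), but the equivalent lookup in plain JavaScript would look something like the below – note the selector and markup here are assumptions, so adjust them to whatever your event pages actually use:

// Assumed markup: <div class="event-detail"><time datetime="2019-06-01T19:30:00">...</time></div>
// GTM's DOM Element variable performs the equivalent of this lookup and
// returns the datetime attribute of the first matching element.
var eventDateString = document.querySelector('.event-detail time').getAttribute('datetime');
// e.g. "2019-06-01T19:30:00"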
I then created a trigger that detected when an event was being viewed as follows:
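The exact trigger will depend on your URL structure – assuming event pages live under an /events/ path (an assumption about this particular site), a DOM Ready page view trigger along these lines does the job, with DOM Ready ensuring the date element exists before the tag fires:

Trigger Type: Page View – DOM Ready
This trigger fires on: Page Path contains /events/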
And that in turn fired the following custom HTML tag:
The code used is as follows:
<script>
// Returns true if firstDate falls on or before secondDate, ignoring the time
// portion – setHours(0, 0, 0, 0) zeroes the time and returns the timestamp.
var dateInPast = function(firstDate, secondDate) {
  return firstDate.setHours(0, 0, 0, 0) <= secondDate.setHours(0, 0, 0, 0);
};

// GTM injects a variable's value into a Custom HTML tag as literal text, so
// {{Event Date}} needs to be wrapped in quotes to remain a valid JS string.
var event_date = new Date('{{Event Date}}');
var today_date = new Date();
var in_the_past = dateInPast(event_date, today_date);

if (in_the_past) {
  // Remove any existing robots meta tag, then insert a noindex one.
  jQuery('meta[name="robots"]').remove();
  var meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex, follow, noarchive';
  jQuery('head').append(meta);
}
</script>
Here, a function takes the event date and compares it to today’s date – setHours(0, 0, 0, 0) zeroes out the time portion, so only the dates themselves are compared. If the event date is in the past, the function returns true, and the tag removes any existing robots meta tag and inserts a new one.
If your site doesn’t use jQuery, you can use this code instead to remove any existing robots tag and add the new one:
// Remove any existing robots tag first, then append the noindex tag.
var existing = document.querySelector('meta[name="robots"]');
if (existing) existing.parentNode.removeChild(existing);
var metatag = document.createElement('meta');
metatag.setAttribute('name', 'robots');
metatag.setAttribute('content', 'noindex, follow, noarchive');
document.getElementsByTagName('head')[0].appendChild(metatag);
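Whichever version you use, it’s worth sanity-checking the result in GTM’s preview mode, or by pasting the line below into the browser console on a past event’s page:

document.querySelector('meta[name="robots"]').content;
// expected: "noindex, follow, noarchive"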
And that’s it. For us, this was a good all-round solution that meant minimal housekeeping for the client, and kept the site’s index fresh and up to date.