Hi Martijn,
Thank you for responding. I think canonical tags are the best way forward; I'm looking forward to explaining to the web developer that we need several hundred tags implemented!
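For reference, each duplicate page only needs one extra line in its <head> pointing at the preferred URL, along these lines (the URL below is just a placeholder):

    <link rel="canonical" href="https://www.example.com/jobs/preferred-listing/">

So although there are several hundred of them, most CMSs can template this rather than each tag being hand-coded.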
Many thanks
Our client is a recruitment agency whose website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, the old job listing pages still appear to be indexed, so our assumption is that Google is holding the rankings down because of the amount of duplicate content still present (one tool reported 43% duplicate content across the site).
Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file.
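For illustration, that sort of robots.txt rule is usually only a couple of lines; the /jobs/ path below is a hypothetical example:

    User-agent: *
    Disallow: /jobs/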
Would blocking the job listing pages from being indexed, either via robots.txt or a noindex tag, reduce the negative impact of the duplicate content? And would it also cut off any link juice flowing to those pages?
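To frame the question: the noindex route would mean a meta tag in the <head> of each listing, e.g.

    <meta name="robots" content="noindex, follow">

where "follow" should in theory still let link equity pass through the page. One wrinkle we're aware of is that a page blocked in robots.txt can't be crawled, so Google would never see a noindex tag on it; presumably the two approaches shouldn't be combined on the same URLs.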
In addition, expired job listing URLs stay live, which is likely adding to the overall duplicate content. Would it be worth removing these pages and serving 404s, given that any links pointing to them would be lost? And if the pages are removed, is it possible to permanently deindex those URLs?
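For the expired listings, an alternative we've seen suggested is returning a 410 Gone rather than a 404, which is reported to drop URLs from the index slightly faster. A minimal sketch, assuming (hypothetically) the expired listings sit under a /jobs/expired/ path on nginx:

    location ^~ /jobs/expired/ {
        return 410;
    }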
Any help is greatly appreciated!
Hi Jared,
That's very helpful and your response is greatly appreciated! From your experience, what sort of time frame would you expect between implementing these signals, the pages being reindexed, and seeing an effect on rankings?
Many thanks
Is the time on page more than 0 seconds? If so, it is unlikely to be bot traffic.
To check that this isn't bot traffic, what are the engagement stats for these sessions (i.e. session duration / pages per session)? If these metrics show zero engagement, it is possible that this is bot traffic.
Alternatively, could this be someone within your organisation or your web developer testing some features?
For one of my SEO campaigns, Google is using the website's home page as the landing page for the majority of the search terms being tracked. The website splits its products by region, so we want the specific region pages to rank for search terms related to their region rather than the home page. We have optimised each regional page to a reasonably high standard and ensured there is a good amount of internal linking and signposting to those region pages; however, Google is still surfacing the home page. The only complication is that for the first few months these pages carried canonical tags pointing to the home page. Those were removed around 3 months ago, and we've checked that the region pages are indexed properly.
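For anyone wanting to reproduce that check, the live canonical on a region page can be inspected directly, e.g. (the URL is a placeholder, and this assumes the tag is server-rendered rather than injected by JavaScript):

    curl -s https://www.example.com/regions/north-west/ | grep -i canonical

which should now return either a self-referencing canonical or nothing at all, rather than a tag pointing at the home page.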
Is there anything we are missing?
Has anyone had any success in getting Google to change its landing pages?