Potential downside of removing past years' calendar events
-
Hi there. My website is for a school. We have several calendars for athletic events, school events, etc. There are thousands of events each year, each linking to its own page. The URLs and pages are generated dynamically, and they are very similar from year to year, so we're being penalized for duplicate content. I can delete past years' events in bulk, but there's no way to redirect them in bulk. Am I taking a big chance by deleting events that occurred prior to 1/1/2019?
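A note on the bulk-redirect problem: if the site runs on Apache (an assumption -- the platform isn't stated in the thread), date-patterned URLs can usually be redirected or retired in bulk with a single regex rule, so per-page redirects aren't needed. A sketch:

```apache
# Sketch, assuming Apache with mod_alias. One regex rule catches every
# /calendar/day/YYYY-MM-DD page dated 2010-2018 and 301s it to the
# main calendar, instead of one redirect per event:
RedirectMatch 301 ^/calendar/day/201[0-8]-\d{2}-\d{2}$ /calendar

# Alternative: 410 (Gone) tells crawlers the pages were removed on
# purpose, which tends to drop them from the index faster than a 404:
# RedirectMatch 410 ^/calendar/day/201[0-8]-\d{2}-\d{2}$
```

Either rule goes in the site's `.htaccess` or virtual-host config; which response (301 vs. 410) is better depends on whether the old pages have any backlinks worth consolidating.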
-
Thank you for your response.
The old pages give us very little traffic--as few as ONE page view over the past six months.
Here's an example:
https://www.landmarkschool.org/calendar/day/2018-01-06 1 page view (obviously NOT a lot of traffic)
https://www.landmarkschool.org/calendar/day/2019-01-06
Another issue is that these events are generated from the calendar but don't have /calendar/ in the URL, and I'm getting penalized for duplicate content.
https://www.landmarkschool.org/blue-day-47
https://www.landmarkschool.org/blue-day-48
https://www.landmarkschool.org/performing-arts-tech-weekend
https://www.landmarkschool.org/performing-arts-tech-weekend-0
https://www.landmarkschool.org/performing-arts-tech-weekend-1
https://www.landmarkschool.org/performing-arts-tech-weekend-2
Any advice is appreciated!
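For the numbered duplicates above (assuming the -0/-1/-2 pages really are near-copies of the same event, which the thread suggests but doesn't confirm), one standard fix is to pick a primary URL and point each variant at it with rel=canonical:

```html
<!-- Sketch: placed in the <head> of each duplicate variant, e.g.
     /performing-arts-tech-weekend-0, -1, and -2, pointing Google
     at the single page that should rank: -->
<link rel="canonical" href="https://www.landmarkschool.org/performing-arts-tech-weekend" />
```

This keeps all the pages live for visitors while consolidating the duplicate signals onto one URL.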
-
Do you have any examples of the URL structures? Also, do the old URLs give you any traffic?
Related Questions
-
Should I remove pages to concentrate link juice?
So our site is database-powered and used to have up to 50K pages in Google's index 3 years ago. After a redesign, that number was brought down to about 12K currently. Legacy URLs that now generate 404s have mostly been redirected to appropriate pages (some 13K 301 redirects currently). Trafficked content accounts for about 2K URLs in the end, so my question is, in the context of concentrating link juice on the most valuable pages, should I:
1. Remove non-important / least-trafficked pages from the site and just have them show 404
2. Noindex non-important / least-trafficked pages but still have them visible
3. 1 or 2 above, plus remove them from the index via Webmaster Tools
4. None of the above, but rather something else?
Thanks for any insights/advice!
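The noindex route mentioned above -- dropping the least-trafficked pages from the index while keeping them visible to users -- would look like this in each page's head (a sketch, not a recommendation between the options):

```html
<!-- Sketch: added to the <head> of each low-value page. "noindex"
     drops the page from the index; "follow" still lets crawlers pass
     link equity through its outbound links: -->
<meta name="robots" content="noindex, follow" />
```

Note that the page must remain crawlable (not blocked in robots.txt) for the tag to be seen and honored.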
Intermediate & Advanced SEO | StratosJets
-
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development link). I forgot to block dev.domain.com in my robots.txt file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots.txt and then proceeded to delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed in Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for it to recognize that I already deleted it?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Using the same content on different TLDs
Hi everyone, we have clients we are going to work with in different countries, but sometimes with the same language. For example, we might have a client in a competitive niche working in Germany, Austria, and Switzerland (Swiss German), i.e. we're going to potentially rewrite our website three times in German. We're thinking of using Google's hreflang tags with pretty much the same content -- is this a safe option? Has anyone actually tried this, successfully or otherwise? All answers appreciated. Cheers, Mel.
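The hreflang setup described in the question above might look like the following (the domains are hypothetical stand-ins): every page on each of the three sites carries the full set of alternate tags, so Google can map the right German variant to the right country even when the content is nearly identical:

```html
<!-- Sketch with hypothetical URLs; each tag names a language-region
     pair, and the same block appears on all three sites: -->
<link rel="alternate" hreflang="de-DE" href="https://example.de/" />
<link rel="alternate" hreflang="de-AT" href="https://example.at/" />
<link rel="alternate" hreflang="de-CH" href="https://example.ch/" />
<link rel="alternate" hreflang="x-default" href="https://example.de/" />
```

The tags must be reciprocal -- if any site omits the block, Google may ignore the annotations entirely.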
Intermediate & Advanced SEO | dancape
-
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block them because our clients want to be able to view these applications without needing to log in. What is the next best solution?
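One pattern worth noting for the situation above: robots.txt only blocks crawling, not indexing, and a crawler that is blocked can never see a noindex instruction, which is why blocked URLs keep reappearing. A sketch, assuming the test subdomains run on Apache -- serve a noindex header from the subdomain's config and leave robots.txt open so Google can actually fetch the pages:

```apache
# Sketch for a test subdomain's vhost or .htaccess (assumes Apache
# with mod_headers). robots.txt must NOT block these URLs, or the
# crawler never fetches them and never sees the noindex header:
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

The header works for all file types (PDFs, images, etc.), which a meta robots tag cannot cover.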
Intermediate & Advanced SEO | nicole.healthline
-
Can I use rel=canonical and then remove it?
Hi all! I run a ticketing site and I am considering using rel=canonical temporarily. In Europe, when someone is looking for tickets for a soccer game, they search differently depending on which city hosts the game. I.e.: "liverpool arsenal tickets" - game played in the 1st leg in 2012; "arsenal liverpool tickets" - game played in the 2nd leg in 2013. We have two different events, each with its own unique text, but sometimes Google ranks the 2013 one ahead of the closer one, especially for queries without dates or years. I don't want to remove the second game from our site - exceptionally, some people browse our website and buy tickets months in advance. So I am considering placing a rel=canonical on the game played in 2013 pointing to the game played in a few weeks. After that, I would remove it. Would that make any sense? Thanks!
Intermediate & Advanced SEO | jorgediaz
-
Removing Dynamic "noindex" URLs from the Index
6 months ago my client's site was overhauled, and the user-generated search pages had an index tag on them. I switched that to noindex, but not fast enough to avoid having hundreds of pages indexed in Google. It's been months since switching to the noindex tag and the pages are still indexed. What would you recommend? Google crawls my site daily - but never the pages that I want removed from the index. I am trying to avoid submitting hundreds of these dynamic URLs to the removal tool in Webmaster Tools. Suggestions?
Intermediate & Advanced SEO | BeTheBoss
-
How do you get around the Google Removal Tool not removing redirected and 404 pages? Or if you don't know the anchor text?
Hello! I can't get squat for an answer in the GWT forums. Should have brought this problem here first... The Google Removal Tool doesn't work when the original page you're trying to get recached redirects to another site. Google still reads the site as being okay, so there is no way for me to get the cache reset, since I don't know what text was previously on the page. For example, this: http://0creditbalancetransfer.com/article375451_influencial_search_results_for_.htm redirects to this: http://abacusmortgageloans.com/GuaranteedPersonaLoanCKBK.htm?hop=duc01996 I don't even know what was on the first page. And when it redirects, I have no way of telling Google to recache the page. It's almost as if the site got deindexed and they put in a redirect. Then there is crap like this: http://aniga.x90x.net/index.php?q=Recuperacion+Discos+Fujitsu+www.articulo.org/articulo/182/recuperacion_de_disco_duro_recuperar_datos_discos_duros_ii.html No links to my site are on there, yet Google's indexed links say that the page is linking to me. It isn't, but because I don't know HOW the page changed text-wise, I can't get the page recached. The tool also doesn't work when a page 404s. Google still reads the page as being active, but it isn't. What are my options? I literally have hundreds of such URLs. Thanks!
Intermediate & Advanced SEO | SeanGodier