Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that have been outdated for 10-15+ years, and I decided not to put redirects on some of the irrelevant pages. People still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they interfere with my new, relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
Currently the old stuff is excluded from all sitemaps... I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
-
Sending you a PM
-
You are welcome!
You still get that traffic in the meantime. It's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script but agree xml-sitemaps works great. I suggest that to friends & clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated site.
Note: making a basketball/shoe analogy... just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't actually sell shoes, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, the engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that's old content.
You have several options here:
- Generate a full sitemap with a tool that assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on the pages you say are outdated. I think Google prefers serving pages that have real value over pages that are merely "new"; therefore, updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that include the new info. You can even put something like "This content is outdated, for the up-to-date version, click here" and link to the most appropriate new page (see the sketch below). You keep the page, use no 301s, and pass some juice to the new page.
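A minimal sketch of that kind of notice, assuming hypothetical URLs and class names:

```html
<!-- Hypothetical outdated-content notice; the URL and class name are placeholders -->
<div class="outdated-notice">
  <p>
    This content is outdated. For the up-to-date version, see
    <a href="https://www.example.com/current-page/">our current page</a>.
  </p>
</div>
```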
I think the best approach would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the "old" pages have content that would lose its value if updated.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned assigns priority automatically based on how deep a page sits in your site (the number of links it had to follow to reach that page; older pages will usually take more clicks to reach).
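For reference, here's roughly what deprioritized entries look like under the sitemaps.org protocol (the URLs are made up; priority runs from 0.0 to 1.0, with 0.5 as the default):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Current, high-value page -->
  <url>
    <loc>https://www.example.com/current-products/</loc>
    <priority>0.8</priority>
  </url>
  <!-- Old archive page kept for its backlinks; 0.0 is also a valid value -->
  <url>
    <loc>https://www.example.com/1995/archive-page/</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```

Keep in mind that Google treats the priority field as a hint at best, so don't count on it alone to push the old pages below your new content.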
Hope that helps.
Related Questions
-
What kind of impact does a 404 in a sitemap have on ranking?
We recently had a site update where our robots file disallowed our sitemap for about two weeks. When we found the problem and resubmitted the sitemap to Google Search Console, it found a 404 error. Does this have any impact on ranking or visibility if we are still recovering from the disallow?
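For illustration, that kind of accidental block usually looks something like this in robots.txt (the paths here are hypothetical):

```text
# Before (broken): the sitemap URL itself was disallowed
User-agent: *
Disallow: /sitemap.xml

# After (fixed): nothing disallowed, and the sitemap advertised explicitly
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```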
-
Google's stance on LSI keywords?
Hi all, So the keywords that appear as you type a search, and the suggested keywords at the bottom of the search results page, are referred to as LSI keywords. I've been noticing some of the LSI keywords related to our industry for years, and Google has now suddenly changed them. I wonder why that would be. I can see competitors have started using those LSI keywords widely; is that the reason Google changed them? Thanks
-
Does Google’s Algorithm Populate Answer Boxes with Its Own Independent Research?
If you search 'best games to play for youtube', you get an answer box with answers pulled independently from the article at hand. Here's an image: http://imgur.com/a/S0j9B Here are all the games from my article, in the order in which they appear. Google's chosen games for the Answer Box are bolded: Battlefield 1, Bloodborne, GTA V, FIFA 16, TrackMania Turbo, Garry’s Mod, League of Legends, Call of Duty: Black Ops III, Tom Clancy’s The Division, Overwatch, Just Cause 3, Counter-Strike: Global Offensive, Brawlhalla, Rocket League, Dark Souls III, Unravel, Firewatch, GoldenEye 007 (this was put in as a joke, but coded as an H2 nonetheless), Destiny, Dead by Daylight, Fallout 4, Undertale, No Man’s Sky, Minecraft. As you can see, Google is choosing which games to display to its searchers. My Crazy Egg data shows that these were not picked by click volume (each of these H2s is hyperlinked), which means Google must be using some other popularity metric, such as its own search volume data or external sales data. I wrote this up in a post on my site, for anybody who's curious.
-
One of my pages doesn't appear in Google's search
Our page has been indexed (I just checked) but literally doesn't exist in the first 300 results despite having a respectable DA & PA. Is there something I can do? There's no reason why this specific page doesn't rank, as far as I can see. It's not a new page. Cheers, Rhys
-
Case-Sensitive URL Redirects for SEO
We want to use a 301 redirect rule to redirect all pages to a lowercase URL format. A 301 passes along most of the link juice... most. Will we see a negative impact on PageRank/SERPs when we redirect every single page on our site?
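In case it helps anyone landing here, a minimal sketch of the usual Apache approach (this assumes Apache with mod_rewrite; nginx or IIS need different syntax, and the RewriteMap line must live in the server or vhost config, not .htaccess):

```apache
# Define a lowercasing map (allowed only in server/vhost config, not .htaccess)
RewriteMap lc int:tolower

RewriteEngine On
# Only redirect when the requested path contains an uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
# 301 the mixed-case URL to its lowercase equivalent
RewriteRule (.*) ${lc:$1} [R=301,L]
```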
-
Does changing website design reduce traffic? I don't think so.
Hi, Around November I was working on the website. For various reasons I had to change the design of the site. I saw my traffic going down and down (70-100/day), so I rolled back to the previous design. After that it improved a little, but not to the previous level (traffic 250-300/day). Question: if all URLs, content, and links are the same, how can that affect traffic? We have removed all the errors shown in the SEOmoz report, but traffic is still an issue. We are working hard on the SEO side and trying to recover. Your suggestions would be helpful, so I am looking forward to your answers on how I can overcome this. Thanks & Regards
-
Google is indexing my website's search results pages. Should I block this?
After running the SEOmoz crawl test, I have a spreadsheet of 11,000 URLs, of which 6,381 are search results pages from our website that have been indexed. I know I've read that /search should be blocked from the engines, but I can't seem to find that information at this point. Does anyone have facts behind why they should be blocked? Or not blocked?
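For what it's worth, the usual way to keep crawlers out of internal search results is a robots.txt rule along these lines (this assumes your results live under /search; adjust the path to match your site):

```text
# Keeps well-behaved crawlers out of internal search result pages
User-agent: *
Disallow: /search
```

One caveat: Disallow stops crawling, not indexing. Pages that are already indexed may also need a noindex meta tag, and Google has to be able to crawl a page to see that tag, so apply noindex first and add the Disallow once the pages have dropped out.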
-
Do we need to worry about where our domain is hosted anymore? Does it make a difference?
I went to a really interesting conference last week, and one of the speakers, who has been working in the SEO industry for 15 years now, said that it doesn't make a difference ranking-wise anymore. I would like to see what the community thinks on this subject. Thanks, Ari