Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years. I decided not to put redirects on some of the irrelevant pages, so people still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages wouldn't show up above my new content?
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
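For reference, this is roughly what I'd have in mind for each old page's entry (the URL and date are just placeholders):

```xml
<url>
  <loc>https://example.com/old-page-1995.html</loc>
  <lastmod>1995-06-01</lastmod>
  <priority>0.0</priority>
</url>
```

Though from what I've read, search engines may ignore the `<priority>` field entirely, which is part of why I'm asking.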
Thx
-
Sending you a PM
-
You are welcome!
Still, you get that traffic in the meantime. It's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends and clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution: I'll wrap the old pages with my up-to-date header/footer and just make some banners that direct traffic to the updated website.
Note: to make a basketball/shoe analogy... just assume I'm selling Nike shoes and traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes anymore, and my query reports and analytics show people aren't searching for Barkley, but because of the age and trust of those pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is better for the user, even if that is old content.
You have several options here:
- Generate a full sitemap that assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion, and well worth the money).
- Update the content on those pages you say is outdated. I think Google prefers serving pages that have huge value over pages that are merely "new", so updating the content of those pages may decrease your bounce rate.
- While on the old pages, link to the new posts that include the new info. You can even put something like "This content is outdated, for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, no 301s, and you pass some juice to the new page.
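A minimal version of that notice could look like this (the link target and class name are just placeholders):

```html
<div class="outdated-notice">
  <p>This content is outdated. For the up-to-date version,
     see <a href="/new-shoes.html">the current page</a>.</p>
</div>
```

Putting it near the top of the old page makes it easy for visitors to self-select into the new content without you having to touch the old copy itself.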
I think the best would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the content of the "old" pages would lose its value by being updated.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned automatically assigns priority according to how deep the page is in your site (the number of links it had to follow to reach the page; older pages will surely need more clicks to reach them).
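Roughly, the idea looks like this. The exact formula xml-sitemaps.com uses isn't public, so the weights below are made up purely for illustration:

```python
from urllib.parse import urlparse

def priority_for(url, min_priority=0.1):
    """Assign a sitemap priority based on URL depth: deeper pages get less.

    The 0.2-per-level step and the floor of 0.1 are illustrative guesses,
    not the actual values any sitemap generator uses.
    """
    # Count non-empty path segments as a proxy for "clicks from the homepage".
    depth = len([seg for seg in urlparse(url).path.split("/") if seg])
    return max(round(1.0 - 0.2 * depth, 1), min_priority)

print(priority_for("https://example.com/"))                 # 1.0
print(priority_for("https://example.com/a/b/c/page.html"))  # 0.2
```

Old archive pages buried several directories deep would bottom out at the floor value, while the homepage and top-level pages keep the highest priority.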
Hope that helps.