Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that have been outdated for 10-15+ years. I decided not to put redirects on some of the irrelevant pages, so people still hit those pages, but they bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they interfere with my new, relevant content.
If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
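To be concrete, what I'm picturing is a separate legacy-only sitemap where every old page gets the lowest priority the protocol allows, roughly like this (just a sketch; the URLs and the filename are made up, not my real pages):

```python
# Sketch of a separate "legacy" sitemap where every old URL gets the lowest
# priority (0.0 on the protocol's 0.0-1.0 scale). The URLs and the output
# filename below are made-up placeholders.
old_urls = [
    "https://www.example.com/1995/old-page-1.html",
    "https://www.example.com/1996/old-page-2.html",
    # ...roughly 400 of these in reality
]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{url}</loc>\n"
    "    <priority>0.0</priority>\n"
    "  </url>"
    for url in old_urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>\n"
)

with open("sitemap-legacy.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```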
Thx
-
Sending you a PM
-
You are welcome!
You still get that traffic during the move. It's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends & clients who need an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated website.
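Roughly what I have in mind for the wrap, as a quick sketch (the paths, template files, and banner markup are placeholders, not my real files):

```python
# Rough sketch of wrapping the ~400 legacy pages with the current header/footer
# plus a banner pointing visitors at the updated site. All paths, template
# files, and markup below are placeholders, not the real ones.
from pathlib import Path

HEADER = Path("templates/header.html").read_text(encoding="utf-8")  # hypothetical file
FOOTER = Path("templates/footer.html").read_text(encoding="utf-8")  # hypothetical file
BANNER = (
    '<div class="legacy-banner">'
    'Looking for current info? <a href="/new-section/">See the updated pages</a>.'
    '</div>'
)

legacy_dir = Path("public_html/legacy")  # hypothetical location of the old pages

for page in legacy_dir.glob("*.html"):
    body = page.read_text(encoding="utf-8")
    # Naive version: just drop the new chrome around whatever the old page holds.
    wrapped = f"{HEADER}\n{BANNER}\n{body}\n{FOOTER}"
    page.write_text(wrapped, encoding="utf-8")
```

A real pass would strip each page's old header/footer before adding the new chrome, and I'd run it on copies first, but that's the general idea.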
Note: To make a basketball/shoe analogy... just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell shoes, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, the engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that is old content.
You have several options here:
- Make a full sitemap with a tool that assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on the pages you say are outdated. I think Google prefers serving pages with real value over pages that are merely new, so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even add something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, avoid 301s, and still pass some link juice to the new page.
I think the best approach would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the "old" pages contain something that would lose its value if updated.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned assigns priority automatically based on how deep a page sits in your site (the number of links it had to follow to reach that page; older pages will usually take more clicks to reach).
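I don't know exactly how the tool calculates it, but the idea is roughly something like this (a quick sketch to illustrate depth-based priority, not their actual code):

```python
# Sketch of depth-based priority: the deeper a URL sits in the site (more path
# segments, i.e. more clicks from the homepage), the lower the priority.
from urllib.parse import urlparse

def priority_for(url: str) -> float:
    depth = len([seg for seg in urlparse(url).path.split("/") if seg])
    # Homepage gets 1.0; knock 0.2 off per level, bottoming out at 0.1.
    return max(0.1, round(1.0 - 0.2 * depth, 1))

for url in [
    "https://www.example.com/",
    "https://www.example.com/for-sale/new-widget/",
    "https://www.example.com/archive/1996/old-product-page.html",
]:
    print(priority_for(url), url)  # -> 1.0, 0.6, 0.4
```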
Hope that helps.