Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years. I decided not to set up redirects on some of the irrelevant pages, so people still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value but do interfere with my new relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
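For reference, a priority-zero entry in a standard sitemaps.org XML file looks roughly like the output of this minimal sketch; the URL is only a placeholder, and the snippet just uses Python's standard library to build it:

```python
# Minimal sketch (placeholder URL): build a single priority-0 sitemap entry.
# Per the sitemaps.org protocol, priority ranges from 0.0 to 1.0 and defaults
# to 0.5 when omitted.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://www.example.com/archive/1995/old-page.html"
ET.SubElement(url, f"{{{NS}}}priority").text = "0.0"

print(ET.tostring(urlset, encoding="unicode"))
# Output (wrapped for readability):
# <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
#   <url><loc>https://www.example.com/archive/1995/old-page.html</loc>
#        <priority>0.0</priority></url>
# </urlset>
```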
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
-
Sending you a PM
-
You are welcome!
You still get that traffic in the meantime. It's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps.com works great; I suggest it to friends & clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated site.
Note: Making a basketball/shoe analogy... Just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't actually sell shoes, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of those pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap alone will work. Google serves whatever it thinks is best for the user, even if that's old content.
You have several options to go for here:
- Generate a full sitemap automatically with a tool that assigns priority for you, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on those pages you say is outdated. I think Google prefers serving pages with real value over merely "new" ones, so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even put something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, need no 301s, and pass some juice to the new page (see the sketch right after this list).
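If you have hundreds of static legacy pages, a small script can retrofit that kind of notice in one pass. A minimal sketch follows; the folder name, target URL, and CSS class are hypothetical, and all it does is inject a notice right after the opening <body> tag:

```python
# Minimal sketch (hypothetical folder, URL, and class name): inject an
# "outdated content" notice into every static legacy HTML page in a folder.
from pathlib import Path
import re

NOTICE = (
    '<div class="legacy-notice">This content is outdated. '
    'For the up-to-date version, see '
    '<a href="https://www.example.com/current-page/">our current page</a>.</div>'
)

def inject_notice(html: str) -> str:
    # Place the notice immediately after the opening <body ...> tag.
    return re.sub(r"(<body[^>]*>)", r"\1" + NOTICE, html, count=1, flags=re.IGNORECASE)

for page in Path("legacy_pages").glob("**/*.html"):
    original = page.read_text(encoding="utf-8", errors="ignore")
    if "legacy-notice" not in original:  # skip pages already patched on a re-run
        page.write_text(inject_notice(original), encoding="utf-8")
```

Run it against a copy of the legacy folder first; the "legacy-notice" check keeps the script from inserting the banner twice.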
I think the best would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the "old" pages contain something that would lose its value if updated.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned automatically assigns priority based on how deep the page is in your site (how many links it had to follow to reach that page; older pages will surely need more clicks to reach).
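To illustrate the depth-based idea, here is a minimal sketch of how a crawler-style generator might map URL depth to priority. The URLs and the 0.3-per-level step are purely illustrative, not the actual formula any specific tool uses:

```python
# Minimal sketch (illustrative URLs and step size): derive sitemap priority
# from how many path segments deep a URL sits, clamped to a 0.1 floor so old,
# deeply buried pages stay in the sitemap with a low priority.
urls = [
    "https://www.example.com/",
    "https://www.example.com/shoes/",
    "https://www.example.com/shoes/new-model/",
    "https://www.example.com/archive/1995/barkley-page.html",
]

def priority_for(url: str) -> float:
    depth = url.rstrip("/").count("/") - 2  # path segments after the domain
    return max(0.1, round(1.0 - 0.3 * depth, 1))

for u in urls:
    print(f"{priority_for(u):.1f}  {u}")
# 1.0  https://www.example.com/
# 0.7  https://www.example.com/shoes/
# 0.4  https://www.example.com/shoes/new-model/
# 0.1  https://www.example.com/archive/1995/barkley-page.html
```

A crawler-based tool would count clicks from the homepage rather than path segments, but the effect is similar: deep legacy pages get a low priority without being left out of the sitemap.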
Hope that helps.