Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, so I decided not to put redirects on some of the irrelevant pages. People still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value but do interfere with my new relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages wouldn't show up above my new content?
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
-
Sending you a PM
-
You are welcome!
You still get that traffic during the move, and it's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends & clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated website.
Note: making a basketball/shoe analogy... just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that's old content.
You have several options to go for here:
- Generate a full sitemap that assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on those pages you say is outdated. I think Google prefers serving pages that have huge value over ones that are merely "new", so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that include the updated info. You can even put something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, need no 301s, and pass some juice to the new page (see the sketch right after this list).
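To illustrate that banner idea, here is a minimal sketch, assuming the old pages are static HTML files sitting in one folder and each has a plain <body> tag. The folder path, the notice markup, and the target URL are placeholders, not anything from your actual site:

```python
# Hypothetical sketch: inject an "outdated content" notice into old static pages.
# The folder path, banner markup, target URL, and UTF-8 encoding are assumptions.
from pathlib import Path

OLD_PAGES_DIR = Path("public_html/archive")    # placeholder: where the ~400 old pages live
NEW_URL = "https://www.example.com/new-line/"  # placeholder: the up-to-date page to send visitors to

BANNER = (
    '<div class="outdated-notice">'
    'This content is outdated. For the up-to-date version, '
    f'<a href="{NEW_URL}">click here</a>.'
    '</div>'
)

for page in OLD_PAGES_DIR.glob("*.html"):
    html = page.read_text(encoding="utf-8")
    if "outdated-notice" in html:
        continue  # already patched on an earlier run
    # Assumes a bare <body> tag; adjust the match if the old templates use attributes.
    patched = html.replace("<body>", "<body>\n" + BANNER, 1)
    page.write_text(patched, encoding="utf-8")
```

Run once over the archive folder, the old pages keep their backlinks and rankings while quietly handing visitors over to the current content.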
I think the best would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the "old" pages have content that would lose its value if you updated it.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned automatically assigns priority based on how deep each page is in your site (how many links it had to follow to reach that page; older pages will surely need more clicks to reach them).
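For what it's worth, that depth-based priority idea boils down to something like this rough sketch. The URLs and the 0.2-per-level scale are made up for illustration; it isn't xml-sitemaps.com's actual algorithm:

```python
# Illustrative only: assign sitemap priority from how deep a URL sits in the site.
# The URL list and the scale are assumptions for the example.
from urllib.parse import urlparse

urls = [
    "https://www.example.com/",
    "https://www.example.com/shoes/",
    "https://www.example.com/shoes/archive/1995/barkley.html",
]

def priority_for(url: str) -> float:
    depth = len([seg for seg in urlparse(url).path.split("/") if seg])
    return round(max(0.1, 1.0 - 0.2 * depth), 1)  # homepage 1.0, deep archive pages near 0.1

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for url in urls:
    print(f"  <url><loc>{url}</loc><priority>{priority_for(url)}</priority></url>")
print("</urlset>")
```

The old, deeply buried pages end up with a low priority while the newer, shallower pages sit near the top, and nothing has to be left out of the sitemap.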
Hope that helps.