Sitemap Question - Should I exclude old URLs or make a separate sitemap for them?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, so I decided not to drop redirects on some of the irrelevant pages. People still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
-
Sending you a PM
-
You are welcome!
Still capture that traffic during the move. It's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends & clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution: I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated website.
Note: To make a basketball/shoe analogy... just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes anymore, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, the engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves whatever it thinks is best for the user, even if it is old content.
You have several options here:
- Generate a full sitemap that assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on those pages you say is outdated. I think Google prefers serving pages that have real value over merely "new" ones, so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even add something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, need no 301s, and pass some juice to the new page.
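As a rough sketch, that outdated-content notice could be as simple as a static block added to the old pages' shared header (the class name and target URL here are placeholders, not anything from the original site):

```html
<!-- Banner for legacy pages: keep the page live (no 301), but point
     visitors and internal link equity at the current version. -->
<div class="outdated-notice">
  <p>This page is kept for archival purposes and may be outdated.
     For the up-to-date version, see
     <a href="https://example.com/current-page/">our current page</a>.</p>
</div>
```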
I think the best would be to use the 1st and 2nd options in conjunction. Or the 1st and 3rd, if the "old" pages contain something that would lose its value if updated.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned assigns priority automatically based on how deep the page sits in your site (the number of links it had to follow to reach that page; older pages will surely take more clicks to reach).
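To illustrate the depth-based priority idea, here is a minimal sketch that assigns a sitemap priority from URL depth and emits the entries. The URLs and the exact formula are illustrative assumptions, not how xml-sitemaps.com actually computes priority:

```python
from urllib.parse import urlparse

def priority_for(url: str) -> str:
    """Derive a sitemap <priority> from URL depth: each extra path
    segment lowers the value, mimicking depth-based generators."""
    depth = len([seg for seg in urlparse(url).path.split("/") if seg])
    return f"{max(0.0, 1.0 - 0.3 * depth):.1f}"

def sitemap_entry(url: str) -> str:
    return (f"  <url>\n"
            f"    <loc>{url}</loc>\n"
            f"    <priority>{priority_for(url)}</priority>\n"
            f"  </url>")

# Hypothetical URLs: a home page, a current section, and a deep legacy page.
pages = [
    "https://example.com/",
    "https://example.com/shoes/",
    "https://example.com/archive/1995/barkley.html",
]
xml = ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
       + "\n".join(sitemap_entry(p) for p in pages)
       + "\n</urlset>")
print(xml)
```

With this formula the home page gets priority 1.0, the section page 0.7, and the three-levels-deep 1995 page 0.1, so the legacy pages stay in the sitemap but are flagged as low priority rather than excluded.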
Hope that helps.