Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that have been outdated for 10-15+ years, so I decided not to put redirects on some of the irrelevant pages. People still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for the people looking for it, but maybe these old pages would stop showing up above my new content?
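To be concrete, here's roughly what I'm picturing for one of the old pages (the URL and date are placeholders I made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- hypothetical old page; loc and lastmod are placeholders -->
  <url>
    <loc>https://www.example.com/archive/1996-barkley.html</loc>
    <lastmod>1997-01-15</lastmod>
    <priority>0.0</priority>
  </url>
</urlset>
```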
Currently the old stuff is excluded from all sitemaps... I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
-
Sending you a PM
-
You are welcome!
Still, you get that traffic in the meantime, and it's free traffic, so try to make the most of it. Find the best way to point those visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends & clients needing an easy solution.
Given the analytics, I didn't want to update roughly 400 pages by hand. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer and just make some banners that direct traffic to the updated website.
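In case it helps anyone else, here's a rough sketch of how I plan to batch-wrap them, assuming the old pages are flat HTML files with a bare <body> tag (the banner markup and folder name are placeholders):

```python
from pathlib import Path

# Placeholder banner; the real one would use the site's updated header markup.
BANNER = (
    '<div class="legacy-banner">You are viewing an archived page. '
    '<a href="https://www.example.com/">See the current site</a></div>'
)

def wrap_legacy_page(path: Path) -> None:
    """Inject the banner just after <body> in one old HTML file."""
    html = path.read_text(encoding="utf-8", errors="replace")
    if "legacy-banner" in html:
        return  # already wrapped, so the script is safe to re-run
    # Assumes the 1990s pages use a bare <body> tag with no attributes.
    path.write_text(html.replace("<body>", "<body>\n" + BANNER, 1), encoding="utf-8")

for page in Path("old-pages").rglob("*.html"):  # placeholder folder name
    wrap_legacy_page(page)
```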
Note: to make a basketball/shoe analogy... just assume I'm selling Nike shoes and traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes anymore, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that is old content.
You have several options here:
- Generate a full sitemap automatically, with priorities assigned for you, like the one provided by xml-sitemaps.com (incredible software in my personal opinion and well worth the money).
- Update the content on the pages you say are outdated. I think Google prefers serving pages that have huge value over pages that are merely "new", so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even put something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page (see the sketch after this list). You keep the page, avoid 301s, and still pass some juice to the new page.
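A minimal sketch of that notice, with placeholder wording and a made-up URL:

```html
<!-- Placeholder wording and URL; style it to match the site -->
<p class="outdated-notice">
  This content is outdated. For the up-to-date version,
  <a href="https://www.example.com/current-version/">click here</a>.
</p>
```

A plain crawlable link like this is what passes the juice; just make sure it isn't nofollowed.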
I think the best would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if the "old" pages hold something that updating them would lose.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned assigns priority automatically based on how deep a page sits in your site (how many links it had to follow to reach that page; older pages will surely take more clicks to reach).
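Their exact formula isn't published, but here's a minimal sketch of the depth-based idea, where the 0.2 of priority per click is my own assumption purely for illustration:

```python
from collections import deque

def priorities_by_depth(links: dict[str, list[str]], home: str) -> dict[str, float]:
    """BFS from the homepage; each extra click costs 0.2 priority (floor 0.1)."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return {page: max(round(1.0 - 0.2 * d, 1), 0.1) for page, d in depth.items()}

# Tiny hypothetical site: home -> /new/ -> /1996/barkley/
site = {"/": ["/new/"], "/new/": ["/1996/barkley/"]}
print(priorities_by_depth(site, "/"))
# {'/': 1.0, '/new/': 0.8, '/1996/barkley/': 0.6}
```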
Hope that helps.