301 / 404 & Getting Rid of Keyword Pages
-
I had a feeling that my keyword-focused pages were causing my site not to rank well. I don't have that many keywords: 2 main keyword phrases along with 6 city locations. For example (fake): "tea house tampa," "tea house clearwater," "tea house sarasota," and "tea room tampa," "tea room clearwater," "tea room sarasota." So I don't feel I need that many pages; I think I can optimize my home page and maybe 1 or 2 topic pages. Right now I have a page targeting each of those phrases. These are all internal pages on 1 domain, not multiple domains.
Sooo... I tested it by 301ing a few of my "tea house" KW pages to the home page. And lo and behold... my home page rose BIG TIME! Major improvement! I'm talking like 13th to 2nd!
Here is my question: how should I proceed? My SEO has warned me against 301ing too many pages all pointing to the home page. He says that will negatively impact my rankings. Should I 404 some pages? Should I build a "tea room" topic page and 301 that set there? Which is worse, 301 or 404? How many is too many?
I'm really excited by these results, but I'm scared to move forward and hurt what has happened. Thanks in advance!
-
If it's just a few pages, then I don't think it's something to worry about. I've 301'd thousands and it has never affected me before.
The problem with this, though, is that you might have some trouble ranking for some (different) location-based searches, or at least struggle to keep improving once the 301s' initial benefit levels off.
I always stay away from 404s as much as I can. A 404 doesn't really hurt your SEO as long as the dead URL isn't linked from anywhere on your pages, but I still dislike it; personal preference.
I won't be much help beyond that, as this is all speculation without seeing the actual pages and what they do.
Treat it as a test, though; you'll learn a lot from it.
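For reference, a 301 from a retired keyword page to the home page is a one-line rule per page in Apache's .htaccess. A minimal sketch; the paths here are made-up placeholders standing in for the asker's real keyword-page URLs:

```apache
# Permanently redirect retired keyword pages to the home page
Redirect 301 /tea-house-tampa/ /
Redirect 301 /tea-house-clearwater/ /
```

If many retired pages share a URL pattern, a single `RedirectMatch 301` rule with a regex can cover them all instead of listing each page.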
Related Questions
-
Getting rid of duplicate content remaining from old misconfiguration
Hi Friends, We have recently (about a month ago) launched a new website, and during the review of that site spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites displaying on each other's domains. This should be solved now, but they still show in SERPs. The big question is: what's the best way to safely remove those from SERPs? We haven't performed as well as we wanted for a while, and we believe this could be one of the issues. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain, each link showing either 301 or 404. This was cleaned up to return 301 or 404 when we launched our new site 4 weeks ago, but I can still see the results in SERPs, so I assume they still count negatively? Cheers!
Intermediate & Advanced SEO | pissuptours
Google crawling 200 page site thousands of times/day. Why?
Hello all, I'm looking at something a bit wonky for one of the websites I manage. It's similar enough to other websites I manage (built on the same template) that I'm surprised to see this issue occurring. The XML sitemap submitted shows Google there are 229 pages on the site. Starting at the beginning of December, Google really ramped up the intensity of its crawling. At its high point, Google crawled 13,359 pages in a single day. As I mentioned, I manage other similar sites, and this is a very unusual spike. There are no features like infinite scroll that auto-generate content and would cause Google grief. So the follow-up questions to my "why?" are "how is this affecting my SEO efforts?" and "what do I do about it?". I've never encountered this before, but I think limiting my crawl budget would be treating the symptom instead of finding the cure. Any advice is appreciated. Thanks!
Intermediate & Advanced SEO | brettmandoes
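On the "why?" question above, one quick diagnostic is to count which URLs Googlebot is actually requesting in the server access log; a spike like this is often concentrated on a few parameterized URLs. A rough sketch for combined-format Apache/Nginx logs; the log lines and field positions are assumptions, and in practice the Googlebot user-agent should also be verified via reverse DNS, since anyone can spoof it:

```python
from collections import Counter

def count_googlebot_hits(log_lines):
    """Count requests whose user-agent mentions Googlebot, keyed by
    request path. Assumes combined log format, where the request line
    is the first double-quoted field."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        try:
            request = line.split('"')[1]   # e.g. 'GET /page-a HTTP/1.1'
            path = request.split(" ")[1]   # extract just the path
        except IndexError:
            continue                       # skip malformed lines
        hits[path] += 1
    return hits

sample = [
    '1.2.3.4 - - [05/Dec/2017:10:00:00 +0000] "GET /page-a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [05/Dec/2017:10:00:01 +0000] "GET /page-a?sort=asc HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [05/Dec/2017:10:00:02 +0000] "GET /page-b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(count_googlebot_hits(sample).most_common())
```

If the counts show the same paths repeated with varying query strings, that points at URL parameters rather than real pages, which canonical tags or the URL Parameters tool can address.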
Fetch as Google -- Does not result in pages getting indexed
I run an exotic pet website which currently has several species of reptiles. It has done well in SERPs for the first couple of types of reptiles, but I am continuing to add new species, and with each of these comes the task of getting ranked, so I need to figure out the best process.

We just released our 4th species, "reticulated pythons," about 2 weeks ago. I made these pages public, and in Webmaster Tools did a "Fetch as Google" with "index page and child pages" for this page: http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index

While Google immediately indexed the index page, it did not really index the couple of dozen pages linked from that page, despite me checking the option to crawl child pages. I know this in two ways. First, in Google Webmaster Tools, if I look at Search Analytics pages filtered by "retic," there are only 2 listed; this at least tells me it's not showing these pages to users. More directly, though, if I search "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons," there are only 7 pages indexed.

More details: I've tested at least one of these URLs with the robots checker and they are not blocked. The canonical values look right. I have not monkeyed with Crawl URL Parameters. I do NOT have these pages listed in my sitemap, but in my experience Google didn't care much about that -- I previously had about 100 pages there and Google didn't index some of them for more than 1 year. Google has indexed "105k" pages from my site, so it is very happy to index pages, apparently just not the ones I want (this large number is due to permutations of search parameters, something I think I've since improved with canonical, robots, etc.). I may have some nofollow links to the same URLs, but NOT on this page, so assuming nofollow has only local effects, this shouldn't matter. Any advice on what could be going wrong here?
I really want Google to index the top couple of links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.
Intermediate & Advanced SEO | jplehmann
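Since the asker notes the new pages are not in the sitemap, one low-risk step is simply adding them; even if Google has sometimes ignored sitemaps, an explicit entry is a cheap, direct signal. A minimal fragment sketch, with the index URL taken from the question and `changefreq` as a placeholder value:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index</loc>
    <changefreq>weekly</changefreq>
  </url>
  <!-- one <url> entry per child page, e.g. each gene/tag page -->
</urlset>
```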
Ranking for keyword I don't optimize for & Other oddities
Hi Moz Community! I've been working with a client's website for about a year now. They were hit by the original Panda update because of some spammy links from a shady SEO firm. We've made a decent climb back, but not a full recovery. There are some weird things happening that I would love some insight into.

1. Ranking for keywords we don't optimize for: I noticed some low search volume for a keyword term that is close to our main term but slightly different. We don't optimize for this term at all on our website, yet we rank third for it and actually show sitelinks in the result, which doesn't happen for any of our other pages.

2. Homepage missing when doing a site: search: The other oddity is that when you search site:www.mywebsite.com, I see all the pages within the site except the homepage. Not sure what's going on here; when I fetch the homepage in GWMT, it returns fine, and when you query the homepage by itself, it also ranks.

Any help would be appreciated! Regards, J
Intermediate & Advanced SEO | artscienceweb
Show parts of page A on page B & C?
Good afternoon,

A quick question. I am working on a website which has a large page with different sections. Let's say:

Page 1
SECTION A
SECTION B
SECTION C

Now they are adding a new area where they want to show only certain sections, so it would look like this:

Page 2
SECTION A

Page 3
SECTION C

Page 4
SECTION D

So my question is, would a rel="canonical" tag back to Page 1 be the correct way of preempting any duplicate content issues? I do not need Pages 2-4 to even be indexed; it is just a matter of usability and giving the users what they are looking for without all the rest of the extra stuff. Gracias. Teşekkürler. Salamat Ko. Thanks. (bonus thumbs up for anybody who knows which languages each of those are) 🙂
Intermediate & Advanced SEO | rayvensoft
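For the canonical approach asked about here, each partial page would carry a tag in its head pointing at the full page; a minimal sketch with a placeholder URL:

```html
<!-- In the <head> of Page 2 (which shows only SECTION A): -->
<link rel="canonical" href="https://www.example.com/page-1" />
```

One caveat: Google treats rel="canonical" as a hint and may ignore it when the pages are not near-duplicates; if the real goal is just keeping Pages 2-4 out of the index, a `<meta name="robots" content="noindex">` tag is the more direct tool.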
301 redirect or Link back from old to new pages
Hi all, We run a ticket agent, and have multiple events that occur year after year, for example a festival. The festival has a main page with each event having a different page for each year like the below: Main page
Intermediate & Advanced SEO | | gigantictickets
http://www.gigantic.com/leefest-tickets (main page) Event pages:
http://www.gigantic.com/leefest-2010-tickets/hawksbrook-lane-beckenham/2009-08-15-13-00-gce/11246a
http://www.gigantic.com/leefest-2010-tickets/highhams-hill-farm-warlingham/2010-08-14-13-00-gce/19044a
http://www.gigantic.com/leefest-2011-tickets/highhams-hill-farm-warlingham/2011-08-13-13-00-gce/26204a
http://www.gigantic.com/leefest-2012-tickets/highhams-hill-farm-warlingham/2012-06-29-12-00-gce/32168a
http://www.gigantic.com/leefest-2013/highhams-hill-farm/2013-07-12-12-00 my question is: Is it better to leave the old event pages active and link them back to the main page, or 301 redirect these pages once they're out of date? (leave them there until there is a new event page to replace it for this year) If the best answer is to leave the page there, should i use a canonical tag back to the main page? and what would be the best way to link back? there is a breadcrumb there now, but it doesn't seem to obvious for users to click this. Keywords we're aming for on this example are 'Leefest Tickets', which has good ranking now, the main page and 2012 page is listed. Thanks in advance for your help.0 -
301 redirect: how to get those juices flowing
Hi guys, Following on from my previous post, I have still not got my rankings back: http://www.seomoz.org/q/301-redirect-have-no-ranking. I am beginning to think that I do have an underlying issue in the site which is restricting me.

My old site www.economyleasinguk.co.uk was moved to www.economy-car-leasing.co.uk. As mentioned, the 301 seemed to go really well and all pages updated within 48 hours; however, over 5 months on, the juice from the old site has still not passed over and I hardly rank at all for anything. Here is a list of things I have tried:

1: Swapped the original 301, which was PHP, for an .htaccess redirect
2: Added a canonical tag to all pages
3: Turned on internal links as per this post by Everett Sizemore: http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well

Number 3 was only done 5 days ago; initially bot traffic was immense, and it may need a bit more time to show results.

I still think I have another underlying issue, for the reasons below:

1: PageRank on the home page is 1, but inner pages are a mixture of 1, 2, and 3 sporadically
2: If I search for text copied from the home page, no results
3: Open Site Explorer still shows the old site with a PA of 60, compared to 42 for the new site
4: I checked the server logs and Google is still visiting the old site
5: Header responses are all correct for the canonicals, and I see no chaining of the 301s
6: All pages are dofollow with no robots restrictions
7: site: has only in the last few days removed the old site from the index

Naturally it could just be a matter of time; however, 5 months for a 301 is a very long time, and an 80% traffic loss is immense. I would really appreciate it if someone could give the site a once-over and see if I have missed anything obvious. Thanks in advance
Intermediate & Advanced SEO | kellymandingo
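Point 5 in the list above (checking for 301 chains) can be sanity-checked offline if you export the redirect rules into a simple old-URL to new-URL map. A hedged sketch; the function and example URLs are illustrative placeholders, not taken from the actual sites:

```python
def find_redirect_chains(redirects):
    """Given a dict mapping source URL -> target URL (each a 301),
    return the full hop list for every source that passes through
    more than one redirect before reaching a final destination."""
    chains = {}
    for src in redirects:
        hops = [src]
        current = src
        seen = {src}
        while current in redirects:
            current = redirects[current]
            if current in seen:          # redirect loop detected
                hops.append(current)
                break
            seen.add(current)
            hops.append(current)
        if len(hops) > 2:                # more than one hop = a chain
            chains[src] = hops
    return chains

# Example: /old-page bounces through /interim before /final -- a chain
rules = {
    "/old-page": "/interim",
    "/interim": "/final",
    "/other-old": "/final",
}
print(find_redirect_chains(rules))
# {'/old-page': ['/old-page', '/interim', '/final']}
```

Any source URL the function reports should be rewritten to point directly at its final destination, since chained 301s dilute and slow the transfer of link equity.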
How to get rid of duplicate content, titles, etc. on a PHP Cartweaver site?
My website http://www.bartramgallery.com was created using PHP and Cartweaver 2.0 about five years ago by a web developer. I was really happy with the results of the design, was inspired to get into web development, and have been studying ever since. My biggest problem at this time is that I am not knowledgeable about PHP and the Cartweaver product, but I am learning as I read more.

The issue is that SEOmoz tools are reporting tons of duplicate content, duplicate page titles, etc. This is likely from the dynamic URLs and the same pages appearing with secondary results, etc. I just made a new sitemap with AuditMyPC (I think it was called) in an attempt to get rid of all the duplicate page titles, but is that going to solve anything, or do I need to find another way to configure the site? There are many pages with the same content competing for page rank, and it is a bit frustrating, to say the least. If anyone has any advice it would be greatly appreciated, even just pointing me in the right direction. Thank you, Jesse
Intermediate & Advanced SEO | WSOT