301 Question - issue
-
A while back we had a 'bleed' on one of our sites, which basically meant content from one of our sites started to leak across to another, and that second site started to rank for the same pages. Now we have hundreds of pages ranking for URLs that do not exist. It's hard to explain, so bear with me.
If you click the cached view in Google for one of these ranked pages, it shows the main site; but if you click the result as usual, you are taken to the affected site and shown a 404, because the page was never intended to exist there.
We believe we fixed the 'bleed' and have set up 301s for all the affected pages, pointing them to the home page of the affected site. But these pages have not been removed from Google, which we thought a 301 would accomplish. So we still have hundreds of pages ranking that simply redirect to the home page.
Why haven't these pages been removed?
-
It's probably just taking Google a while to process all the changes. More importantly, your 301s should point to the equivalent content, not all to the homepage. If the pages were showing on two sites, they do 'really' exist on one site but were never supposed to exist on the other. Correct the 301s so that each URL on the affected site points to the exact same piece of content on the site where it was originally located (where it was supposed to be located).
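To make that one-to-one mapping concrete, here is a minimal sketch of the redirect logic. It assumes a small Python/Flask front end purely for illustration - the framework, domain, and paths are placeholders, not the actual sites involved - and the same mapping can equally be expressed as Rails routes, Apache/nginx rewrite rules, or CDN rules:

```python
# Minimal sketch of one-to-one 301 mapping (Flask; every URL below is a placeholder).
from flask import Flask, abort, redirect

app = Flask(__name__)

# Each leaked path on the 'affected' site points to the same piece of content
# on the site where it was supposed to live, not to the homepage.
REDIRECT_MAP = {
    "/guides/blue-widgets": "https://www.original-site.example/guides/blue-widgets",
    "/guides/red-widgets": "https://www.original-site.example/guides/red-widgets",
}

@app.route("/<path:path>")
def legacy_redirect(path):
    target = REDIRECT_MAP.get("/" + path)
    if target:
        # 301 = moved permanently: Google transfers the old URL's signals to the target.
        return redirect(target, code=301)
    abort(404)  # anything unmapped is a normal not-found
```

However the redirect is produced, the point is that each affected URL lands on its equivalent page rather than the homepage (Google commonly treats mass redirects to a homepage as soft 404s, which is why the one-to-one mapping matters).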
If that fails, use the HTTP header: fire the noindex directive from the X-Robots-Tag HTTP header rather than from a meta tag in the HTML, to tell Google not to index those URLs on the 'affected' website. In conjunction with that, change the status code of all bogus URLs on the 'affected' site to 410, which is a stronger signal than 404: 410 means gone and not coming back, whereas 404 gives no indication of whether the page might return.
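A rough sketch of that combination follows (again Flask purely for illustration; the URL prefix is a hypothetical pattern standing in for whatever the leaked pages share). The affected site answers every bogus URL with a 410 status and an X-Robots-Tag: noindex response header, so the de-indexing signal travels in the HTTP response itself:

```python
# Minimal sketch: 410 Gone plus X-Robots-Tag noindex for the bogus URLs
# (Flask; the prefix below is a hypothetical pattern, not a real path).
from flask import Flask, Response

app = Flask(__name__)

LEAKED_PREFIX = "/guides/"  # placeholder for the pattern the leaked URLs share

@app.route("/<path:path>")
def removed(path):
    if ("/" + path).startswith(LEAKED_PREFIX):
        resp = Response("This page has been removed.", status=410)
        # Header-level noindex: Google reads it without having to parse any HTML.
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp
    return Response("Not found", status=404)
```

Whichever layer serves the header (framework, web server config, or CDN), the 410 plus header-level noindex gives Google an unambiguous signal to drop those URLs once they are recrawled.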
Related Questions
-
Asking a natural question in H tags?
Hello, I read that in H tags it is more natural to write the question a user would ask. Does it really have any benefit in terms of SEO? For example, instead of "Tour map" writing "What are the villages you visit?", or instead of "Activity level" writing "What is the level like?" Does it help in any way? Thank you.
Intermediate & Advanced SEO | seoanalytics
-
301 redirects Ruby on Rails
Can anyone point me to the best way to implement 301 redirects on a Ruby on Rails website?
Intermediate & Advanced SEO | brianvest
-
Canonical questions
Hi, We are working on a site that sells lots of variations of a certain type of product (car accessories). So let's say there are 5 products, but each product will need a page for each car model, so we will potentially have a lot of variations/pages. As there are a lot of car models, these pages will have pretty much the same content, apart from the heading and model details. So the structure will be something like this:
Product 1 (landing page)
Audi (model selection page)
---Audi A1 (Model detail page)
---Audi A2 (Model detail page)
---Audi A3 (Model detail page)
BMW (model selection page)
---BMW 1 Series (Model detail page)
---BMW 3 Series (Model detail page)
Product 2 (landing page)
Audi (model selection page)
---Audi A1 (Model detail page)
---Audi A2 (Model detail page)
---Audi A3 (Model detail page)
BMW (model selection page)
etc.
etc.
The structure is like this as we will be targeting each landing page for AdWords campaigns. As all of these pages could look very similar to search engines, will simply setting up each with a canonical be enough? Is there anything else we should do to ensure Google doesn't penalise us for duplicate page content? Any thoughts or suggestions most welcome. Thanks!
Intermediate & Advanced SEO | davidmaxwell
-
Technical 301 question
Howdy all, this has been bugging me for a while and I wanted to know the community's thoughts on it. We have a .com website which has a little domain authority and is growing steadily. We are a UK business (but have a US office which we will be adapting to soon). We are ranking better within google.com than we do on google.co.uk, probably down to our TLD. Is it a wise idea to 301 our .com to .co.uk for en-gb enquiries only? Is there any evidence that this will help improve our position? Will all the link juice passed from the 301s go to our .co.uk only if we are still applying the use of .com in the US? Many thanks and hope this isn't too complicated! Best wishes, Chris
Intermediate & Advanced SEO | TVFurniture
-
Questions About Link Detox
Greetings: In April of 2014 an SEO firm ran a link removal campaign (identified spammy links and uploaded a disavow). The overall campaign was ineffective, Moz domain rank has fallen to 24 from about 30 in the last year, and traffic is 20% lower. I purchased a basic package for Link Detox and ran a report today (see enclosed) to see if toxic links could be contributing to our mediocre rankings. As a novice I have a few questions about the use of Link Detox:
- We scored a domain-wide detox risk of 1,723. The site has referring root domains with 7,113 links to our site. 121 links were classified as high audit priority and 56 as medium audit priority. 221 links were previously disavowed, and we uploaded a spreadsheet containing the names of the previously disavowed links. We had Link Detox include an analysis of nofollow links, as they recommend this. Is our score really bad? If we remove the questionable links, should we see some benefit in ranking?
- Some of the links we disavowed last year are still linking to our site. Is it worthwhile to include those links again in our new disavow file?
- Prior to filing a disavow we will request that webmasters remove the offending links. Link Detox offers a package called Superhero for $469.00 that automates the process. Does this package effectively help with the entire process of writing and tracking the removal requests? Do you know of any other good alternatives?
- A feature called "Boost" is included in the Link Detox Superhero package. It is supposed to expedite Google's processing of the disavow file. I was told by the staff at Link Detox that with Boost, Google will process the disavow within a week. Do you have any idea if this claim is valid? It would be great if it were true.
- We never experienced any manual penalty from Google. Will uploading a disavow help us under the circumstances?
Thanks for your feedback, I really appreciate it! Alan
Intermediate & Advanced SEO | Kingalan1
-
HTTP to HTTPS question (SSL)
Hi, I recently made two big changes to a site - www.aerlawgroup.com (not smart, I know). First, I changed from Weebly to WordPress (WP Engine hosting with CDN + Cloudflare - is that overkill?); second, I added SSL (HTTP to HTTPS). From a technical perspective, I think I made a better site: (1) blazing fast, (2) mobile responsive, (3) more secure. I'm seeing the rankings fluctuate quite a bit, especially on the important keywords. I added SSL to my other sites and saw no rankings change (they actually all went up slightly). I'm wondering if anyone has had experience going to SSL and can give me feedback on something I might have overlooked. Again, it's strange that all the other sites responded positively, but the one listed above is going in the opposite direction. Maybe there are other problems and the SSL is just a coincidence. Any feedback would be appreciated. I followed this guide, which helped tremendously (FYI): http://moz.com/blog/seo-tips-https-ssl
Intermediate & Advanced SEO | mrodriguez1440
-
SEO issues with Magento
Hi Everyone, We use the Magento CMS for our site and we are having a frustrating time resolving our SEO issues. The site was very poorly managed in years past, and in the past year I have redesigned and cleaned up many things. However, we have recently had trouble with indexing and keyword ranking.
Issue #1: Our main keyword ranking has dropped quite a bit while our other, less important keywords have steadily risen. I suspect a very strict robots.txt implemented back in early January may have been the culprit. We have since been modifying it without much luck. Many of our pages are still blocked.
12/05/12: ranked 12th
1/09/13: ranked 19th
1/16/13: ranked 35th
Now: out of top 50 (52nd)
Issue #2: Not a single image is being indexed. We are 0 for 582 according to Webmaster Tools. Not sure why...
Any help and advice would be greatly appreciated, as I have great determination and interest in learning the correct way to fix this. Site: www.scojo.com Thanks
Intermediate & Advanced SEO | t_parrish
-
Robots.txt Question
For our company website, faithology.com, we are attempting to block any URLs that contain a question mark, to keep Google from seeing some pages as duplicates. Our robots.txt is as follows:
User-Agent: *
Disallow: /*?
User-agent: rogerbot
Disallow: /community/
Is the above correct? We want them not to crawl any URL with a "?" in it, but we don't want to harm ourselves in SEO. Thanks for your help!
Intermediate & Advanced SEO | BMPIRE