Server requests: 302 followed by a 200
-
Hi,
On an IIS system, clicking a particular link returns the following response codes:
GET /nl/nl/process?Someparameter1=1&Someparameter2=2
302 Found
GET /nl/nl/SomeOtherPage.cms
200 OK
What concerns me, besides the obvious 302 and the cAmeLcAse canonical issues, is the 200 response without a redirect.
Which page will then be indexed and ranked, and what effect would it have on PageRank flow if the 302 were changed into a 301?
Also, would the .cms extension be an issue? Thanks for any answers.
Edit: I contacted the developer. He says it's a rewrite, not a meta redirect.
I still think this rewrite is an issue. Canonical, maybe?
-
So why is the rewrite not an issue?
Google sees the GET /nl/nl/process?Someparameter1=1&Someparameter2=2, never mind the 302 (which is a very obvious issue).
Then it sees the GET /nl/nl/SomeOtherPage.cms
To Googlebot it might as well be a meta redirect, which is an issue, as this will not pass PageRank. The server response is no different from a meta redirect... Or should I interpret the last GET in some other way?
I agree on the .cms
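For what it's worth, the distinction is observable from outside: a true server-side rewrite happens internally in IIS and produces a single 200 with no second request, while the capture above shows a visible 302 hop that any client, Googlebot included, has to follow. A minimal local simulation of that hop, using only Python's standard library (the URLs are borrowed from the question; the server behavior is assumed for illustration):

```python
import http.server
import threading
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/nl/nl/process"):
            # The redirect case: a visible 302 hop the client must follow.
            # An internal rewrite would instead serve the target content
            # directly here, with a single 200 and no Location header.
            self.send_response(302)
            self.send_header("Location", "/nl/nl/SomeOtherPage.cms")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"target page body")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 302 automatically; the final URL reveals the hop.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/nl/nl/process?Someparameter1=1")
print(resp.status, resp.geturl())
server.shutdown()
```

If a crawl of the live URL shows two responses like this, it is an HTTP redirect as far as search engines are concerned, regardless of what the developer calls it internally.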
-
The rewrite is not an issue, but you should change the 302 to a 301 in order to pass link equity to the new page.
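If the redirect is configured through the IIS URL Rewrite module, the fix is usually just the `redirectType` on the action. A sketch of what such a `web.config` rule might look like (the rule name, pattern, and query-string condition are illustrative, not taken from your site; if the 302 instead comes from application code, e.g. ASP.NET's `Response.Redirect`, switch that call to `Response.RedirectPermanent`):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ProcessToSomeOtherPage" stopProcessing="true">
        <match url="^nl/nl/process$" />
        <conditions>
          <add input="{QUERY_STRING}" pattern="Someparameter1=1&amp;Someparameter2=2" />
        </conditions>
        <!-- redirectType="Permanent" sends a 301 -->
        <action type="Redirect" url="/nl/nl/SomeOtherPage.cms"
                appendQueryString="false" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```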
As for the page name format, the .cms extension is not an issue from Google's point of view. However, from a user's point of view it is not very friendly (not only the extension but the name in general). Since you can rewrite the name as you want, I would consider changing those URLs to a friendlier format.
Hope it helps.
-
IIS loves 302s... Ask your developer to change the 302 to a 301 instead.
The indexed page will then be "/nl/nl/SomeOtherPage.cms" and the "link juice" will flow to it.
Also, stick with lowercase in the URLs.
The .cms extension is not an issue imo.
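On the lowercase point, the URL Rewrite module can enforce it site-wide with its built-in `ToLower` function. A hedged sketch (assumes the URL Rewrite module is installed; you may want extra conditions to exclude static assets):

```xml
<rewrite>
  <rules>
    <rule name="LowercaseRedirect" stopProcessing="true">
      <!-- ignoreCase="false" so the rule fires only when the path
           actually contains an uppercase letter -->
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

Because the redirect is a 301, any links pointing at the mixed-case variants keep passing equity to the lowercase canonical.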