Duplicate Content: Home Page HTTP Status Code Query
-
Hi All,
We have just done a site-wide URL migration (from the old URL structure to the new one) and set up our 301s etc., but have one issue where I don't know if it's a problem or not.
We have one URL - www.domain.co.uk/ (note the trailing slash) - which has been set up to 301 redirect back to www.domain.co.uk.
However, when I check the server response code, it comes back as 200.
So although it appears to redirect visually if I put the URL in the address bar, the status code says otherwise.
Could this be seen as a potential duplicate home page, and if so, any idea how I could get around it if we can't solve the root cause? This is on the CakePHP framework.
Thanks,
Pete
-
Try using something like LiveHTTPHeaders to view all of the HTTP requests and responses involved. You should see the request going to the redirected domain (GET domain.co.uk), then a response such as HTTP/1.1 301 Moved Permanently, followed by a new request to the new domain (GET domain2.co.uk), whose response will naturally be HTTP/1.1 200 OK because the server at the new domain has answered the request successfully.
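If you'd rather check from the command line, here's a rough sketch using curl (the domain below is just the placeholder from the question; swap in your real URL):

    # -s silences progress output, -I sends a HEAD request, -L follows redirects.
    # This prints the status line and Location header for every hop, so a
    # 301 followed by a final 200 shows the redirect chain clearly.
    curl -sIL http://www.domain.co.uk/ | grep -iE '^(HTTP/|location:)'

If the very first hop comes back as 200 rather than 301, the redirect is most likely happening client-side (JavaScript or a meta refresh) rather than at the server, which would explain what Pete is seeing.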
-
Hi Pete,
Have you set a canonical URL pointing to the original homepage? If not, that's what I would recommend doing.
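For reference, a canonical tag is a single line in the head of the homepage; a minimal sketch, assuming the non-trailing-slash URL from the question is the preferred version, would be:

    <!-- Placed in the <head> of the homepage: declares the preferred URL for this page -->
    <link rel="canonical" href="http://www.domain.co.uk" />

Both the trailing-slash and non-trailing-slash versions would then point search engines at the same preferred URL, which covers you even if the server-side behaviour in CakePHP can't be fixed straight away.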
Related Questions
-
I have a site that has a 302 redirect loop on the home page (www.oncologynurseadvisor.com)
I am trying to do an audit on it using Screaming Frog and the 302 stops it. My dev team says it is there to discourage non-human traffic and that bots will not see it. Is there any way around this, or what can I tell the dev team to show them it is not working as they state?
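One way to test that claim (a rough sketch; the user agent string is only illustrative) is to request the page while identifying as Googlebot and see whether the 302 still comes back:

    # Send a HEAD request with a Googlebot-style user agent and print the status line.
    # If a 302 (or a redirect loop) still comes back, bots are seeing it too.
    curl -sI -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.oncologynurseadvisor.com/ | head -n 1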
Web Design | | HayMktVT0 -
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community,

Please could you see what you think and offer some definite steps or advice. I contacted the host provider, and his initial thought was that WordPress was causing the HTTPS problem: e.g. when an HTTPS version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over HTTPS. The host said that there is no active configured SSL - it's just waiting as part of the hosting package, just in case - but I found that the SSL certificate still shows up during a crawl.

It's important to eliminate the HTTPS problem before external backlinks point to any of the unwanted HTTPS pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been to the HTTP version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from HTTPS to HTTP. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http:// version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors stay on the HTTP version of the site and don't get lost anywhere in HTTPS. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?

As I understand it, most server configurations should redirect by default when HTTPS isn't configured, and from my experience I've seen cases where pages requested via HTTPS return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable all HTTPS, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable HTTPS in the .htaccess only to then create an HTTPS-to-HTTP rewrite rule; HTTPS shouldn't even be a crawlable function of the site at all:

    RewriteEngine On
    RewriteCond %{HTTPS} off

Or should the SSL be disabled completely for now, until it becomes a necessity for the website?

I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term.

Kind Regards
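For what it's worth, the fragment above is missing the actual RewriteRule, and the condition would need to match requests where HTTPS is on if the goal is to push visitors back to HTTP. A minimal .htaccess sketch (assuming Apache with mod_rewrite, offered as a starting point rather than a definitive fix) might be:

    # Permanently redirect any HTTPS request to its HTTP equivalent.
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]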
Web Design | | SEOguy10 -
Two menu links to the same page. Is this a problem?
One of my clients wants to link to the same page from several places in the navigation menu. Does this create any crawl issues or indexing problems? It's the same page (same URL), so there are no duplicate content problems. Since the page is promotional, the client wants the page accessible from different places in the nav bar. Thanks, Dino
Web Design | | Dino640 -
Should I not use hyphens in web page titles? Google penalty for hyphens?
All the page titles on my site have hyphens between the words, like this: http://texas.com/texas-plumbers.html. I have seen tests where hyphenated domain names ranked lower than non-hyphenated domain names. Does this mean my pages are being penalized for hyphens, or is it only in the domain that hyphens are penalized? If I create new pages, should I not use hyphens in the page titles when there are two or more words in the title? If I changed all my page titles to eliminate the hyphens, I would lose all my rankings, correct? My site is 12 years old, and if I changed all these titles I'm guessing each page would be thrown in the Google sandbox for several months. Is this true? Thanks mozzers!
Web Design | | Ron100 -
Hi, I have a doubt. If we want to hide unwanted text on a web page, it's possible with the "" tag. My question: does a search engine crawl that text? Help me.
I want to hide a lot of text behind my site's pages. I know it's possible with that tag. But in what way does a search engine look at that text? Is it hidden, or is it crawled and indexed?
Web Design | | FhyzicsBCPL0 -
Changing Page Extensions
Hi all, I have a 10-year-old website done in Classic ASP, which is fast becoming outdated, and we are going to convert it to PHP, so all of our pages would change from a '/page.asp' extension to a '/page.php' extension. I am familiar with the need to set up 301 redirects for this, and I understand there will probably be a short-term drop in our Google rankings. Naturally, I don't want to have to go through this again in the future, so here is my question: is having no page extension, like '/aboutus/history', the wave of the future? Does having no page extension affect SEO at all? I have seen more websites using this technique as 2013 goes on, so I am thinking this is the way we should plan our site update. I haven't looked into how to actually do this yet, but it would seem to make sense to me, so that if we needed to change from PHP to, say, .NET or something else later on, we would not have to do 301 redirects again or have another drop in our rankings. Do any of you have an opinion or experience with this?
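For illustration only, a minimal .htaccess sketch for the .asp-to-.php move described above (and for serving extensionless URLs) might look like this, assuming Apache with mod_rewrite; treat it as a starting point rather than a finished implementation:

    RewriteEngine On

    # Permanently redirect old .asp URLs to their .php equivalents.
    RewriteRule ^(.*)\.asp$ /$1.php [R=301,L]

    # Optionally serve extensionless URLs, e.g. /aboutus/history -> /aboutus/history.php,
    # so the visible URL never needs to change again if the back end does.
    RewriteCond %{REQUEST_FILENAME}.php -f
    RewriteRule ^([^.]+)$ $1.php [L]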
Web Design | | jacksghost0 -
Parallax, SEO, and Duplicate Content
We are working on a project that uses parallax to provide a great experience for the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize for, and multiple pages with the parallax function built into them. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function. Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the right section based on that hashbang.

www.example.com/About < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page

We are trying to decide the best method for optimizing each subpage, but my current concern is that because each subpage is really part of the primary page, will all those URLs be seen as duplicate content? Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them in the sitemap, but I don't want Google to think I'm being disingenuous by providing links that don't exist solely for the purpose of SEO; truthfully, all of the content exists and is available to the user. I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
Web Design | | PaulRonin2 -
Local Versions of Pages
I have a site that offers services across two states and was wondering if I would see any benefit from creating pages such as: SERVICE in CITY, TX. Would I need to change the content on the pages completely, or could I simply swap out the city/state if I have roughly 3-5 combos I want to target?
Web Design | | nusani0