Redirects Not Working / Issue with Duplicate Page Titles
-
Hi all
We are being penalised in Webmaster Tools' Crawl Diagnostics for duplicate page titles and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to the HTTPS versions. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
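For reference, a canonical tag doesn't redirect anything; it only hints to Google which version of a URL you consider authoritative. On both the HTTP and HTTPS copies of a page it would look something like this sketch (example.com is a stand-in for your own domain and path):

```html
<!-- In the <head> of both the HTTP and HTTPS versions of the page,
     point at the HTTPS URL you want indexed -->
<link rel="canonical" href="https://www.example.com/your-page/">
```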
-
Thanks for this. I think you're probably right - but it's very frustrating!
-
Did you set up the 301 redirects in the .htaccess file? If you did, then I would remove both redirects and set them up correctly. A 301 tells Google that a page moved from one place to another permanently, and eventually Google will stop ranking the old URL. If you have two 301 redirects going on right now, you are confusing everything. I would remove both and set it up correctly.
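For what it's worth, a single correct HTTP-to-HTTPS 301 in .htaccess would look something like this sketch (assuming Apache with mod_rewrite enabled; this is not the poster's actual config):

```apache
RewriteEngine On
# Redirect any request that arrives over plain HTTP to the HTTPS equivalent,
# preserving the host and path, as a permanent (301) redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Make sure this is the only redirect rule between the two protocols, so there is no chance of a loop or a leftover HTTPS-to-HTTP rule firing first.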
-
In my opinion it's just a matter of waiting for the Google crawler to visit each page individually and update the SERPs. In the past, when I had similar issues with a bad redirect, it was just a matter of fixing it and waiting for Google to notice the change. Check the last time the Google crawler visited each URL; if you have historical data on Googlebot's visits and you know on average how often it visits each page, you also know how long you have to wait.
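One way to check Googlebot's visit frequency yourself is to parse the server access log. A minimal sketch in Python (it assumes the Apache/Nginx "combined" log format, and the function name is my own):

```python
import re
from collections import defaultdict

# Matches the Apache/Nginx "combined" log format (an assumption --
# adjust the pattern if your server logs in a different format)
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def googlebot_visits(log_lines):
    """Return {url: [timestamps]} for requests whose user agent is Googlebot."""
    visits = defaultdict(list)
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group(3):
            timestamp, url, _user_agent = m.groups()
            visits[url].append(timestamp)
    return dict(visits)
```

Run it over your access log and the gap between consecutive timestamps for a URL gives you a rough revisit interval for that page.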
Related Questions
-
Question #1: Does Google index https:// pages? I thought they didn't because....
Generally, the difference between https:// and http:// is that the s (it stands for secure, I think) is usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is stored). My site, which all of my questions are revolving around, is built with Volusion (I'm used to WordPress) and I keep finding problems like this one. The site was hardcoded to have all MENU internal links (which were 90% of our internal links) lead to **https://**www.example.com/example-page/ instead of **http://**www.example.com/example-page/. To double-check that this was causing a loss in link juice, I jumped over to OSE. Sure enough, the internal links were not being indexed; only the links that were manually created and set NOT to include the httpS:// were being indexed. So if OSE wasn't counting the links, and based on the general ideology behind secure HTTP access, that would imply that no link juice is being passed... right? Thanks for your time. Screens are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed. The problem is... is this a Volusion problem? Should I switch to WordPress? Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress): http://www.uncommonthread.com/
Web Design | TylerAbernethy
-
Google changed the title?
I've been reading that Google will sometimes change the home page title, and I have finally seen this in action. Search for and on the second page you'll find but the title says Academy Dog Training. Which is not what's in the title tag for this page. Did Google change the title tag on us and if so why? How can we get them to use our title?
Web Design | CFSSEO
-
Custom 404 Page Indexing
Hi - We created a custom 404 page based on SEOMoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page to "noindex" via its meta tag?
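A hedged note on what that would look like: a custom 404 page usually only gets indexed when it is served with a 200 status (a "soft 404"); a page returning a true 404 status generally won't be indexed at all. If the page does return 200, a robots meta tag in its head keeps it out of the index:

```html
<!-- In the <head> of the custom 404 template: keep it out of the index
     while still letting crawlers follow any links on the page -->
<meta name="robots" content="noindex, follow">
```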
Web Design | sftravel
-
Is page speed worth it for SEO?
I've always broken my head trying to follow all the PageSpeed guidelines. I increased my PageSpeed score significantly, but I didn't see any effect on my SEO performance. For my keywords, my competitors are crap at it (I have a score of 90 and they are at 60-70). Does Google give importance to it?
Web Design | Naghirniac
-
Sudden dramatic drops in SERPs along with no snippet and no cached page?
We are a very stable, time-tested domain (over 15 years old) with thousands of stable, time-tested inbound links. We are a large catalog/e-commerce business and our web team has over a decade's experience with coding, SEO, etc. We do not engage in link exchanges, buying links, etc., and adhere strictly to white hat SEO best practices. Our SERPs have generally been very stable for years and years. We continually update content, leverage user-generated content, and stay abreast of important algorithm and policy changes on Google's end.

On Wednesday, Jan 18th, we noticed dramatic, disturbing changes to our SERPs. Our formerly very stable positions for thousands of core keywords dropped. In addition, there is no snippet in the SERPs and no cached page for these results. Webmaster Tools shows our sitemap was last successfully downloaded by Google on Jan 14th. Over the weekend and Monday the 16th, our cloud-hosted site experienced some downtime here and there. I suspect that the sudden issues we are seeing are being caused by one of three possibilities:

1. Google came to crawl when the site was unavailable. However, there are no messages in the account or crawl issues otherwise noted to indicate this.
2. There is a malicious link spam or other attack on our site.
3. The last week of December 2011, we went live with Schema.org rich tagging on product-level pages. The testing tool validates all but the breadcrumb, which it says is not supported by Schema. Could Google be hating our Schema.org microtagging and penalizing us? I sort of doubt it, because category/subcategory pages that have no such tags are among those suffering.

What's odd is that ever since we went live with Schema.org, Google has started preferring very thin content pages like video pages and articles over our product pages. This never happened in the past. The site is: www.jamestowndistributors.com. Any help or ideas are greatly, greatly appreciated. Thank You, DMG
Web Design | jamestown
-
Does listing my customer's address, phone number, and a contact form on "every page" count as duplicate content that they'd be penalized for?
I work with small local businesses (like tree farms, feed stores, counselors, etc.) doing web design, SEO, etc. I encourage them to have their contact information visible at all times on their websites. I'm also delving into the world of contact forms. I want to have this info on every page - is this detrimental? Here's an example: http://www.trinityescape.net/marriage-couples-counselors-therapy-clermont-florida/ Thank you!
Web Design | mikjgens
-
How to work with US Website and UK Website?
We currently have two websites: our headquarters in the US and our other site in the UK. We currently rank very well with both websites in both countries. Currently, the US site is hosted in the States and the UK site is hosted in the UK, and I would like to keep it this way. However, I am going to be converting them to a CMS and redesigning both of them. We need our main website to have a feature that lets visitors choose which location they prefer. We also have other locations throughout the US, and we would like our customers to know that we have a few different locations. Also, we will be expanding to Australia and maybe other countries as well. We need the websites to look almost identical, yet different in a way that makes it clear one is US, one is UK, and so on. Some will only carry some of our products, so they will be smaller websites. What is the best way to go about doing this? I know some CMSs offer a copy feature that allows you to make the same website. However, how can you do this if they need to be hosted in different countries? Do I need to build them all individually on each of their servers, or can this be done differently? Also, if they are hosted in their own countries, can the content be the same, or is that still considered duplicate content?
Web Design | hfranz
-
Do iFrames embedded in a page get crawled?
Do iframes embedded in a page get crawled? I have an iframe which displays a page hosted by another company embedded in my page. Their links don't include rel="nofollow" attributes, so I don't want Google to see them. Do spiders crawl the content in iframes, or do I have to ensure that the links on that page include the nofollow attribute?
Web Design | deuce1s