After adding an SSL certificate to my site I encountered problems with duplicate pages and page titles
-
Hey everyone!
After adding an SSL certificate to my site, it seems that every page has duplicated itself. I think that's because the site now resolves at both www.domainname.com and domainname.com. I would really hate to add a rel=canonical to every page to solve this issue. I'm sure there's another way, but I'm not sure how to do it. Has anyone else run into this problem, and if so, how did you solve it?
Thanks! Any and all ideas are appreciated.
-
Hi Donford,
Excuse the delay. We have been very busy, but we are working on updating our .htaccess file. Thank you for the quick response. We will let you know if it resolves our issues.
-
**Don is right. If you're on Nginx, use one of the following.**

A simple rewrite that forces HTTPS for a single host:

```nginx
server {
    server_name example.com;
    rewrite ^/(.*) https://example.com/$1 permanent;
}
```

Or, more idiomatically, a dedicated port-80 server block that returns a 301:

```nginx
server {
    listen 80;
    server_name my.domain.com;
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl;
    server_name my.domain.com;
    [....]
}
```

A fuller example that also consolidates www and non-www (`$server_name` expands to the first name in the `server_name` directive, so both hosts redirect to https://yourdomain.com):

```nginx
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl spdy;
    server_name yourdomain.com www.yourdomain.com;
    ssl on;
    ssl_certificate /var/www/yourdomain.com/cert/ssl-bundle.crt;
    ssl_certificate_key /var/www/yourdomain.com/cert/yourdomain.com.key;
    access_log /var/log/nginx/yourdomain.com.access.log rt_cache;
    error_log /var/log/nginx/yourdomain.com.error.log;
    root /var/www/yourdomain.com/htdocs;
    index index.php index.htm index.html;
    include common/wpfc.conf;
    include common/wpcommon.conf;
    include common/locations.conf;
}
```
https://christiaanconover.com/blog/how-to-redirect-http-to-https-in-nginx
https://moz.com/learn/seo/redirection
https://wp-mix.com/htaccess-redirect-http-to-https/
https://www.digitalocean.com/community/questions/http-https-redirect-positive-ssl-on-nginx
-
Hi Travis,
Yes, you can use your .htaccess file to fix this. (This file should be located in your home directory)
A simple example would be:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^YourDomain\.com [NC]
RewriteRule ^(.*)$ https://www.YourDomain.com/$1 [R=301,L]
```

This looks for any request for a non-www URL and, if one is found, 301-redirects it to the www version of the page.
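Since the duplication here comes from two splits at once (www vs. non-www, and now http vs. https after the certificate), a rule set that normalizes both in a single redirect is often cleaner. This is a sketch, assuming Apache with mod_rewrite enabled and https://www.YourDomain.com as the preferred host; note that `%{HTTPS}` may not be set as expected behind some proxies or load balancers:

```apache
RewriteEngine On
# Redirect anything that is not already https://www.YourDomain.com/...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.YourDomain.com/$1 [R=301,L]
```

Because both conditions feed one rule, visitors and crawlers land on the canonical URL in a single 301 hop instead of chaining through two redirects.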
Hope this helps,
Don
Related Questions
-
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: a new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo-targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that the pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over some time, given that the new pages have correct hreflang tags? Is there anything we could do to help them rank in the correct country markets?
-
Adding Canonical Tags in WYSIWYG Section of Subscription Based Sites
Our company has a paid subscription-based site that only allows us to add HTML in the WYSIWYG section, not in the backend of each individual page. Because we are an e-commerce site, we have many duplicate page issues. Is there a way for us to add or hide the canonical code in the WYSIWYG section instead of us having to make all of our pages significantly different?
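One workaround sometimes used when only body-level HTML is editable is to inject the canonical tag into the `<head>` with a small script. This is a sketch, assuming the WYSIWYG field accepts `<script>` tags and using a hypothetical URL; Google can pick up canonicals from the rendered DOM, but JavaScript-injected canonicals are less reliable than tags served in the raw HTML:

```html
<script>
  // Hypothetical preferred URL; build the canonical <link> and move it into <head>.
  var link = document.createElement('link');
  link.rel = 'canonical';
  link.href = 'https://www.example.com/preferred-product-page/';
  document.head.appendChild(link);
</script>
```

If the platform ever opens up head-level access, replacing this with a server-rendered tag is the safer option.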
-
Is a different location in page title, h1 title, and meta description enough to avoid Duplicate Content concern?
I have a dynamic website which will have location-based internal pages, each with a `<title>`, an `<h1>` title, and a meta description tag that will include the subregion of a city. Each page will also have an 'info' section describing the generic product/service offered, which will also include the name of the subregion. The specific product/service content will be dynamic but in some cases will be almost identical, i.e. subregion A may sometimes have the same specific content as subregion B. Will the difference of just the location in each of the above tags be enough for me to avoid a duplicate content concern?
-
How To Detect Primary Site With Duplicate Domains?
I'm working with some backlink data, and I've run into different domains that host the same exact content on the same IP. They're not redirecting to each other; it just looks like they're hosting the same content on different virtual hostnames. One example is: borealcanada.ca borealcanada.com borealcanada.org www.borealcanada.ca www.borealcanada.com www.borealcanada.org www.borealecanada.ca I'm trying to consolidate this data and choose which is the primary domain. In this example, it appears www.borealcanada.ca has a high number of indexed pages and also ranks first for "boreal canada". However, I'm trying to think of a metric I can use to definitively/systematically handle this (using SEO Tools or something like it). Anyone have ideas on which metric might help me determine this for a large number of sites?
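One systematic signal is whichever hostname the pages themselves declare canonical: fetch each candidate's homepage, read its `rel="canonical"` tag, and group duplicates under that host. A stdlib-only sketch with hypothetical helper names; it assumes the sites emit a canonical tag at all (when they don't, falling back to indexed-page counts as described above is reasonable):

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if (tag == "link" and attr.get("rel", "").lower() == "canonical"
                and self.canonical is None):
            self.canonical = attr.get("href")


def canonical_of(html_text):
    """Return the canonical URL declared in an HTML document, or None."""
    finder = CanonicalFinder()
    finder.feed(html_text)
    return finder.canonical
```

Fetch each domain's homepage (e.g. with urllib), run `canonical_of` on the HTML, and treat the host that every duplicate points at as the primary.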
-
Can we retrieve all 404 pages of my site?
Hi, can we retrieve all 404 pages of my site? Is there any syntax I can use in Google search to list just the pages that return a 404? Or a tool/site that can scan all pages in Google's index and give me this report? Thanks
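For the "scan and report" half of the question, a small stdlib-only sketch (hypothetical function names; Google's search operators have no status-code filter, so checking the URLs yourself, e.g. everything in your sitemap or a crawl export, is the usual route):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def check_urls(urls, timeout=10):
    """Issue a HEAD request per URL and record the final HTTP status code."""
    results = {}
    for url in urls:
        try:
            resp = urlopen(Request(url, method="HEAD"), timeout=timeout)
            results[url] = resp.status
        except HTTPError as err:
            results[url] = err.code   # 4xx/5xx responses raise HTTPError
        except URLError:
            results[url] = None       # DNS failure, timeout, connection refused
    return results


def broken_urls(status_by_url):
    """Return the URLs recorded as 404, sorted for stable output."""
    return sorted(url for url, status in status_by_url.items() if status == 404)
```

`broken_urls(check_urls(urls_from_sitemap))` then gives just the dead pages; pair it with a GSC coverage export to see which of them Google has actually indexed.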
-
Temporary Duplicate Sites - Do anything?
Hi Mozzers - We are about to move one of our sites to Joomla. This is one of our main sites and it receives about 40 million visits a month, so the dev team is a little concerned about how the new site will handle the load. Dev's solution, since we control about 2/3 of that traffic through our own internal email and cross-promotions, is to launch the new site and not take down the old site. They would leave the old site on its current URL and make the new site something like new.sub.site.com. Traffic we control would continue to the old site; traffic that we detect as new would be redirected to the new site. Over time (they think about 3-4 months) they would shift all the traffic to the new site, then eventually change the URL of the new site to be the URL of the old site and be done. So this seems to be a duplicate content (whole site) issue at the outset. I think the best course of action is to try to preserve all SEO value on the old URL, since the new URL will eventually go away and become the old URL. I could consider temporary no-crawl/no-index tags on the new site while both sites exist, but would that be risky since that site will eventually need to remove those tags and become the only site? Rel=canonical temporarily from the new site to the old site also seems like it might not be the best answer. Any thoughts?
-
Adding Orphaned Pages to the Google Index
Hey folks, How do you think Google will treat adding 300K orphaned pages to a 4.5 million page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files. These pages are super low competition. The plot thickens: what we are really after is to get 150k real pages back on the site. Those pages do have crawlable paths on the site, but in order to do that (for technical reasons) we need to push these other 300k orphaned pages live (it's an all-or-nothing deal). a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned? b) If these pages will just fall out of the index or not get included, and have no chance of ever accumulating PR anyway since they are not linked to, would it make sense to just noindex them? c) Should we not submit sitemap.xml files at all, and take our 150k and just ignore these 300k and hope Google ignores them as well since they are orphaned? d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic, but we don't want to do that if it could be an issue with Google. Thanks for your opinions, and if you have any hard evidence either way, especially thanks for that info. 😉
-
Site navigation menu in head of page for SEO
We are considering expanding our site navigation menu (horizontal) at the top of our pages. However, once implemented, this would include around 30-40 links at the top of the page, before the content of the page. How much effect (good/bad) would this have on SEO? Are there any other options? (Perhaps rendering the menu after the main content and positioning it with CSS?) Any thoughts, suggestions, or ideas would be greatly appreciated.