I added an SSL certificate this morning and now I noticed duplicate content
-
Ok, so I'm a newbie, therefore I make mistakes! Lots of them.
I added an SSL certificate this morning because it was free and I read it can help my rankings. Now I just checked the site in Screaming Frog and saw duplicate content pages due to the https version.
So I'm panicking! What's the easiest way to fix this? Can I undo an SSL certificate?
I guess what I'm asking is: what's the easiest fix that will also be best for ranking?
Thank you!!
Rena
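(As an editorial aside: the duplicate detection Screaming Frog reports here can be sketched in a few lines of Python. This is a simplified, hypothetical illustration, not how Screaming Frog actually works; the sample URLs are from this thread.)

```python
# Hypothetical sketch: spot http/https duplicates in a list of crawled URLs,
# the way a crawler flags the same page reachable under both schemes.
from urllib.parse import urlsplit
from collections import defaultdict

def find_scheme_duplicates(urls):
    """Group URLs by host + path, ignoring the http/https scheme."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        groups[(parts.netloc, parts.path)].append(url)
    # Keep only pages that were found under more than one URL
    return {key: found for key, found in groups.items() if len(found) > 1}

crawl = [
    "http://intercallsystems.com/nurse-call-manufacturer/",
    "https://intercallsystems.com/nurse-call-manufacturer/",
    "https://intercallsystems.com/contact-us/",
]
dupes = find_scheme_duplicates(crawl)
for (host, path), found in dupes.items():
    print(host + path, "has", len(found), "versions")
```

Each key the sketch prints is one page that search engines could see as two competing URLs.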
-
Since you are on WordPress, install the "Really Simple SSL" plugin: https://really-simple-ssl.com/
You have a mixed content warning as well as the redirect problem. Really Simple SSL will fix both pretty painlessly. It's worth the $25 for the premium version, but the free version is also great.
Also, it looks like your host may be WP Engine? They can work with you to help as well.
I see the mixed content warning if I go directly to the page: https://intercallsystems.com/nurse-call-manufacturer/
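(Editorial note: the "mixed content" check a browser does — flagging http:// assets loaded by an https page — can be sketched roughly like this. It is a simplified, hypothetical scanner; real browsers inspect many more attributes and tags.)

```python
# Hypothetical sketch of a mixed-content scanner: flag http:// assets
# referenced from a page that is served over https.
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        # Only asset-loading tags trigger mixed-content warnings;
        # a plain <a href="http://..."> link is fine.
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                if tag in ("img", "script", "link", "iframe"):
                    self.insecure.append((tag, value))

page = '<img src="http://intercallsystems.com/logo.png"><a href="http://example.com/">ok</a>'
finder = MixedContentFinder()
finder.feed(page)
print(finder.insecure)  # only the <img> is flagged, not the <a> link
```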
-
Hi Brian, I'm sorry to bug you, but if you don't mind... I'm still confused and having a hard time wrapping my brain around this. This is what I do know: when I type in intercallsystems.com it automatically goes to https://intercallsystems.com.
But if I type in any other page, like http://intercallsystems.com/nursecallsystems/the-equinoxlegend-systems/, it doesn't automatically redirect.
Only the homepage does.
Also,
When I put my site through Screaming Frog I get duplicate title issues, duplicated H1 tags, and so on for some pages, like:
http://intercallsystems.com/nurse-call-manufacturer/
https://intercallsystems.com/nurse-call-manufacturer/
So do I need to 301 redirect to https, or do I need to use rel canonical?
This is what my current .htaccess file looks like:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Redirect 301 /intercallsystems.com/intercall-nurse-call-systems/ http://intercallsystems.com/nursecallsystems
Redirect 301 /about-us http://intercallsystems.com/nurse-call-manufacturer
Redirect 301 /cat/avstas.html http://intercallsystems.com/
Redirect 301 /contact.html http://intercallsystems.com/contact-us/
Redirect 301 /cat/product.html http://intercallsystems.com/nursecallsystems
Redirect 301 /legal.html http://intercallsystems.com
Redirect 301 /8345spec.html http://intercallsystems.com
Redirect 301 /patsta.html http://intercallsystems.com
Redirect 301 /employment.html http://intercallsystems.com/about-us/employment/
Redirect 301 /is/index.html http://intercallsystems.com/nursecallsystems/
Redirect 301 /intercall-nurse-call-systems/the-equinoxlegend-systems/ http://intercallsystems.com/nursecallsystems/the-equinoxlegend-systems/
Redirect 301 /the-audio-visual-system/ http://intercallsystems.com/nursecallsystems/the-equinoxlegend-systems/
Redirect 301 /nurse-call-systems/the-ultra-series/ http://intercallsystems.com/nursecallsystems/ultra-system/
Redirect 301 /systems/ http://intercallsystems.com/nursecallsystems/
Redirect 301 /about-intercall-systems/ http://intercallsystems.com/nurse-call-manufacturer/
Redirect 301 /ultra-touch-screen-master/map-1550/ http://intercallsystems.com/nursecallsystems/ultra-system/
Redirect 301 /the-vista-series/ http://intercallsystems.com/nursecallsystems/vista-series/
Redirect 301 /intercall-systems/ http://intercallsystems.com/nursecallsystems/
Redirect 301 /nurse-call-systems/the-ultra-system/ http://intercallsystems.com/nursecallsystems/ultra-system/
Redirect 301 /nurse-call-systems/the-audio-visual-system http://intercallsystems.com/nursecallsystems/the-equinoxlegend-systems/

I really appreciate the help!
Rena
-
Thank you! Actually, I suspected something was up because the site has had more downtime than usual. I wasn't sure what to do. Thanks!
-
My pleasure, glad you were able to get that fixed rather quickly!
Yes, I would set up a new property in Search Console with the https version and resubmit the new sitemap and all that fun stuff. Then you can delete the old property to keep things neat.
One thing I want to mention: I noticed your site is on a shared hosting server with Bluehost. You may want to see about moving onto a dedicated server with them to play it safe. You can run into malware issues, and the server can be slowed down when it's loaded up with sites like that. Run your site through this tool and you will see that there are several other sites sharing the same IP address as yours. I am not sharing this to make you panic, because there is no reason to; it's just so you are aware and can make an informed decision.
http://www.ipfingerprints.com/reverseip.php
Here is an article on the topic that can help shed more light on the risks. I am super picky about where my sites are hosted and about page speed, so I always steer clear of shared hosting environments.
-
Thank you Brian,
It looks like my hosting provider automatically did it. When I go to the homepage it goes directly to the https version, and when I look at it in the MozBar I see:

http://intercallsystems.com/ → HTTP/1.1 301 Moved Permanently
https://intercallsystems.com/ → HTTP/1.1 200 OK
So now my new question is: do I have to create an https version in Webmaster Tools, submit the sitemap, and do the Data Highlighter all over again?
Thank you for the help!
-
Sorry, I just went back and read that you are new to SEO! My apologies. Check out this article for more info on .htaccess redirects.
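(Editorial aside: a couple of common mistakes in `Redirect 301 <source> <target>` lists can be caught with a quick sanity check. The rules below are my own assumptions, sketched in Python, not an official validator: Apache's `Redirect` wants a URL path as the source, and once the site is on SSL the targets should point at https so visitors don't bounce through an extra hop.)

```python
# Hypothetical sanity checks for "Redirect 301 <source> <target>" pairs.
def check_redirect(source, target):
    problems = []
    # The source must be a URL path like /old-page/, never a full URL
    # or a path that mistakenly includes the domain name.
    if not source.startswith("/") or "://" in source or source.startswith("/intercallsystems.com"):
        problems.append("source should be a path like /old-page/, not a domain or full URL")
    # After moving to SSL, targets should use https.
    if target.startswith("http://"):
        problems.append("target still uses http; point it at the https version")
    return problems

# First pair: source wrongly includes the domain, and the target is http.
print(check_redirect("/intercallsystems.com/intercall-nurse-call-systems/",
                     "http://intercallsystems.com/nursecallsystems"))
# Second pair: clean source path, https target -- no problems reported.
print(check_redirect("/about-us", "https://intercallsystems.com/nurse-call-manufacturer"))
```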
-
Next task: set up the 301 redirection from http to https.
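(Editorial aside: a common way to do a site-wide http → https 301 in .htaccess looks something like the sketch below. It assumes mod_rewrite is available and would go above the `# BEGIN WordPress` block; adjust the domain and test on a staging copy first.)

```apache
# Sketch of a site-wide http -> https 301 redirect (assumes mod_rewrite).
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://intercallsystems.com/$1 [R=301,L]
</IfModule>
```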
-
Hey Rena!
I would just redirect that duplicate page to the new https version and call it a day!
Keep the SSL; just go through like you are and check to make sure everything is redirecting properly. You should be good to go. Hope this helps! Cheers