Latest posts made by gkellyiii

Foreign Language Directories
I have a client whose site serves each page in multiple languages, with each language version in its own directory. Needless to say, every version shows up with the same site title, meta data, and content, and when my campaigns are crawled these pages report thousands of page errors. Should I add each of these directories to robots.txt? Would that fix the duplicate content issue?
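For context, what I'm considering adding would look something like the sketch below. The /fr/, /de/, and /es/ directories are just placeholders for the client's actual language folders, not their real paths:

# robots.txt - rough sketch; /fr/, /de/, /es/ stand in for the real language directories
User-agent: *
Disallow: /fr/
Disallow: /de/
Disallow: /es/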
Secure Vs Non-Secure Redirects
I have a client who has a lot of duplicate pages on their site: each page has a secure and a non-secure counterpart. I'm not sure why they have this in place, but I recommended that they 301 redirect one version to the other (or vice versa).
I am getting some questions as to why they should do this. Does anyone have a good document outlining the reasoning? For me it's just a matter of cleaning up duplicate content, but I'm wondering if there is any technical data out there.
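What I recommended would look roughly like this on Apache (I'm assuming mod_rewrite is available; their actual server may differ, and the redirect could just as easily go the other direction), sending the non-secure pages to their secure counterparts:

# .htaccess - rough sketch; assumes Apache with mod_rewrite enabled
RewriteEngine On
# If the request did not arrive over HTTPS, 301-redirect it to the same URL on HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]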