What is the best way to manage duplicate content in an international portal?
-
We have a portal that serves only Spain, and we have started internationalizing it for Argentina, Mexico, and Colombia.
Previously the .com domain held content only for Spain; now that domain is going to be global, so:
.com contains all the content, and you can filter by country
.es contains the Spanish content
.com.ar contains the Argentinian content
Everything is fine, except that some content (online courses) exists in every country.
What we thought of doing is:
- Online content URLs: canonical to the .com domain
- Geo content URLs: canonical to the .es or .com.ar domain (depending on the geo)
Since country filters on .com can produce results very similar to .es, we would either use no canonical URL there, or follow the rules above (if a .com filter is geo-specific, canonical to that geo domain; if the filter is "online courses", canonical to the .com domain).
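To illustrate the plan (all URLs here are made up for the example, not our real paths), the canonical tags on each type of page would look something like this:

```html
<!-- Hypothetical example: on the Argentinian copy of a shared online course,
     e.g. https://www.example.com.ar/cursos/curso-online-seo/,
     the canonical points back to the global .com version -->
<link rel="canonical" href="https://www.example.com/courses/online-seo-course/" />

<!-- Whereas a geo-specific result under a .com country filter,
     e.g. https://www.example.com/ar/cursos-presenciales/,
     would point to the matching page on the geo domain -->
<link rel="canonical" href="https://www.example.com.ar/cursos-presenciales/" />
```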
What do you think about that?
Thanks in advance.
-
Do the geotargeting in Google Webmaster Tools. You don't need to canonicalize to the .com. Also make use of the meta language and country attributes, and make sure your pages are adapted to the local version of Spanish. It's best to get a native speaker to check, as there can be a surprising number of differences.
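The language/country markup would typically be hreflang alternates. A sketch with hypothetical URLs (each country version of the same course page lists every alternate, including itself, plus an x-default fallback):

```html
<!-- Hypothetical URLs for one shared online-course page; swap in your real ones.
     The same block must appear on every listed version for the annotations to be valid. -->
<link rel="alternate" hreflang="es-es" href="https://www.example.es/cursos/curso-online/" />
<link rel="alternate" hreflang="es-ar" href="https://www.example.com.ar/cursos/curso-online/" />
<link rel="alternate" hreflang="es-mx" href="https://www.example.com/mx/cursos/curso-online/" />
<link rel="alternate" hreflang="es-co" href="https://www.example.com/co/cursos/curso-online/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/courses/online-course/" />
```

This way the duplicated course pages can coexist per country and Google serves the right one, without canonicalizing the geo versions away.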