Content by Country
-
We currently run a news website aimed at several countries. We want to filter the content of some URLs (the home page, category pages, etc.) based on the visitor's country of origin.
For example, on the home page we show news of global interest, plus a column with news from the visitor's country of origin.
Could this affect our rankings or cause a Google penalty?
Thank you very much.
-
According to Matt Cutts, this sort of thing is OK - http://www.youtube.com/watch?v=GFf1gwr6HJw - provided you're not treating Googlebot differently, which would be cloaking.
However, as I said above, I have seen sites that implemented this sort of thing get penalised - likely because they inadvertently tripped some sort of cloaking filter (even though, strictly speaking, what they're doing isn't cloaking).
Hope this helps,
Hannah
-
Thanks for the replies.
Let me show you what we're doing on the site, currently in beta:
gestion.org/beta
All content is the same for all countries, except the middle column, which shows posts matching the user's country. So far we only cover four countries (Spain, Mexico, Argentina, and Colombia). If a user is not in any of these countries, we show general posts.
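The logic described above - a country-specific middle column with a fallback to general posts - could be sketched roughly like this (a hypothetical illustration; none of these names come from the actual site):

```python
# Hypothetical sketch of the setup described above: every country sees the
# same page, except the middle column, which is filled per country.

SUPPORTED_COUNTRIES = {"ES", "MX", "AR", "CO"}  # Spain, Mexico, Argentina, Colombia

def posts_for_visitor(country_code, posts_by_country, general_posts):
    """Pick the middle-column posts for a visitor's two-letter country code."""
    if country_code in SUPPORTED_COUNTRIES:
        # Use the country feed when we have one; otherwise fall back.
        return posts_by_country.get(country_code, general_posts)
    # Visitors outside the four supported countries get the general posts.
    return general_posts
```

Note that under this scheme a crawler arriving from a US IP would always fall into the general-posts branch, so the country columns themselves would never be crawled or indexed.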
Is this OK to do?
-
Hi Jose,
Are you talking about doing an IP redirect to country-specific URLs? Or changing the content displayed based on the user's IP?
If so, I don't really like either of those options. Automatically redirecting a user based on IP can cause indexation issues, as Google crawls from a US IP, and if you're not careful US content is all they'll be able to see.
Similarly, if you're talking about dynamically delivering page content based on IP, you're again likely to have indexation problems, as Google may only be able to crawl and index the US content; plus (more importantly) I've seen sites penalised in the past for doing this sort of thing.
If you want to direct visitors to the most appropriate content I quite like the approach that Cheap Flights use - if you visit cheapflights.com from a UK IP you're pushed to this page - http://www.cheapflights.com/workers/profile-select.aspx?sref=CFUK&redirect=GeoIP&geoip=GB&cfref=CFUS&spt=Home&rp=/
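The Cheap Flights pattern above - detect the visitor's country, but offer a selection page rather than silently swapping the page's content - could be sketched like this (hypothetical names, not their actual implementation):

```python
# Hypothetical sketch of the "offer a choice" pattern: each URL always
# serves the same content; a mismatched country triggers a visible
# redirect to a selection page, and an explicit choice is remembered.

def response_for(visitor_country, site_country, chosen_country=None):
    """Serve the page unchanged, or suggest the country-select page."""
    if chosen_country is not None:
        # The visitor already made an explicit choice; honour it.
        return ("serve", chosen_country)
    if visitor_country == site_country:
        return ("serve", site_country)
    # Mismatched country: send the visitor to a selection page rather
    # than rewriting this URL's content, so every URL stays identical
    # for all requesters, including crawlers.
    return ("redirect", "/profile-select?geoip=" + visitor_country)
```

Because the content of each URL never varies by IP, this approach avoids both the indexation problem and the risk of looking like cloaking.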
I hope this helps,
Hannah
-
Can I have your site URL? I'm not certain, but if you have filtered your content by country of origin, it should not affect your ranking or cause a Google penalty.