Could using the same content and just changing the keywords be seen as duplicate content?
-
I want to offer the same service or product in many different cities. Instead of creating new content for each city, what I want to do is copy the content already created for the product or service in one city, change the name of the city, and create a new URL on my website for each city.
For example, let's say I sell handmade rings in the USA, but I want to target each major city in the USA, so I want to have a unique URL for each city. For example, for
Miami I want to have www.mydomain.com/handmade-rings-miami
and for LA the url would be www.mydomain.com/handmade-rings-la
Can I have the same content talking about the handmade rings and just change the keywords and key phrases, or will this count as duplicate content?
Content:
TITLE: Miami Handmade Rings
URL: www.mydomain.com/handmade-rings-miami
Shop Now handmade rings in Miami in our online store and get a special discount in Miami purchases over $50 and also get free shipping on Miami Local address...
See what our Miami handmade rings clients say about our products....
TITLE: LA Handmade Rings
URL: www.mydomain.com/handmade-rings-la
Shop Now handmade rings in LA in our online store and get a special discount in LA purchases over $50 and also get free shipping on LA Local address...
See what our LA handmade rings clients say about our products....
There are more than 100 locations in the country where I want to do this, which is why I want to copy, paste, and replace.
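To be clear about the plan, here's a minimal sketch of the copy-paste-and-replace idea in Python (the city list and template wording are just placeholders, not my real copy):

```python
# Hypothetical sketch: one template, one page per city,
# with only the city name swapped in.
cities = ["Miami", "LA", "Chicago", "Houston"]  # 100+ cities in practice

template = ("Shop now handmade rings in {city} in our online store and get a "
            "special discount on {city} purchases over $50 and free shipping "
            "to {city} local addresses...")

pages = {}
for city in cities:
    slug = city.lower().replace(" ", "-")            # "LA" -> "la"
    url = f"www.mydomain.com/handmade-rings-{slug}"  # unique URL per city
    pages[url] = template.format(city=city)          # near-identical body

print(len(pages))  # one page per city
```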
Thanks in advance,
David Orion
-
This used to work superbly until about 2003. Then Google became able to identify these sites and would drop all of the pages except one or two.
Cookie-cutter pages are no longer useful.
-
Check out recent articles on the Panda update to see a lot about this topic.
I especially like this one from Vanessa Fox.
The gist of it is that Google has started to devalue sites with lots of "cloned" pages where the content is very similar from page to page, but the area info or keyword swaps out.
This is based on the idea that content that is tailored to a page's topic is more useful to the user, and it is more likely to carry nuances than broader content.
So, for example, I can talk about hotels in Dallas or New York the same exact way: "See Dallas/New York hotels downtown and be close to all of the hip restaurants and activities."
Or I could go with something more Dallas-specific: "See our downtown Dallas hotels near the American Airlines Center, the Dallas Museum of Art, with direct access to the DART trains."
The second example is WAY more useful to the user, so Google's latest updates will tend to favor it over generic topical text.
So if you can afford copywriting for each topic, I would do it.
If you can't, I would start by picking out the highest-value terms and building content specific to those. Use generic content for the other terms for now, but slowly replace it over time with more valuable content as well.
You may be in a situation where there isn't more valuable, specific content to outrank you right now, so the generic content will do decently in the rankings. In that case you may not see such a horrible ranking drop, but when someone eventually competes with you and catches on to your scheme, it will be better to have the unique content already working for you.
Just my 2 cents
-
Yes, this will be seen as duplicate content. There's no "easy" way to create unique content for each locale. The best you can do is have a general outline and outsource the content creation to a high quality writer (or sit down and start writing yourself). Article spinning or cheap writers will lead to low quality articles that will be difficult to get to rank well.