Duplicate content issue
-
I have recently built a site with a main page intended to rank for national coverage. The site also has a number of pages targeted at local searches; these pages are slight variations of each other with town-specific keywords. Does anyone know if Google will see this as spam and quarantine my site from ranking? Thanks
-
You want to rank for local searches, right? The question is: do you have a physical presence in those places? If not, creating city-specific pages just to rank for them will definitely invite a penalty sooner or later. Think about the customers first, not the search engines.
Now, if you do have branches in those cities, you can create Google Local listings and have separate landing pages for them, provided those pages say something unique about the business. Do not add rehashed content that no one is going to read. Focus on adding value to users' experience.
-
Creating a site with multiple landing pages targeted to different regions is nothing new, so Google has made updates to stop low-quality sites from capitalizing on localized keywords (Miami x, Tucson x, San Diego x, etc., where x is your main keyword).
What this means is that you need to do more than simply duplicate your pages, swap the local terms, and create new URLs, titles, and descriptions. Instead, write completely unique copy, add dynamic content and/or user engagement, build local citations for each landing page, and make sure to earn local backlinks to each landing page.
-
While I certainly don't want to pretend to be able to predict anything Google might do, to me, the fact that you are thinking about this as a potential problem should be enough to make you consider some options. Depending on how many pages you have, it may not be that difficult to get truly original content produced for those pages.
Will Google choose not to index you? I have no idea.
My guess is that you get indexed, but may not rank very high if the content is substantially similar across all of those pages. You might get stuck in the proverbial "sandbox" (ranked so low that no one can find you).
My gut says that if you have to ask "is this duplicate content?", it probably is, so make it unique.
Related Questions
-
Crawler issues
Can anyone please suggest why our site is not being crawled by Google at the moment? Thanks,
-
Will a .com and a .co.uk site (with the exact same content) hurt SEO?
Hello, I am sure this question has been asked before, but when I searched I could not find the right answer. I have a .com and a .co.uk site. Both sites have exactly the same products, the same product descriptions, and everything else is identical. The reason for two sites is that the .com site shows all the details to US customers in $, and the .co.uk site shows all the details to UK customers with pound signs. The only differences between the two sites are the privacy policy (different for the US and UK) and the membership groups each site belongs to (the US site belongs to a list of US trade groups, the UK site to a list of UK trade groups). Other than those minor differences, all the content is exactly the same, so will this hurt SEO for one or both sites? Our US site is much more popular and has been indexed in Google for four years, while our UK site was started just one month ago. (Both sites are hosted by the same hosting company, with one site as the main domain and the other as a domain add-on; I thought I would include this in case it matters.) I would appreciate a reply to the question above. Thanks
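A common, well-documented way to signal this kind of regional targeting to search engines is hreflang annotations linking the regional versions of each page. A minimal sketch, with the domains and path as placeholders rather than the poster's real URLs:

```html
<!-- On the US (.com) page; the same reciprocal pair of tags goes on the
     matching .co.uk page. Domains and path here are hypothetical. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product-page" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/product-page" />
```

Each page lists every regional alternate, including itself, so Google can serve the $ version to US searchers and the £ version to UK searchers rather than treating the pair as competing duplicates.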
-
I'm having duplicate content issues in WordPress
All of my pages are working fine, but I added my sitemap to the footer of my website, and when I click on my blog from the footer it takes me to the homepage. So now I have duplicate content on two different URLs. I've tried adding a rel=canonical and a 301 redirect to the blog page, but it doesn't resolve the problem. Also, after clicking Blog in the footer brings me to the homepage, clicking my pages from the navigation bar at the top of the screen takes me to the right pages, but the blog URL stays in the address bar even when I'm on other pages. Other than that, all of my pages in the footer and in the homepage toolbar work fine; it's just that one problem with the blog link in the footer and the way the blog URL sticks on every page afterward. Can someone please help? I'm using Yoast and don't know if I should disable it or what.
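If the duplicate URL is the one serving homepage content, one option (a sketch only, assuming an Apache host; both paths below are made up and must be replaced with the site's actual URLs) is a server-level 301 rule in .htaccess, which takes effect before WordPress or Yoast handle the request:

```apache
# Hypothetical example: permanently redirect the duplicate URL that the
# footer link resolves to back to the real blog page.
# Replace /duplicate-blog-path/ and /blog/ with your actual paths.
RewriteEngine On
RewriteRule ^duplicate-blog-path/?$ /blog/ [R=301,L]
```

That said, a 301 only treats the symptom; the footer menu item itself should also be edited to point at the correct blog URL so the duplicate address stops being linked at all.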
-
Penalized by duplicate content?
Hello, I am in a very weird position. I am managing a website (an EMD), part of which dynamically creates pages. The former webmaster who built this system thought it would help with SEO, but I doubt it! The site now has about 1,500 pages which must look duplicate, but are they really? Each page has a unique URL, but the content is pretty much the same: one image and a different title of 5-8 words. There is more: all these pages are not accessible to users, only to the crawlers! This URL machine is part of a PHP-made photo gallery whose purpose I never understood. The site overall is not performing well in the SERPs, especially after Penguin, but judging by the link profile, domain authority, construction (OK, besides that crazy photo gallery), and content, it has never reached the positions it should have. The majority of these mysterious pages, and mostly their images, are cached by Google, and some of them rank in top places for certain SERPs (the ones that match the short on-page title), but the numbers are poor: 10-15 clicks per month. Are these pages considered duplicates even though they are cached, and is it safe for the site to remove 1,500 of them at once? The SEOmoz tools have flagged some of them as duplicates, but not the majority! Can these pages hurt the image of the whole site in search engines? (It has dropped in Google and disappeared from Yahoo and Bing!) Do I also have to tell Google about the removal? I have not seen anything like it before, so any comment would be helpful. Thank you!
-
What reason would scrapers, and syndication sites outrank all of our content?
When I type in any of our content titles, scrapers and content syndication sites all outrank us by quite a bit. What is usually the main reason for this? I started noticing this happening quite a bit this year, and think it may have to do with Panda. Has anyone figured out the reasoning?
-
How will it affect my site if I link to a site with adult content?
We are currently working on creating two sites for a company: one with no adult content, one with adult content. Will it affect the non-adult-content site if I link to the other one, in terms of Google and being blocked by some internet providers?
-
Duplicate content on mobile sites
Hi guys, we are launching a mobile webshop later this year and have decided to use a subdomain for it (m.domainname.xx). The content will be more or less identical to the standard desktop site (domainname.xx), but I'm struggling to find out whether this will create duplicate content between the mobile and desktop sites. Does anyone have a solid answer for this one?
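For the separate-URLs setup described here, Google's documented pattern is a rel="alternate" on the desktop page pointing at the mobile URL, paired with a rel="canonical" on the mobile page pointing back, so the two versions are understood as one page rather than duplicates. A sketch with placeholder URLs:

```html
<!-- On the desktop page (www.example.xx/page): the media query tells
     crawlers this alternate targets phone-sized screens -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.xx/page" />

<!-- On the mobile page (m.example.xx/page): canonical back to desktop -->
<link rel="canonical" href="https://www.example.xx/page" />
```

The annotations must be bidirectional and map each desktop URL to its exact mobile counterpart, not just to the mobile homepage.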
-
Dynamic pages and code within content
Hi all, I'm considering creating a dynamic table on my site that highlights rows, columns, and cells depending on buttons that users click. Each cell in the table links to a separate page that is created dynamically, pulling information from a database. Now, I'm aware of the Google guidelines: "If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few." So we wondered whether we could put the dynamic pages in our sitemap so that Google could index them; the pages can be seen with JavaScript off, which is how they are manipulated to make them dynamic. Could anyone give us an overview of the dangers here? I also wondered if you still need to separate content from code on a page? My developer still seems very keen on inline CSS and JavaScript! Thanks a bundle.
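Listing the dynamic pages in a sitemap is straightforward: URLs containing "?" parameters are valid sitemap entries under the sitemaps.org protocol, as long as each one resolves to a real page. A minimal sketch with hypothetical URLs (note that ampersands between parameters must be escaped as &amp;amp; inside the XML):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical entries, one per dynamically generated cell page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/table?row=1&amp;col=2</loc>
  </url>
  <url>
    <loc>https://www.example.com/table?row=1&amp;col=3</loc>
  </url>
</urlset>
```

A sitemap helps crawlers discover these URLs but does not guarantee indexing; thin, near-identical cell pages may still be filtered regardless of how they are submitted.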