How much does on-site duplicated content affect SERPs?
-
Hi,
We've recently started using Moz for our e-commerce websites and discovered that its crawler flags roughly 2,500 pages as duplicates of one another. We've now begun completely rewriting every product description (including the meta title and description) to bring that number down.
Since this is the biggest issue Moz spots, I'm wondering what effect fixing it will have on our position in the SERPs (mainly Google). Does anybody have stories or experience to share on this topic?
Thanks in advance!
Alexander
-
Thanks for your insight! We're trying to get all of our pages to an A grade in Moz's on-page analysis, each optimised for a different keyword.
-
If the duplicate content was duplicated on other sites, then by rewriting it and making it unique, you may see your site ranking for keywords it didn't rank for previously. Google and Bing try to weed out duplicate content: they prefer to show just one result with that content, so they typically pick the page they think is the original and include only that page in the search results (at least in the first few pages). So, if your site was being kept out of the search results because Google didn't pick it to display, you may now start to rank for those keywords.
As to how dramatic a change it will be for you, it depends on how competitive the keywords are, how well you optimize the pages, and how authoritative your website is.
Kurt Steinbrueck
OurChurch.Com -
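For on-site duplicates like the ~2,500 pages Moz flagged, rewriting is the thorough fix, but where near-identical pages must coexist (say, colour variants of one product), a canonical link is a common complement. A minimal sketch, with a hypothetical URL for illustration:

```html
<!-- In the <head> of each near-duplicate variant, pointing search
     engines at the one version you want indexed
     (hypothetical URL, not from the thread) -->
<link rel="canonical" href="http://www.example-shop.com/product/blue-widget/" />
```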
Yeah! We're redoing all content, including categories, which of course we want to rank for their specific keywords.
-
Depends on how fast your pages get crawled again, which in turn depends on your niche and your domain trust.
I've gotten badly optimised e-commerce sites to rank just by making great on-page SEO changes. In a highly competitive space you'll see less drastic jumps, but you'll still be able to see them. Be sure to add great unique content on your category pages, OK?
Related Questions
-
Site move?
What happens in a site move from a subdomain to a new domain? How does it affect the root domain of the subdomain, and is the subdomain's SEO transferred to the new domain?
Web Design | Jjjay1 -
Sitemap created on client's Joomla site but it is not showing up on site reports as existing? (Thumbs Up To Answers)
I am working with a web developer who built our client's site in Joomla, and I seem to have a lot of issues with Joomla-based sites. Anyhow, the site is www.pitgearusa.com, and when we run site reports they show there is no XML sitemap. However, he used a popular Joomla sitemap plugin called Xmap (http://www.jooxmap.com/). Can anyone advise what the developer needs to do for the XML sitemap to function and "show up" on reports? Thanks, Mashed Up
Web Design | Atlanta-SMO0 -
Parallax, SEO, and Duplicate Content
We are working on a project that uses parallax to provide a great experience for the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize, and multiple pages with the parallax function built in. Basically, each item in the primary navigation is its own page, with all subpages built below it using the parallax function. Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the right section based on that hashbang.

www.example.com/about < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page

We are trying to decide on the best method for optimizing each subpage, but my current concern is that because each subpage is really part of the primary page, all those URLs may be seen as duplicate content. The site can currently also serve each subpage as its own page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them there, and I don't want Google to think I'm being disingenuous by providing links that don't exist solely for SEO, but truthfully all of the content exists and is available to the user. I know a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
Web Design | PaulRonin2 -
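If the site can serve each subpage at a standalone URL, one way to surface those URLs is a plain sitemap entry. A sketch using the hypothetical URLs from the question (note that crawlers generally ignore everything after "#", so the hashbang addresses themselves would not be indexed as separate pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The parallax parent page -->
  <url><loc>http://www.example.com/about</loc></url>
  <!-- Standalone, non-parallax version of the subpage -->
  <url><loc>http://www.example.com/about/history</loc></url>
</urlset>
```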
Site structure: category pages
Hi, I'm relatively new to SEO but have tried to apply all best practices to my site. However, I've hit a stumbling block over whether or not to index my category pages (e.g. http://istudyenglishonline.com/category/expressions-idioms/).

General info: the site is built with WordPress and has a directory of English idioms. Each idiom is associated with one or more categories (emotions, sports, food, etc.), and each category has its own page listing its idioms. Because an idiom often has more than one category, the same idiom appears on different category pages, creating duplicate content. I have, however, given each category page its own unique description. The issue is that when there are numerous idioms, a category runs to more than one page, and I can't create a unique description for each subsequent page of the main category.

I know that the very model of some vertical search engines (such as indeed.com) is to create such landing pages: the more "categories" they assign to their job ads, the more pages get created and indexed in Google, and this seems to work very well for them. My question is, am I doing things right? Should I be doing anything to the subsequent category pages to avoid duplicate content? My plan was to associate enough idioms with enough categories that a fair number of landing pages get indexed in Google, targeting long-tail keywords, but I'm not sure I'm going the right way. Any advice would be much appreciated!
Web Design | villarroel0 -
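One common pattern for the paginated category pages (page 2 and beyond) is to keep them crawlable but out of the index, so the repeated idiom listings can't compete with the main category page. A sketch, not WordPress-specific advice; the `/page/N/` URLs follow the question's permalink style:

```html
<!-- In the <head> of /category/expressions-idioms/page/2/ and beyond:
     crawlers still follow links to the individual idioms, but the thin
     paginated page itself stays out of the index -->
<meta name="robots" content="noindex,follow" />

<!-- Optionally declare the pagination sequence as well -->
<link rel="prev" href="http://istudyenglishonline.com/category/expressions-idioms/" />
<link rel="next" href="http://istudyenglishonline.com/category/expressions-idioms/page/3/" />
```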
Build New Site Without Losing Rankings
Good morning, SEOmoz community. I have a question I'm pretty sure I already know the answer to, but I thought I would reach out to my fellow experts to see if anyone had some great advice. I would really like to give my website a makeover, and I have two thoughts on this: one is to scrap the site completely and start fresh; the other is to change it only visually but keep all the content and on-page optimization. I am terrified of losing my rankings. I rank in positions 1 and 2 for highly competitive terms and have another 15-20 keywords on page 1. Any advice would be tremendously appreciated!
Web Design | WebbyNabler0 -
How many sites on one hosting account?
How many sites is it safe to house with one hosting provider? I use BlueHost, and they advertise unlimited domains, but I'm not sure what the negative side effects of hosting too many on one service might be. If it matters at all, I'm using WordPress to build my sites. Pros and cons?
Web Design | leafndrop0 -
Content Stacking - CSS positioning
I was curious what everyone thinks about using CSS positioning so that the spiders read the most important bulk of content first, before the rest. Say I have tabs set up for navigational purposes, where the content in the last tab is actually what I want the bots to see first. What would be best practice for accomplishing something like this?
Web Design | imageworks-2612900 -
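The usual answer is to let source order and visual order differ: put the content you want read first early in the HTML, then reorder it visually with CSS. A minimal sketch using flexbox `order` (the same effect was traditionally achieved with floats or absolute positioning):

```html
<style>
  .tabs     { display: flex; flex-direction: column; }
  .tab-nav  { order: 1; }  /* rendered first for users */
  .priority { order: 2; }  /* rendered second, but first in the source */
</style>
<div class="tabs">
  <!-- First in source order, so spiders reach it before the tab labels -->
  <div class="priority">Main content the bots should see first…</div>
  <ul class="tab-nav">
    <li>Tab 1</li><li>Tab 2</li><li>Tab 3</li>
  </ul>
</div>
```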
Two URLs with same content
We recently had a client who owns multiple brands switch from multiple URLs to a single domain with multiple subdomains; I've posted an example below to better explain. My question is: the original URLs are still functional, so there are two URLs with identical content, yet I haven't been getting a duplicate content error. Also, would a rel canonical link be beneficial in this case, since the duplicate content is on two separate domains? My thought was to put a 301 redirect on the original pages so they permanently forward to the new subdomain format. Is this the best course of action? If not, what would you recommend?

Example:

Original URLs:
www.example1.com
www.example2.com
www.example3.com
www.parentcompany.com

New URLs:
example1.parentcompany.com
example2.parentcompany.com
example3.parentcompany.com
www.parentcompany.com

Let me know if I need to clarify anything in better detail. Thanks in advance!
Web Design | BluespaceCreative0