Glossary index and individual pages create duplicate content. How much might this hurt me?
-
I've got a glossary on my site with an index page for each letter of the alphabet. Each letter page lists every definition in full, so the M page shows the complete text of every M definition.
But each definition also has its own individual page (and we link to those pages internally so the user doesn't have to hunt down the entire M page).
So I definitely have duplicate content: 112 instances (one per term). Maybe it's not so bad, since each definition is just a short paragraph?
How much does this hurt my potential ranking for each definition? How much does it hurt my site overall?
Am I better off making the individual pages noindex, or canonicalizing them?
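For reference, "canonicalizing them" would mean a link element like this in the head of each individual definition page, pointing back to its letter index. This is a minimal sketch with placeholder URLs, not the actual site:

```html
<!-- On an individual definition page (e.g. a term starting with M),
     declaring the letter index as the canonical version.
     The domain and paths here are hypothetical placeholders. -->
<link rel="canonical" href="https://example.com/glossary/m/">
```

Note that rel=canonical is a hint, not a directive: Google may ignore it if the two pages don't look like near-duplicates, which is relevant here since the index page is much longer than any single definition.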
-
Thanks, Ryan!
-
From here: http://moz.com/messages/write to Dirk's username: DC1611. There used to be a button in profiles, but it looks like it got shuffled in the redesign.
-
PM? Does Moz offer that function?
-
It's a bit difficult to assess which of the pages is more important without knowing the site. Having a lot of content is good, but if the only thing linking the content together is that it all starts with the same letter, the index page could be pretty weak or pretty strong depending on the situation.
I'll give 2 examples:
Suppose the index is of first names starting with S. In this case the page is a valuable one, because a lot of people are searching for exactly that, and the search volume is potentially bigger than the number of people looking for the first name Steve (one specific item).
Suppose the index is of illnesses starting with S. In this case the index page has very little value for a searcher, because people search for illnesses based on their symptoms; the fact that the illnesses start with S doesn't link them together.
It could be helpful if you sent me the actual URLs via PM, if you don't want to disclose them here.
rgds
Dirk
-
Oops. Sorry. Poor wording there. Meant to say ...
Definitely not concerned that the M index page and the M definition page BOTH show up in the search results.
We definitely do want at least one of the pages to not only show up in the rankings, but to rank highly. I'm guessing the M index page would actually have a chance of ranking high, because it contains so many long-tail terms related to our short-tail keyword.
But it would seem weird to put a noindex on the M definition pages, since we have multiple internal links to those pages.
Thanks again for your patience. Really appreciate the feedback.
Steve
-
That's exactly what I am saying: from Google's perspective, your index page with all the definitions is completely different from the detailed definition page (the first being much richer in content than the second). If getting these pages ranked is the least of your concerns, you can keep it as it is. If you want to play it safe, you can put a noindex on the index page.
rgds,
Dirk
-
Just having a bit of a dilemma. Trying to make it easier for people who come to the glossary and then go to, say, the M page, so they don't have to keep clicking away to see each definition. Result: More user-friendly.
But we also want to have a very specific definition page so that when we link from an article to the definition, the user doesn't have to see all of the M definitions. Result: More user-friendly.
Definitely not concerned that both the M index page and the M definition page show up in the search results. That would actually be swell. Just more concerned that our overall site ranking or domain authority will somehow suffer.
If you're saying that the M index page and the M definition page are dramatically different (because the M index page is much, much longer) and so I shouldn't worry, that's great. (Hope that's what you're saying.)
Thanks!
-
Hi,
As far as I understand, it's not really a question of duplicate content in the SEO sense. Although all the definitions starting with M appear on the M index page, that page is quite different from the pages that contain the individual definitions of the terms starting with M.
A problem on many sites is that the pages containing the explanation of only one term are very light on content, and the page listing all the terms is generally not very interesting from a user (and search) perspective. I don't know your site, so it's difficult to assess whether this is the case.
You could make the index page noindex/follow and just list the terms, linking to the explanation pages. For the explanation pages, which are probably the most interesting for users and search engines: try to enrich them by adding more content, like links to articles on your site that use the term, or more information about the term.
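The noindex/follow setup described above is a one-line robots meta tag in the head of each letter index page. A minimal sketch, assuming a hypothetical letter index URL (the page and URL are placeholders):

```html
<!-- On the letter index page (e.g. a hypothetical /glossary/m/):
     keep this page out of the search index, but let crawlers follow
     its links through to the individual definition pages -->
<meta name="robots" content="noindex, follow">
```

The individual definition pages carry no tag at all, so they stay indexable and continue to receive the internal links from articles.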
Hope this helps,
Dirk