Duplicate Content Problems
-
Hi, I'm new to the SEOmoz community; I've been browsing for a while now.
I put my new website into the SEOmoz dashboard, and out of 250 pages crawled I have 120 errors! The main problem is duplicate content. We are a website that finds free content sources for popular songs/artists. While SEO is not our main focus for driving traffic, I wanted to spend a little time making sure our site is up to standards. That said, when two songs by an artist are loaded, e.g. http://viromusic.com/song/125642 and http://viromusic.com/song/5433265, SEOmoz says they are duplicate content even though they are two completely different songs. I am not exactly sure what to do about this situation.
We will be adding more content to our site, such as a blog, artist biographies, and commenting; maybe this will help? Although if someone plays multiple Bob Marley songs, the biography that loads will be the same for each of them.
Also, when a larger playlist is loaded, e.g. http://viromusic.com/playlist/sldvjg, I'm getting an error for too many links on the page (some of the playlists have over 100 songs). Any suggestions?
Thanks in advance and any tips or suggestions for my new site would be greatly appreciated!
-
It's not going to get them deindexed, and it's unlikely to be the root cause of them not ranking high.
There's honestly not much content on each page, nor is it easy to tell which keywords you are targeting, since the pages cannibalize each other; that makes it harder to rank well.
-
Thanks for the reply! I guess I'll try to find a good way to make the content different. Do you think these pages will get deindexed, or will they just have little chance of ranking high?
-
A few things. "Too many links on a page" happens all the time; it's a message I get too, but because of the way my site is built, that's just how it is. From the way your menus and such are set up, it may simply be something you have to accept about your design. It's not going to kill you, but unless you want to add deeper levels to your site so people click through fewer links from the main page, that's how the "issue" will keep presenting itself. You are not alone in having this happen.
As for duplicate content, something to think about: seeing them as two different songs is how you as a person look at it; you need to look at it the way a crawler does. Almost everything else on those two pages is the same, except the song featured in the center and maybe some of the songs listed on the right. Even the page titles are similar. Taking a rough stab, only about 15% of the content differs between the two pages, most noticeably the featured song. Again, this is an outcome of your site design, and you aren't alone in having this problem. Either redo your site design or add more differentiated content (change the page titles, change what data feeds into the "more songs" section, etc.).
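If the song data is already in a database, differentiating the page titles and meta descriptions can be as simple as templating the unique fields first. A minimal sketch, assuming hypothetical field names ("name", "artist", "album"); nothing here is ViroMusic's actual schema:

```python
# Hypothetical sketch only: field names and the site name are assumptions,
# not ViroMusic's actual code.

def build_head_tags(song):
    """Return (title, meta_description) differentiated per song page."""
    # Lead with the unique data (song, artist, album) so crawlers see
    # distinct titles even though the page template is shared.
    title = f"{song['name']} by {song['artist']} - Free Streaming | ViroMusic"
    description = (
        f"Listen to \"{song['name']}\" by {song['artist']}"
        + (f" from the album {song['album']}" if song.get("album") else "")
        + " for free on ViroMusic."
    )
    return title, description
```

The point is not the exact wording but that every song page emits a title and description built from data unique to that song.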
Related Questions
-
Simple duplicate content query
Hello Community, One of my clients runs a job board website. They are having a new framework installed, which will require them to delete all their jobs and re-add them. The same jobs will be re-posted but with different reference numbers, which in turn will change each URL. I believe this will cause significant duplicate content issues, and I just thought I would get a second opinion on best practice for approaching a situation like this. Would a possible solution be to delete jobs gradually and 301 redirect old URLs to new URLs? Many thanks in advance, Adam
Technical SEO | SO_UK
-
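The 301 approach described in the question boils down to a lookup from old reference numbers to new ones, consulted before normal routing. A minimal sketch; the URL pattern (/jobs/&lt;ref&gt;) and the mapping table are hypothetical, not the client's actual scheme:

```python
# Hypothetical sketch: URL pattern and reference numbers are assumptions.
# In production this table would be exported from the old job database.

OLD_TO_NEW_REF = {
    "JOB1001": "REF-88321",
    "JOB1002": "REF-88322",
}

def redirect_for(path):
    """Return (301, new_path) if path uses an old reference, else None."""
    prefix = "/jobs/"
    if path.startswith(prefix):
        ref = path[len(prefix):]
        if ref in OLD_TO_NEW_REF:
            return 301, prefix + OLD_TO_NEW_REF[ref]
    return None  # current URL: serve the page normally
```

A permanent (301) redirect tells search engines to transfer the old URL's signals to the new one, which is why it is the usual answer for a wholesale URL change like this.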
Duplicate content w/ same URLs
I am getting high priority issues for our privacy & terms pages that have the same URL. Why would this show up as duplicate content? Thanks!
Technical SEO | RanvirGujral
-
How does Google view duplicate photo content?
Now that we can search by image on Google and see every site that is using the same photo, I assume that Google is going to use this as a signal for ranking as well. Is that already happening? I ask because I have sold many photos over the years with first-use only rights, where I retain the copyright. So I have photos on my site that I own the copyright for that are on other sites (and were there first). I am not sure if I should make an effort to remove these photos from my site or if I can wait another couple years.
Technical SEO | Lina500
-
How to fix duplicate page content error?
SEOmoz's Crawl Diagnostics is reporting a duplicate page content error. Examples of links flagged with this error are http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855 and http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852. These are not duplicate pages; some values differ between them, like the listing #, EquipNet tag #, and price. I am not sure how to highlight the things that differ between the two pages, like the EquipNet tag # and listing #. Would it help to use some style attribute to highlight such values on the page? Please help me with this, as I am not really sure why SEOmoz thinks that both pages have the same content. Thanks!
Technical SEO | RGEQUIPNET
-
How damaging is duplicate content in a forum?
Hey all! I hunted around for this in previous Q&A questions and didn't see anything. I'm just coming back to SEO after a few years out of the field and am preparing recommendations for our web dev team. We use custom-coded software for our forums, and it creates a giant swathe of duplicate content, since each post has its own link. For example: domain.com/forum/post_topic domain.com/forum/post_topic/post1 domain.com/forum/post_topic/post2 ...and so on. Since every page of the forum defaults to showing 20 posts, every forum thread that's 20 posts long has 21 different pages with identical content. Our forum is all user-generated content and is not generally a source of much inbound traffic (with occasional exceptions), but I was curious whether having a mess of duplicate content in our forums could damage our ability to rank well in a different directory of the site. I've heard that Panda is really cracking down on duplicate content, and the last time I was current on SEO trends, rel="canonical" was the hot new thing everyone was talking about, so I've got a lot of catching up to do. Any guidance from the community would be much appreciated.
Technical SEO | TheEnigmaticT
-
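The rel="canonical" approach the question mentions fits this case well: every per-post URL can declare the thread page as its canonical, consolidating the 21 near-identical pages into one. A minimal sketch, assuming the URL shape shown in the question (domain.com/forum/post_topic/postN); the helper names are hypothetical:

```python
import re

# Assumes per-post URLs end in /postN, as in the question's examples.
# Helper names are hypothetical, not from any real forum software.

def canonical_url(url):
    """Strip a trailing /postN segment so every post shares the thread URL."""
    return re.sub(r"/post\d+/?$", "", url)

def canonical_tag(url):
    # Emit this inside <head> on every page of the thread.
    return f'<link rel="canonical" href="{canonical_url(url)}" />'
```

With this in place, crawlers treat the per-post URLs as pointers to the thread page rather than as twenty duplicate documents.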
How do I fix duplicate content with the home page?
This is probably SEO 101, but I'm unsure what to do here... Last week my weekly crawl diagnostics were off the chart because http:// was not resolving to http://www. I fixed that, but now it's saying I have duplicate content on: http://www.......com http://www.......com/index.php How do I fix this? Thanks in advance!
Technical SEO | jgower
-
Duplicate Content Resolution Suggestion?
SEOmoz tools is saying there is duplicate content for: www.mydomain.com www.mydomain.com/index.html What would be the best way to resolve this "error"?
Technical SEO | PlasticCards
-
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight the verse 5 and verses 5-10 respectively. This is a good user experience because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another bible site has separate html pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea realizing that we can't revert back to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO
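Generating the proposed sitemap, one entry per chapter/verse link, is straightforward to script. A sketch assuming the URL pattern shown in the question (volume/book/chapter.verse); the function and data shape are hypothetical, and a real sitemap would also carry the xmlns attribute on &lt;urlset&gt;:

```python
from xml.sax.saxutils import escape

# Hypothetical sketch: the data shape is an assumption; only the base URL
# and path pattern come from the question's examples.

BASE = "http://lds.org/scriptures"

def sitemap_entries(refs):
    """refs: iterable of (volume, book, chapter, verse_or_None) tuples."""
    urls = []
    for volume, book, chapter, verse in refs:
        loc = f"{BASE}/{volume}/{book}/{chapter}"
        if verse:
            loc += f".{verse}"  # e.g. /nt/james/1.5
        urls.append(f"  <url><loc>{escape(loc)}</loc></url>")
    return "<urlset>\n" + "\n".join(urls) + "\n</urlset>"
```

Pairing each entry with a unique title and a meta description drawn from the verse text, as the question proposes, is what gives crawlers a page-level signal that the verse URLs differ.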