Duplicate Content being reported
-
Hi
I have a new client whose first Moz crawl report is showing lots of duplicate content.
The main batch of these are all homepage URLs with an 'attachment' parameter on the end, such as:
www.domain.com/?attachment_id=4176
As far as I can tell it's some sort of slideshow, just showing a different image in the main frame of each page, with no other content. Each one does have a unique meta title and H1 though.
What's the best thing to do here?
-
Not a problem; leave as is
-
Use the parameter handling tool in GWT
-
Canonicalise, referencing the HP
or some other solution?
Many Thanks
Dan
-
-
Hi Dan,
Actually, it looks like Ctrl+L will do it (you are creating an Excel table). You usually need to delete the first few rows from the export so that the column headers are in row 1, then select all and create the table, ticking 'My table has headers' so that you can then filter using the headers.
-
Sorry Lynn, but what is the 'windows' bit in Ctrl+Windows+L? I can't see it on my keyboard; could it have a different icon/symbol?
-
Great stuff, thanks Lynn!! I'll tell their dev to do that.
Many, many thanks
All Best
Dan
-
Cool, cheers Don
-
Hi Dan,
The robots must be getting the URLs from somewhere, so it is worth finding out where. If you download the Moz report as a CSV and open it in Excel, you can press Ctrl+Windows+L to get a filterable list. If you filter for duplicates and find these URLs on the left, then on the far right it should reference where they are being linked from. I suspect you will find pages on the site that have these images in them and are linking to the attachment_id URLs (often it is from gallery pages).
Once you have found the pages, try applying the Yoast redirects and see if they work as expected (i.e. redirect the attachment_id links to the relevant gallery page, for example). Ideally you would get rid of the links completely from the code; this will probably need a bit of dev work on the template, but it should be pretty straightforward since you are likely just removing the A tag from around the images.
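If you'd rather script that filtering step than do it in Excel, here is a minimal sketch in Python using only the standard library. The column names ('URL', 'Issue Type', 'Referrer') are assumptions, so check them against the header row of your actual export before running it:

```python
import csv
import io

def referring_pages(report_file):
    """Map each duplicate-content URL to the pages that link to it.

    The column names below are assumptions; adjust them to match the
    header row of your actual Moz CSV export.
    """
    sources = {}
    for row in csv.DictReader(report_file):
        if "duplicate" in row.get("Issue Type", "").lower():
            sources.setdefault(row["URL"], []).append(row.get("Referrer", ""))
    return sources

# made-up two-row sample standing in for the real export
sample = io.StringIO(
    "URL,Issue Type,Referrer\n"
    "http://www.domain.com/?attachment_id=4176,Duplicate Page Content,http://www.domain.com/gallery/\n"
    "http://www.domain.com/about/,Missing Meta Description,http://www.domain.com/\n"
)
print(referring_pages(sample))
```

Run it over the real CSV (`with open("moz_report.csv", newline="") as f: referring_pages(f)`) and the referrer list should point you straight at the gallery pages doing the linking.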
-
Gotcha, definitely don't want to nix pages then. I would imagine Lynn's response is more appropriate; it is likely that he is using a plugin that has been updated for better SEO practices, but he hasn't yet applied the update.
-
Many thanks Don
I'll ask the client, but I don't think so (I doubt there are any links pointing to them). Given the varying keyword-rich meta titles and H1s, though, I think the client may have implemented this for some SEO reason (he's very SEO savvy, but a bit old school) and probably isn't aware that a page needs more content than a picture, some meta tags and an H1.
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
All Best
Dan
-
Thanks Lynn
Yes, it is WP I think.
If I click on the image, it loads a page with the next image (another duplicate) in the series.
I'm not sure what the normal page is, since I can only find these via the crawl reports; they don't seem to be linked to in any site nav etc.
Does that sound to you like the best solution is via the Yoast redirects, then?
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
Cheers
Dan
-
Hi Dan,
If these pages have no SEO value then you can just stop them from being crawled, preventing any duplicate content penalties. If you see some backlinks (SEO value) pointing to any of these, then I would use a canonical tag instead.
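For reference, the canonical route means each attachment page emitting a link element in its head pointing at the page you want to rank. A minimal sketch of generating that tag, with the placeholder domain from the question:

```python
def canonical_tag(target):
    # The <link> element each ?attachment_id= page would carry in its
    # <head>, pointing the duplicates at the page you want to rank.
    return '<link rel="canonical" href="{}" />'.format(target)

# placeholder domain from the question
print(canonical_tag("http://www.domain.com/"))
```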
robots.txt
User-agent: *
Disallow: /*attachment_id
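One caveat: the * there is a Google-style wildcard, which Python's built-in robotparser doesn't understand, so if you want to sanity-check the rule before deploying it, a rough regex-based sketch of how Googlebot interprets such patterns might look like this (a sketch of the matching behaviour, not Google's actual implementation):

```python
import re

def google_style_blocked(path, pattern):
    """Rough sketch of Google-style robots.txt matching: '*' matches
    any run of characters, '$' anchors the end of the URL."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

print(google_style_blocked("/?attachment_id=4176", "/*attachment_id"))  # True
print(google_style_blocked("/gallery/", "/*attachment_id"))             # False
```

You can also confirm the real rule with the robots.txt tester in GWT before relying on it.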
Hope it helps,
Don
-
Hi Dan,
Is the site running WordPress? If so, it sounds like maybe a badly coded template which is showing links somewhere in the code to the attachments (if you click on the image on its normal page, does it take you to the duplicate URL you mention?). It would be best to find out where the linking is happening and correct it so the links are removed, if at all possible. The Yoast plugin also has a setting where you can redirect attachment IDs to their related post (it's in the permalinks settings of the Yoast plugin); that might help solve the problem.
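To picture what that Yoast setting should do: once it is enabled, each ?attachment_id= URL should answer with a 301 to its parent post instead of rendering the thin duplicate page. A rough sketch of the expected mapping (the ID and parent URL here are made up for illustration; in WordPress the real relationship lives in the post's parent field):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical attachment-id -> parent-post lookup; the ID and URL
# below are made up for illustration.
PARENT_POST = {
    4176: "http://www.domain.com/gallery-page/",
}

def attachment_redirect_target(url):
    """Where a ?attachment_id= URL should 301 once the Yoast
    attachment-redirect option is enabled."""
    ids = parse_qs(urlparse(url).query).get("attachment_id")
    if not ids:
        return None  # not an attachment URL, so no redirect expected
    return PARENT_POST.get(int(ids[0]))

print(attachment_redirect_target("http://www.domain.com/?attachment_id=4176"))
```

A quick way to verify after enabling it is to request one of the 350 URLs and check that the response is a 301 with a Location header pointing at the related post.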