Duplicate content being reported
-
Hi
I have a new client whose first Moz crawl report is showing lots of duplicate content.
The main batch of these are all the homepage URL with an 'attachment' parameter at the end, such as:
www.domain.com/?attachment_id=4176
As far as I can tell it's some sort of slideshow, just showing a different image in the main frame of each page, with no other content. Each one does have a unique meta title and H1, though.
What's the best thing to do here?
-
Not a problem and leave as is
-
Use the parameter handling tool in GWT
-
Canonicalise, referencing the homepage
Or another solution?
Many Thanks
Dan
-
-
Hi Dan,
Actually it looks like Ctrl+L will do it (you are creating an Excel table). You usually need to delete the first few rows from the export so that the column headers are in row 1, then select all and create the table, checking 'My table has headers', so that you can then filter using the headers.
-
Sorry Lynn, but what is the 'windows' bit in control-windows-L? I can't see it on my keyboard; could it have a different icon/symbol?
-
Great stuff, thanks Lynn!! I'll tell their dev to do that.
Many, many thanks
All Best
Dan
-
Cool, cheers Don
-
Hi Dan,
The robots must be getting the URLs from somewhere, so it is worth finding out where. If you download the Moz report as a CSV and open it in Excel, you can use control-windows-L to get a filterable list. If you filter for duplicates and find these URLs on the left, then on the far right it should reference where they are being linked from. I suspect you will find pages on the site that include these images and link to the attachment_id URLs (often it is from gallery pages).
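If you would rather script the filtering than do it in Excel, here is a minimal Python sketch of the same idea. Note that the column names ('URL', 'Issue Type', 'Referrer') are guesses for illustration; check them against the actual headers in your Moz CSV export.

```python
import csv

# Hypothetical column names -- adjust to match the real Moz export headers.
URL_COL = "URL"
ISSUE_COL = "Issue Type"
REFERRER_COL = "Referrer"

def attachment_duplicates(csv_path):
    """Return (url, referrer) pairs for duplicate-content rows on attachment_id URLs."""
    results = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(URL_COL, "")
            issue = row.get(ISSUE_COL, "")
            if "attachment_id" in url and "Duplicate" in issue:
                results.append((url, row.get(REFERRER_COL, "")))
    return results
```

The referrer column is the interesting part: it tells you which pages on the site are actually linking to the attachment URLs.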
Once you have found the pages, try applying the Yoast redirects and see if they work as expected (i.e. redirect the attachment_id links to the relevant gallery page, for example). Ideally you would remove the links from the code completely. This will probably need a bit of dev work on the template, but it should be pretty straightforward, since you are likely just removing the A tag from around the images.
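To illustrate what that dev work amounts to, here is a rough Python sketch that unwraps images from links pointing at attachment_id URLs. This is only to show the transformation; the real fix belongs in the WordPress template, and a proper HTML parser would be more robust than a regex.

```python
import re

# Match an <a> tag whose attributes contain ?attachment_id=NNN,
# capture whatever it wraps (usually an <img>), and drop the <a>.
ATTACHMENT_LINK = re.compile(
    r'<a\b[^>]*\?attachment_id=\d+[^>]*>(.*?)</a>',
    re.IGNORECASE | re.DOTALL,
)

def unwrap_attachment_links(html):
    """Remove <a> wrappers that point at ?attachment_id= URLs, keeping the inner content."""
    return ATTACHMENT_LINK.sub(r'\1', html)

print(unwrap_attachment_links('<p><a href="/?attachment_id=4176"><img src="slide1.jpg"></a></p>'))
# -> <p><img src="slide1.jpg"></p>
```

The end result is the same image markup, just without the link that generates the duplicate attachment page.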
-
Gotcha, definitely don't want to nix the pages then. Lynn's response is probably more appropriate: it is likely he is using a plugin that has since been updated to better SEO practices, but that he hasn't yet updated.
-
Many thanks Don
I'll ask the client, but I don't think so (I doubt there are any links pointing to them). Given the varying keyword-rich meta titles and H1s, though, I think the client may have implemented this for some SEO reason (he's very SEO savvy, but a bit old school) and is probably not aware that a page needs more content than a picture, some meta data and an H1.
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
All Best
Dan
-
Thanks Lynn
Yes, it is WP I think.
If I click on the image, it loads the page with the next image in the series (another duplicate).
I'm not sure what the normal page is, since I can only find these via the crawl reports; they don't seem to be linked to in any site nav etc.
Does that sound to you then like the best solution is via the Yoast redirects etc.?
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
Cheers
Dan
-
Hi Dan,
If these pages have no SEO value then you can simply stop them from being crawled, thus avoiding any duplicate content issues. If you do see backlinks (SEO value) pointing to any of them, then I would use a canonical tag instead.
robots.txt
User-agent: *
Disallow: /*attachment_id
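It is worth sanity-checking which URLs that pattern actually blocks before deploying it. Here is a simplified Python sketch of Google-style wildcard matching for a single Disallow rule ('*' matches any run of characters; real crawlers differ in the details, and the '$' end-anchor is not handled here):

```python
import re

def is_disallowed(url_path, disallow_pattern):
    """Google-style prefix match for one robots.txt Disallow pattern.
    '*' matches any character sequence; matching is anchored at the
    start of the path (including the query string)."""
    regex = "^" + ".*".join(re.escape(part) for part in disallow_pattern.split("*"))
    return re.match(regex, url_path) is not None

print(is_disallowed("/?attachment_id=4176", "/*attachment_id"))  # True
print(is_disallowed("/gallery/", "/*attachment_id"))             # False
```

So the rule catches the attachment URLs while leaving normal pages crawlable.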
Hope it helps,
Don
-
Hi Dan,
Is the site running WordPress? If so, it sounds like a badly coded template which is outputting links to the attachments somewhere in the code (if you click on the image on its normal page, does it take you to the duplicate URL you mention?). It would be best to find out where the linking is happening and correct it so the links are removed, if at all possible. The Yoast plugin also has a setting to redirect attachment IDs to their related post (it's in the permalink settings of the Yoast plugin); that might help solve the problem.