Duplicate content being reported
-
Hi
I have a new client whose first Moz crawl report is showing lots of duplicate content.
The main batch of these are all the homepage URL with an 'attachment' parameter at the end, such as:
www.domain.com/?attachment_id=4176
As far as I can tell it's some sort of slideshow, just showing a different image in the main frame of each page, with no other content. Each one does have a unique meta title and H1, though.
What's the best thing to do here?
-
Not a problem - leave as is
-
Use the parameter handling tool in GWT
-
Canonicalise, referencing the homepage
Or another solution?
Many thanks
Dan
-
-
Hi Dan,
Actually, it looks like Ctrl+L will do it (you are creating an Excel table). You usually need to delete the first few rows from the export so that the column headers are in row 1, then select all and create the table, ticking 'My table has headers' so that you can filter using the headers.
-
Sorry Lynn, but what is the 'Windows' bit in Ctrl+Windows+L? I can't see it on my keyboard - could it have a different icon/symbol?
-
Great stuff, thanks Lynn! I'll tell their dev to do that.
Many, many thanks
All Best
Dan
-
Cool, cheers Don
-
Hi Dan,
The robots must be getting the URLs from somewhere, so it is worth finding out where. If you download the Moz report as a CSV and open it in Excel, you can press Ctrl+Windows+L to get a filterable list. If you filter for duplicates and find these URLs on the left, then on the far right it should reference where they are being linked from. I suspect you will find pages on the site that include these images and link to the attachment_id URLs (often it is from gallery pages).
Once you have found the pages, try applying the Yoast redirects and see if they work as expected (i.e. redirect the attachment_id links to the relevant gallery page, for example). Ideally you would remove the links from the code completely - this will probably need a bit of dev work on the template, but should be pretty straightforward since you are likely just removing the A tag from around the images.
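If filtering in Excel gets unwieldy, the same lookup can be scripted. This is a minimal sketch, assuming the export has columns named "URL" and "Referrer" - the actual column headers vary between export versions, so adjust them to match your file:

```python
import csv

def find_attachment_sources(csv_path, url_col="URL", referrer_col="Referrer"):
    """Collect attachment_id URLs from a crawl export and the pages linking to them.

    url_col and referrer_col are assumed names - check them against the
    header row of your own CSV before running.
    """
    sources = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(url_col, "")
            if "attachment_id" in url:
                # Record every page that links to this duplicate URL
                sources.setdefault(url, set()).add(row.get(referrer_col, ""))
    return sources
```

The result maps each duplicate URL to the set of pages linking to it, which points you straight at the templates (often gallery pages) that need fixing.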
-
Gotcha - definitely don't want to nix the pages then. I would imagine Lynn's response is more appropriate; it is likely he is using a plugin that has since been updated to better SEO practices, and he hasn't yet updated it.
-
Many thanks Don
I'll ask the client, but I don't think so (I doubt any links point to them). Given the varying keyword-rich meta titles and H1s, I think the client may have implemented this for some SEO reason (he's very SEO savvy, but a bit old school) - probably not aware that a page needs more content beyond a picture, some meta tags and an H1.
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
All Best
Dan
-
Thanks Lynn
Yes, it is WordPress I think.
If I click on the image, it loads a page with the next image in the series (another duplicate).
I'm not sure what the normal page is, since I can only find these via the crawl reports - they don't seem to be linked from any site nav etc.
Does that sound to you like the best solution is via the Yoast redirects, then?
On a side note, do you think these could be dragging the site's rankings down (there are 350 of them)?
Cheers
Dan
-
Hi Dan,
If these pages have no SEO value then you can just stop them from being crawled, preventing any duplicate content issues. If you see some backlinks (SEO value) pointing to any of them, then I would use a canonical tag instead.
robots.txt:
User-agent: *
Disallow: /*attachment_id
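One caution with the rule above: the `*` wildcard is an extension supported by Google rather than part of the original robots.txt standard, and Python's built-in urllib.robotparser does plain prefix matching, so it won't verify it for you. The sketch below approximates Google-style wildcard matching so you can sanity-check the pattern against your URLs; it is a simplification, not a full robots.txt parser:

```python
import re

def robots_rule_matches(pattern, url_path):
    """Approximate Google-style robots.txt matching: '*' matches any
    character sequence and '$' anchors the end of the URL.

    A rough sketch for checking Disallow patterns - not a complete
    implementation of the robots.txt matching rules.
    """
    regex = "^" + re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, url_path) is not None
```

Under these semantics, `/*attachment_id` matches `/?attachment_id=4176` (the `*` absorbs the `?`) but leaves normal pages like `/blog/post-1` crawlable.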
Hope it helps,
Don
-
Hi Dan,
Is the site running WordPress? If so, it sounds like a badly coded template that is outputting links to the attachments somewhere in the code (if you click on the image on its normal page, does it take you to the duplicate URL you mention?). It would be best to find out where the linking is happening and correct it so the links are removed, if at all possible. The Yoast plugin also has a setting to redirect attachment IDs to their related post (it's in the permalink settings of the Yoast plugin) - that might help solve the problem.
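To confirm which templates are emitting the links, you can scan the rendered HTML of a suspect page (a gallery page, say) for anchors pointing at attachment URLs. A minimal sketch using only the standard library - the sample markup is hypothetical, so feed it your own page source:

```python
from html.parser import HTMLParser

class AttachmentLinkFinder(HTMLParser):
    """Collect href values that point at WordPress attachment pages."""

    def __init__(self):
        super().__init__()
        self.attachment_links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; value may be None
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and "attachment_id" in value:
                    self.attachment_links.append(value)

finder = AttachmentLinkFinder()
# Hypothetical gallery markup - substitute the HTML of a real page
finder.feed('<div class="gallery"><a href="/?attachment_id=4176">'
            '<img src="/img/slide1.jpg"></a></div>')
```

If `finder.attachment_links` comes back non-empty for a page, that page's template is the one wrapping its images in A tags, and removing that wrapper (or applying the Yoast redirect) is the fix.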