Duplicate content pages
-
Crawl Diagnostics Summary shows around 15,000 duplicate content errors for one of my projects. It lists each page along with how many duplicate pages there are for it, but I don't have a way to see the duplicate page URLs for a specific page without clicking each page link and checking them manually, which is going to take forever.
When I export the list as CSV, the duplicate_page_content column doesn't show any data.
Can anyone please advise on this?
Thanks
Hey there!
Thanks for writing in.
I downloaded the CSV from your Travel Pack campaign. It looks like all of the duplicate content pages are in the CSV that I exported. I found them by sorting the rows in Excel. Here is a good guide on how to get started sorting in Excel: http://office.microsoft.com/en-us/excel-help/sort-data-in-a-range-or-table-HP010073947.aspx
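If Excel sorting isn't enough, a short script can group the duplicate URLs by the page they duplicate. This is a rough sketch only: the column names `URL` and `duplicate_page_content` are assumptions based on the export described above, so match them to your actual CSV headers.

```python
import csv
from collections import defaultdict

def group_duplicates(csv_path, url_col="URL", dup_col="duplicate_page_content"):
    """Group duplicate-content URLs by the page they duplicate.

    Column names are assumptions; adjust to the headers in your export.
    """
    groups = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            dup = (row.get(dup_col) or "").strip()
            if dup:  # keep only rows flagged as duplicating another page
                groups[dup].append(row[url_col])
    return dict(groups)
```

Each key in the result is a page, and its value is the list of URLs the crawler flagged as duplicating it.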
Thanks!
Nick
-
Sorry if my English was not clear; it's not my first language. My issue is that I can't get the list of duplicate URLs for my site...
-
If the duplicates are attached to specific query strings (string: after the URL it looks like this: /?alwer.ei.we), you can block the string(s) in your robots.txt file.
Let's say there are 100 duplicates that start with "/?osifos.sdjvnksdj"; block out the "?osifos" in your robots.txt.
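A sketch of what that robots.txt rule could look like, reusing the example string from above (swap in your real session string; note that the `*` wildcard in `Disallow` is honored by major crawlers like Googlebot but is not part of the original robots.txt standard):

```
User-agent: *
Disallow: /*?osifos
```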
Related Questions
-
Unsolved Make Moz reindex the pages for keywords
Hi, Moz says we're 18th for a phrase, but that's for the page /himonsia-generators, because that was the highest-ranking page. If you search now for "heavy range generators" we're jumping between positions 10 and 12, but for the page /products/generators, which is correct. How can I make Moz "reindex" the pages?
Moz Pro | | RichardAskew0 -
How can I deal with tag page duplicate issues
The Moz crawler reported some duplicated issues. Many of them have to do with tags. Each tag has a link, and as some articles are under several tags, these come up as duplicate content. I read Dr. Peter's piece on canonical stuff, but it's not clear to me if any of these are the solution. Perhaps the solution lies somewhere else? Maybe I need to block the robots from these URLs (but that seems counter-SEO-productive). Thanks
Moz Pro | | IamKovacs0 -
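For tag-page duplicates like these, one common approach is a rel=canonical tag on each duplicate pointing at the preferred URL. A hedged sketch only; the URLs here are placeholders, not the asker's site:

```html
<!-- In the <head> of each tag/duplicate page; href is a placeholder URL -->
<link rel="canonical" href="https://www.example.com/blog/original-article/">
```

This tells crawlers which version to treat as authoritative without blocking them in robots.txt, so link equity still flows through the tag pages.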
On Page Reports for Long-Tail Keywords?
First Q&A here, so take it easy on me. Hopefully this is not a dumb question. 99% of the on-page reports I look at are for local keywords. I'm finding that some of the page reports get an F where they should be receiving an A. I noticed that if I manage a long-tail keyword like "books in houston tx", I get a grade based on the exact phrase as opposed to a combination of the keywords. So when I have text in the body saying "books in the Houston area" or "Houston books", for example, instead of using my exact managed keyword, it will tell me that the keyword is not mentioned in the body anywhere. When it is, it's just not in the exact order... I'm trying to write my pages for users, and I don't think the users want to hear "books in houston tx" several times. If I do a report on the same page for the keyword "books" I get an A. So this is where it gets a little grey for me. I need a solid on-page report for local long-tail search terms. Anyone have any advice? Is there a way I could be using this tool to better suit my needs?
Moz Pro | | AmericomMarketing0 -
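The grader appears to require the exact phrase; a looser check you could run yourself is whether all meaningful terms of the keyword appear somewhere in the body, in any order. A small sketch (the helper name and stopword list are my own, not part of any Moz tool):

```python
import re

# Illustrative stopword list; extend as needed.
STOPWORDS = frozenset({"in", "the", "a", "an", "of", "for"})

def contains_all_terms(body_text, keyword):
    """Loose on-page check: do all non-stopword terms of the keyword
    appear somewhere in the body text, in any order?"""
    words = set(re.findall(r"[a-z0-9]+", body_text.lower()))
    terms = [t for t in re.findall(r"[a-z0-9]+", keyword.lower())
             if t not in STOPWORDS]
    return all(t in words for t in terms)
```

So "Houston TX books for sale" would pass for "books in houston tx" even though the exact phrase never appears.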
Domain / Page Authority - logarithmic
SEOmoz says their Domain / Page Authority is logarithmic, meaning that lower rankings are easier to get, higher rankings harder to get. Makes sense. But does anyone know what logarithmic equation they use? I'm using the domain and page authority as one metric in amongst other metrics in my keyword analysis. I can't have some metrics linear, others exponential and the SEOmoz one logarithmic.
Moz Pro | | eatyourveggies0 -
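Moz has never published the actual formula behind Domain/Page Authority, so any equation is a guess. That said, the standard way to make a logarithmic 0-100 score comparable with linear metrics is to exponentiate it; the base and scale below are pure assumptions for illustration:

```python
def delog(score, base=10, scale=10):
    """Map a logarithmic 0-100 score to a rough linear magnitude.

    base and scale are assumptions for illustration only; Moz has not
    published the real equation behind Domain/Page Authority.
    """
    return base ** (score / scale)
```

With these made-up parameters, moving from DA 20 to DA 30 would represent a 10x change in the underlying linear quantity, which matches the intuition that higher scores are much harder to gain.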
Sorting On-page Optimization Reports
I know there are several pre-set ways to sort the on-page optimization reports, but how can I custom sort them into my own order?
Moz Pro | | WillWatrous0 -
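If the app's preset sorts don't cover it, one workaround is to export the report as CSV and sort it yourself by any combination of columns. A minimal sketch; the column names passed in (e.g. "Grade", "URL") are assumptions, so use whatever headers your export actually has:

```python
import csv

def sort_report(csv_path, key_cols):
    """Sort exported report rows by an arbitrary ordered list of columns."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    # Missing columns sort as empty strings rather than raising.
    return sorted(rows, key=lambda r: tuple(r.get(c, "") for c in key_cols))
```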
Issue: Duplicate page title
Hello, I have run the "Crawl Diagnostics" report using SEOmoz Pro and it says that I have a total of 56 errors: 18 of those errors are duplicate content and another 38 are duplicate title tags. I have looked at both reports in detail, and the reason I am getting these errors is that the crawler is checking both "http" and "https". So for example: my website is http://www.widgets.com. The crawl diagnostics report also checks https://www.widgets.com, so it looks like I have duplicate content and duplicate title tags because of this. Now my question is this: Is this really duplicate content? If so, how do I fix this? Any help is greatly appreciated.
Moz Pro | | threebiz0 -
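You can confirm that the duplicates are purely http/https pairs by grouping the flagged URLs on everything except the scheme. A sketch of that check (the example URLs reuse the placeholder domain from the question):

```python
from urllib.parse import urlsplit
from collections import defaultdict

def find_scheme_duplicates(urls):
    """Return (netloc, path, query) tuples reachable over BOTH http and https,
    i.e. pages that differ only by scheme."""
    by_rest = defaultdict(set)
    for u in urls:
        parts = urlsplit(u)
        if parts.scheme in ("http", "https"):
            by_rest[(parts.netloc, parts.path, parts.query)].add(parts.scheme)
    return [rest for rest, schemes in by_rest.items()
            if schemes == {"http", "https"}]
```

If that accounts for all the errors, the usual fixes are a site-wide 301 redirect to one scheme or a rel=canonical tag pointing at the preferred version; which is appropriate depends on whether the https version needs to stay reachable.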
How to check Page Authority in bulk?
Hey guys, I'm on the free trial for SEOmoz PRO and I'm in love. One question, though. I've been looking all over the internet for a way to check Page Authority in bulk. Is there a way to do this? Would I need the SEOmoz API? And what is the charge? All I really need is a way to check Page Authority in bulk--no extra bells and whistles. Thanks, Brandon
Moz Pro | | thegreatpursuit0 -
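For bulk checks outside the app, the Moz Links API's URL-metrics endpoint accepts batches of target URLs; the exact endpoint, authentication, batch limit, and pricing are in Moz's API documentation, so I won't guess them here. A minimal sketch of the batching half, assuming a batch limit of 10:

```python
def chunk_targets(urls, batch_size=10):
    """Split a URL list into batches for bulk metrics requests.

    batch_size=10 is an assumption for illustration; check the Moz API
    docs for the real per-request limit on your plan.
    """
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]
```

Each batch would then be sent as one authenticated request, keeping you well under per-request limits even for long URL lists.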
Crawl Diagnostics bringing 20k+ errors as duplicate content due to session ids
Signed up for the trial version of SEOmoz today just to check it out, as I have decided I'm going to do my own SEO rather than outsource it (been let down a few times!). So far I like the look of things and have a feeling I am going to learn a lot and get results. However, I have just stumbled on something. After SEOmoz does its crawl diagnostics run on the site (www.deviltronics.com) it is showing 20,000+ errors. From what I can see, almost 99% of these are duplicate content errors caused by session IDs, so I am not sure what to do! I have done a "site:www.deviltronics.com" on Google and this certainly doesn't pick up the session IDs/duplicate content. So could this just be an issue with the SEOmoz bot? If so, how can I get SEOmoz to ignore these on the crawl? Can I get my developer to add some code somewhere? Help will be much appreciated. Asif
Moz Pro | | blagger0
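One way to see how many of those 20k errors collapse away is to normalize the flagged URLs by stripping the session-ID query parameters before comparing them. A sketch; the parameter names in the set are common examples (e.g. osCommerce's osCsid), not taken from the actual crawl, so match them to what the report shows:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative session-parameter names; adjust to the ones in your crawl report.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid", "oscsid"}

def strip_session_ids(url):
    """Return the URL with known session-ID query parameters removed,
    so duplicate crawl errors can be collapsed to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

On the site itself, the longer-term fixes are usually a rel=canonical tag on each page pointing at the session-free URL, or having the developer stop appending session IDs for crawlers; the script above only helps you triage the report.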