Errors - 7300 - Duplicate Page Content..Help me..
-
Hi,
I just received a crawl report with 7,300 duplicate page content errors.
The site is built with PHP.
The list of errors looks like this:
http://xxxxx.com/channels/?page=1
http://xxxxxx.com/channels/?page=2
I am not good at coding and am using a ready-made script for this website. Could anyone guide me on fixing this issue?
Thanks.
-
Thanks, Dana.
I am watching the video now and will try to fix it.
-
You are welcome. I would add a canonical tag to the first page of any category or article page that results in multiple (paginated) pages. Watch the video; there are several different ways to go. One easy thing you could do is add a "View all" link to the first page. Add your canonical tag to the page in its "view all" state, and that should resolve it. There really are two or three ways to solve the problem; it just depends on your content and preferences. You will also want to make sure you direct Googlebot not to crawl or index your search results pages, which can be done in your robots.txt file.
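For example, here is a minimal robots.txt sketch of the kind of rule I mean, assuming your internal search results live under a /search/ path (that path is an assumption; adjust it to match your own URL structure):

```
# Keep crawlers out of internal search results pages
# (the /search/ path is an assumption - match it to your own site)
User-agent: *
Disallow: /search/
```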
Because it takes time for Google to crawl, index and recognize your new canonical tags, it might take a few weeks for the duplicate content errors to go away. The same will be true for Roger Mozbot.
This is all, of course, presuming that pagination is your problem. It could be that there's another issue, but this is definitely worth trying.
-
Thanks a lot, Dana. But in my report there is no warning about canonical issues, so will adding the code resolve it?
-
This appears to be a pagination issue. If so, the solution may be fairly simple. You have a few options. You might first want to make sure that your canonical tags are in order. How you do those will depend on whether or not you want the pages in a paginated series (like a category page with more than one page of products listed) included in Google's index. If you want them indexed, then each paginated page should have its own rel=canonical tag, specific to that page. If you really only want the first page included in the index, then you could include a canonical tag pointing to the first page at the top of each page in that particular paginated series. You may also need to include rel=next and rel=prev, depending on your content.
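For illustration, here is a minimal sketch of what those tags might look like on page 2 of the original poster's /channels/ series (the domain is a placeholder, and which canonical you use depends on which of the two strategies above you choose):

```html
<!-- Option 1: index every page in the series - each page canonicalizes to itself
     and declares its neighbors with rel=prev/next -->
<link rel="canonical" href="http://example.com/channels/?page=2">
<link rel="prev" href="http://example.com/channels/?page=1">
<link rel="next" href="http://example.com/channels/?page=3">

<!-- Option 2: only index the first (or "view all") page - every page in the
     series points its canonical there instead -->
<link rel="canonical" href="http://example.com/channels/">
```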
Here is an excellent video on pagination from Google that describes various options, depending on what type of content you have and how you would like it to be indexed:
http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
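Since the site in question runs on PHP, here is a hypothetical sketch of how a template could emit these tags dynamically. The variable names, page count, and URL structure are assumptions for illustration, not taken from the actual script:

```php
<?php
// Hypothetical sketch: emit pagination tags for a /channels/?page=N series.
// $page and $totalPages would come from your script's own pagination logic.
$base       = 'http://example.com/channels/';
$page       = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$totalPages = 40; // assumption: normally computed from your total row count

// Self-referencing canonical for the current page (Option 1 above)
$canonical = ($page > 1) ? $base . '?page=' . $page : $base;
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">' . "\n";

// Declare the neighboring pages in the series
if ($page > 1) {
    $prev = ($page - 1 > 1) ? $base . '?page=' . ($page - 1) : $base;
    echo '<link rel="prev" href="' . htmlspecialchars($prev) . '">' . "\n";
}
if ($page < $totalPages) {
    echo '<link rel="next" href="' . htmlspecialchars($base . '?page=' . ($page + 1)) . '">' . "\n";
}
```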
Related Questions
-
Need Help With WWW vs. Non-WWW Duplicate Pages
A friend I'm working with at RedChairMarket.com is having duplicate page issues. Among them, both www and non-www URLs are being generated automatically by his software framework, ASP.NET MVC 3. How should we go about finding and tackling these duplicates? Thanks!
-
Results Pages Duplication - What to do?
Hi all, I run a large, well-established hotel site which fills a specific niche. Last February we went through a redesign which implemented pagination and lots of PHP/SQL wizardry. This has left us, however, with a bit of a duplication problem which I'll try my best to explain! Imagine Hotel 1 has a pool, as well as a hot tub. This means that Hotel 1 will be in the search results of both 'Hotels with Pools' and 'Hotels with Hot Tubs', with exactly the same copy, affiliate link, and thumbnail picture in the search results. Now imagine this issue occurring hundreds of times across the site and you have our problem, especially since this is a Panda-hit site. We've tried to keep any duplicate content away from our landing pages with some success, but it's all those pesky PHP paginated pages which are doing us in (e.g. Hotels/Page-2/?classifications[]263=73491&classifcations[]742=24742 and so on). I'm thinking that we should either a) completely noindex all of the PHP search results or b) move over to a JavaScript platform. Which would you guys recommend? Or is there another solution which I'm overlooking? Any help most appreciated!
-
Duplicate Content?
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-feb2000, archive3-mar2000, etc.) But I see they are being marked as "similar content," even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it? Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
-
Duplicate page content - index.html
Roger is reporting duplicate page content for my domain name and www.mydomainname/index.html. Example:
www.just-insulation.com
www.just-insulation.com/index.html
What am I doing wrongly, please?
-
How do I deal with Duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like this: https://www.keshercommunications.com/Romaniavoipnumbers.html https://www.keshercommunications.com/Icelandvoipnumbers.html etc., one for every country. The question is, how do I create pages for each one without them showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seem to have done the same, but Google is listing all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
-
Duplicate Content
The crawl shows a lot of duplicate content on my site. Most of the URLs it's showing are categories and tags (WordPress). So what does this mean exactly? Are the categories too much like other categories? And how do I go about fixing this the best way? Thanks.
-
Duplicate Content Issue
Very strange issue I noticed today. In my SEOmoz campaigns I noticed thousands of warnings and errors! I noticed that any page on my website ending in .php can be duplicated by adding anything you want to the end of the URL, which seems to be causing these issues. Ex: Normal URL - www.example.com/testing.php Duplicate URL - www.example.com/testing.php/helloworld The duplicate URL displays the page without the images, but all the text and information is present, duplicating the normal page. I also found that many of my PDFs seem to be getting duplicated, buried in directory after directory, which I never put in place. Ex: www.example.com/catalog/pdfs/testing.pdf/pdfs/another.pdf/pdfs/more.pdfs/pdfs/ ... when the PDFs are only located in a single pdfs directory! I am very confused about how to fix this problem. Maybe with some sort of redirect?
-
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I was generating a sitemap for 1oxygen.com, a site that has around 50 pages. The sitemap generator came back with over 2,000 pages. Here are two of the results:
http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm
These are actually pages somehow. In my FTP, the first /portableconcentrators/ folder contains about 12 HTML documents and no other folders. It looks like it is creating a page for every possible folder combination. I have no idea why those pages above actually work. Help, please?