Forum post with multiple pages gives duplicate meta descriptions.
-
My website has a forum that uses the titles of posts as meta descriptions. The problem is that when a post becomes long and is split across pages, Google tells me that I have duplicate meta description issues, because the 2nd and 3rd pages use the same meta description. What is the best course of action here?
-
I don't think canonicalization would be inappropriate here. Each page is different, but every page of a forum topic supports the OP's main post on the first page. You can canonicalize the subsequent pages and pass their authority to the main page so it ranks highest.
If someone uses a specific search term that another person used on page 4 of the topic, that page will still show in Google's SERPs and direct the user to the deeper page, rather than the main page. So, we're not applying a canonical tag because of duplicate content issues, but to support the forum topic's parent theme.
I suggest adding rel="next"/rel="prev" logic to your site's theme and letting Google rank the subsequent pages accordingly. Once proper canonical and rel="next"/rel="prev" tags are implemented, I wouldn't worry about duplicate meta information.
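As a rough sketch of that setup (the URLs below are hypothetical, not ones from the question), page 2 of a paginated topic would carry something like this in its <head>:

```html
<!-- Hypothetical <head> markup for page 2 of a paginated forum topic -->
<!-- The canonical consolidates authority on the topic's first page -->
<link rel="canonical" href="https://www.example.com/forum/topic-slug/" />
<!-- rel="prev" / rel="next" describe where this page sits in the series -->
<link rel="prev" href="https://www.example.com/forum/topic-slug/" />
<link rel="next" href="https://www.example.com/forum/topic-slug/page-3/" />
```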
-
In my experience, the only way to resolve that is to write unique descriptions or have Google skip those pages.
There is also pagination markup, but I've seen cases where, even after implementing the rel="next" and rel="prev" code, it still served duplicate description errors.
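If you go the unique-description route, a minimal sketch (the topic title and page count here are made up) is to fold the page number into each page's description so no two pages share one:

```html
<!-- Hypothetical meta descriptions for a paginated forum topic -->
<!-- Page 1 -->
<meta name="description" content="Best summer tyres? - Forum discussion" />
<!-- Page 2 -->
<meta name="description" content="Best summer tyres? - Forum discussion, page 2 of 5" />
```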
Hope you find your answer!
-
Noindexing the rest of the pages means fewer results in the SERPs.
Canonicalising the other pages to page 1 is kinda wrong, since it's not duplicate content, and passing all the link juice to just page 1 is not such a good idea.
-
Hi Angelos,
There are a few ways you can resolve this issue:
- noindex page 2, page 3 and so on by applying a rule to all subsequent pages
- canonicalise all page 2s and page 3s to the original post URLs
- update your robots.txt file to block page 2 and page 3 by using Disallow: /*/page2 and Disallow: /*/page3
Any of the above will resolve your issue; it just comes down to which is the easiest for you to implement (the first two options are sketched below).
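To illustrate, here is a rough sketch of what the first two options could look like in a paginated page's <head> (the URL is hypothetical, and you would normally pick one of the two rather than combine them):

```html
<!-- Option 1 (hypothetical): keep page 2 and beyond out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2 (hypothetical): point page 2 and beyond back to the original post URL -->
<link rel="canonical" href="https://www.example.com/forum/original-post/" />
```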
Feel free to tweet me at @StelinSEO if you've got any further questions.
Stel
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this code to the pages: <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer: "So as far as I can see we've added robots to prevent the issue, but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how it's seeing this content, or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
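As a side note, robots meta directives are conventionally comma-separated, and the standard directive is follow rather than dofollow, so a cleaned-up version of that tag would look something like this:

```html
<!-- Conventional form: comma-separated values, "follow" in place of "dofollow" -->
<meta name="robots" content="max-image-preview:large, noindex, follow" />
```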
Technical SEO | rj_dale0
-
Page Authority on Huffington Post
I was looking into my website's backlinks and noticed that one Huffington Post post has a page authority of only 1, while other posts have high page authority, like 30 or 40. http://www.huffingtonpost.com/toby-nwazor/is-it-time-to-retire-the-_b_10610052.html Please suggest what the issue could be. Thanks!
Technical SEO | 1MS0
-
Duplicate content due to numerous sub category level pages
We have a healthcare website which lists doctors based on their medical speciality, using a paginated series to list hundreds of doctors. Algorithm: a search for a dentist in the Newark locality of New York gives a result filled with dentists from Newark, followed by a list of dentists in locations near Newark. So all localities under a city share the same set of doctors, jumbled and distributed across multiple pages based on nearness to the locality. When we don't have any dentists in Newark, we populate results for nearby localities and create a page. The issue: when the number of dentists in New York is <11, all locality x dentist pages will have jumbled-up results, all pointing to the same 10 doctors. The issue is even more severe when we have only 1-3 dentists in the city: every locality page is exactly the same as the city-level page. We have about 2.5 million pages with the above scenario. **City-level page** - https://www.example.com/new-york/dentist - 5 dentists. **Locality-level page** - https://www.example.com/new-york/dentist/clifton, https://www.example.com/new-york/dentist/newark - contains the same 5 dentists as the New York city-level page, in jumbled-up or the same order. What do you think we must do in such a case? We have discussed putting a noindex on locality-level pages or applying a canonical pointing from locality level to city level, but we are still not 100% sure.
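For what it's worth, a minimal sketch of the canonical option mentioned above (using the URLs from the question) on a locality-level page would be:

```html
<!-- Hypothetical markup on https://www.example.com/new-york/dentist/newark,
     pointing to the city-level page as the canonical version -->
<link rel="canonical" href="https://www.example.com/new-york/dentist" />
```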
Technical SEO | ozil0
-
When is Duplicate Content Duplicate Content
Hi, I was wondering exactly when duplicate content counts as duplicate content. Is it only when it is word-for-word, or also when it is similar? For example, we currently have an information page and I would like to add an FAQ to the website. There is, however, a crossover with the content and some of it is repeated. However, it is not written word for word. Could you please advise me? Thanks a lot, Tom
Technical SEO | National-Homebuyers0
-
Duplicate page/Title content - Where?
Hi, I have just run a crawl on a new client's site, and there are several 'duplicate page content' and 'duplicate page title' issues. But I cannot find any duplicate content, and to make matters worse, the actual report has confused me. For example, the about us page is showing in both reports, and for both it shows 1 under 'Other URLs'. Why? Does this mean there is 1 other page with a duplicate page title, or duplicate page content? Where are the pages that have the duplicate page titles or duplicate page content? I have run scans using other software and a Copyscape scan, and apart from missing page titles, I cannot find any page that has duplicate titles or content. I can find percentages of pages with similar/same page titles/content, but this is only partial and contextually correct. So I understand that SEOmoz may pick up a percentage of duplicated content, which is fine, and therefore note that there is duplicate content/page titles. But I cannot seem to figure out where I would find the source of the duplicate content/page titles, as there is only 1 listed in both reports under 'Other URLs'. Hopefully my long question has not confused you. Many thanks in advance for any help.
Technical SEO | wood1e20
-
Duplicate Page Titles and %3E, how can I avoid this?
In my crawl report I keep seeing duplicate page title warnings with URLs being referenced twice, e.g. /company/ceo-message/ and /company/ceo-message/%3E. I'm using canonical link tags, but after the new crawl report I'm still seeing this duplicate page title crawl error. How can I avoid this? I've been looking for answers for a few days but don't seem to see this exact problem discussed. Any insight is appreciated!
Technical SEO | mxmo0
-
How can i resolve Duplicate Page Content?
Hello, I have created a campaign in the SEOmoz tools for my website AutoDreams.it and found 159 duplicate page content issues. My problem is that this website is about car ads, so it is easy to create pages with duplicate content, and car ads are also placed by registered users. How can I resolve this problem? Regards, Francesco
Technical SEO | francesco870
-
Duplicate Content Home Page
Hello, I am getting a duplicate content warning from SEOmoz for my home page: http://www.teacherprose.com and http://www.teacherprose.com/index.html. I tried the code below in .htaccess: redirect 301 /index.html http://www.teacherprose.com This caused a "too many redirects" error in the browser. Any thoughts? Thank you, Eric
Technical SEO | monthelie10