Need for a modified meta description on every page for paginated content?
-
I'm currently working on a site where the URL structure is something like www.domain.com/catagory?page=4, with ~15 results per page.
The pages all canonical to www.domain.com/catagory, with rel next and rel prev pointing to www.domain.com/catagory?page=5 and www.domain.com/catagory?page=3 respectively.
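To illustrate, the head of page 4 currently looks roughly like this (a simplified sketch; the real category name, title and description are obviously different):
<title>Category name - Page 4 | Domain</title>
<meta name="description" content="Browse our category results and find what you need." />
<link rel="canonical" href="http://www.domain.com/catagory" />
<link rel="prev" href="http://www.domain.com/catagory?page=3" />
<link rel="next" href="http://www.domain.com/catagory?page=5" />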
Webmaster Tools flags these all as duplicate meta descriptions, so I wondered if there is value in appending the page number to the end of the description (as we have done with the title for the same reason), or if I am using a sub-optimal URL structure.
Any advice?
-
We don't have a view-all page (we found it so slow, so long, and with so many links that we saw a notable improvement in rankings in general when switching to the quicker paginated versions). And other than the first page, none of the other pages are currently in our sitemap.
I'm not entirely sure how that would stop GWT flagging them as duplicate metas, though, unless you mean to also no-index them.
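For clarity, by no-indexing I mean adding something like this to the paginated pages (page 2 onwards), which isn't something we do at the moment:
<meta name="robots" content="noindex, follow" />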
-
Do you have "View All" as an option for your paginated pages? If not, you might consider it, and then just include the "View all" version of the page in your site map. Just a thought...
-
That scale of unique descriptions is well beyond our capacity. We're actually considering dropping the number of items per page too.
Thanks for the help.
-
Could the ignore setting cause any problems (such as with pages that should or shouldn't be indexed)? I was rather surprised to discover that using canonical wasn't enough.
-
I believe appending the page number, for example "(Page 3 of 5)", to the end of the meta description would suffice from the perspective of SEOmoz's crawler or GWT; however, the best option would be the ability to create completely unique meta descriptions.
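Something along these lines, with a placeholder description just for illustration:
<meta name="description" content="Your existing category description goes here. (Page 3 of 5)" />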
-
It sounds like you have canonical and rel next/prev set up correctly, so you shouldn't worry about duplicate meta descriptions. You could add ?page= as a query string parameter to "ignore" in WMT; it will then ignore those pages and you won't get errors about duplicate metas on them.
Hope this helps,
-
Related Questions
-
Duplicated titles and meta descriptions
Hi, Dealing with both my duplicated titles and meta descriptions, I'm wondering if there's a "quick" win I could potentially implement ASAP. A bit of background:
Technical SEO | GhillC
Say I've got 4 pages structured this way: domain.com/us/productA.html for the US, domain.com/gb/productA.html for the UK, domain.com/fr/productA.html for France, and domain.com/de/productA.html for Germany. At the moment, both my page titles and meta descriptions are duplicated all over the place for Product A.
The title reads "Product A - company name".
The MD is a bit better, being translated into all 3 languages (EN, FR, DE), and therefore being the same for the US and the UK. Ideally, I would have unique page titles and MDs everywhere. However, due to time and resource constraints, I can't make it happen overnight. So my questions are pretty simple:
1. Can I create a rule for page titles to be "Product A - country - company name" or similar? Would that be enough to make the page titles unique? Is there any value in doing so?
2. Can I "localize" duplicate MD by simply naming the country? I assume it is not enough in this case as all the rest would be copy/pasted. Ideally speaking, both my page titles and MD would be completely unique but I can't afford doing so in the short term. Thanks!0 -
Duplicate content: using the robots meta tag in conjunction with the canonical tag?
We have a WordPress instance on an Apache subdomain (let's say it's blog.website.com) alongside our main website, which is built in Angular. The tech team is using Akamai to do URL rewrites so that the blog posts appear under the main domain (website.com/more-keywords/here). However, due to the way they configured the WordPress install, they can't do a wildcard redirect under htaccess to force all the subdomain URLs to appear as subdirectories, so as you might have guessed, we're dealing with duplicate content issues. They could in theory do manual 301s for each blog post, but that's laborious and a real hassle given our IT structure (we're a financial services firm, so lots of bureaucracy and regulation). In addition, due to internal limitations (they seem mostly political in nature), a robots.txt file is out of the question. I'm thinking the next best alternative is the combined use of the robots meta tag (noindex, follow) alongside the canonical tag to try to point the bot to the subdirectory URLs. I don't think this would be an unethical use of either feature, but I'm trying to figure out whether the two would conflict in some way. Or maybe there's a better approach with which we're unfamiliar or that we haven't considered?
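Concretely, the head of each post on the subdomain would then carry something like this, combining the two tags (a sketch using the example URLs above; https assumed):
<link rel="canonical" href="https://website.com/more-keywords/here" />
<meta name="robots" content="noindex, follow" />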
Technical SEO | prasadpathapati
-
Brand name as H1 on every page
Hi, Along with the title of each page, a WordPress client has their brand name as an H1 on every single page. This is situated in the footer and just sits within the company info/address. Should these tags be removed, leaving just the page titles as H1s? Cheers, Lewis
Technical SEO | PeaSoupDigital
-
Is there any code to prevent duplicate meta descriptions on blog pages?
Is there any code to prevent duplicate meta descriptions on blog pages? I use rel canonical on the blog pages, and to prevent duplicate titles I use the code %%page%% in the category page title. Is there any similar code for the description?
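To illustrate what I'm after: with %%page%% in the title template, page 2 of a category gets a unique title, and I'd like the description to end up unique per page in the same way (made-up wording, sketch only; the exact variable output depends on the SEO plugin):
<!-- category page 2 -->
<title>Category name - Page 2 of 7 - Blog name</title>
<meta name="description" content="Posts about the category topic. Page 2 of 7." />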
Technical SEO | maestrosonrisas
-
Should I implement pagination (rel=next, rel=prev) if I have duplicate meta tags?
Hi, I just want to ask if it is necessary to implement pagination (rel=next, rel=prev) on my category pages, because Google Webmaster Tools is telling me that these pages have similar meta titles and meta descriptions. For example:
page 1: http://www.site.com/iphone-resellers/1 (meta title: Search for iphone resellers in US)
page 2: http://www.site.com/iphone-resellers/2 (meta title: Search for iphone resellers in US)
page 3: http://www.site.com/iphone-resellers/3 (meta title: Search for iphone resellers in US)
Thanks in advance. 🙂
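For example, if I did implement it, page 2 would carry something like this in its head (a sketch; I'd probably also vary the meta title per page):
<title>Search for iphone resellers in US - Page 2</title>
<link rel="prev" href="http://www.site.com/iphone-resellers/1" />
<link rel="next" href="http://www.site.com/iphone-resellers/3" />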
Technical SEO | esiow2013
-
Are aggregate sites penalised for duplicate page content?
Hi all, We're running a used car search engine (http://autouncle.dk/en/) in Denmark, Sweden and soon Germany. The site works in a conventional search engine way, with a search form and pages of search results (car adverts). The nature of car searching entails that the same advert exists on a large number of different URLs (because of the many different search criteria and pagination). From my understanding this is problematic, because Google will penalize the site for having duplicated content. Since the order of search results is mixed, I assume SEOmoz cannot always identify almost identical pages, so the problem is perhaps bigger than what SEOmoz can tell us. In your opinion, what is the best strategy to solve this? We currently use a very simple canonical solution. For the record, besides collecting car adverts, AutoUncle provides a lot of value to our large user base (including valuations on all cars). We're not just another leech AdWords site. In fact, we don't have a single banner. Thanks in advance!
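By "a very simple canonical solution" I mean roughly this pattern: the filtered and paginated result URLs point back to the base listing page (a rough illustration only; the path and parameter names here are made up):
<!-- e.g. on a result URL like http://autouncle.dk/en/cars?make=bmw&page=3 (made-up path and parameters) -->
<link rel="canonical" href="http://autouncle.dk/en/cars" />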
Technical SEO | JonasNielsen
-
Does page speed affect what pages are in the index?
We have around 1.3m total pages, Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages they will crawl the pages more and thus will index more of them. I personally don't believe this. At 87k pages a day, Google would have crawled our entire site within 2 weeks, so they should have all of our pages in their DB by now; I think the pages are not indexed because they are poorly generated pages, and it has nothing to do with the speed of the pages. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
Technical SEO | upper2bits
-
High number of Duplicate Page titles and Content related to index.php
It appears that every page on our site (www.bridgewinners.com) also creates a version of itself with a suffix. This results in SEOmoz indicating that there are thousands of duplicate titles and content. 1. Does this matter? If so, how much? 2. How do I eliminate this (we are using Joomla)? Thanks.
Technical SEO | jfeld222