Pagination for Search Results Pages: Noindex/Follow, Rel=Canonical, Ajax Best Option?
-
I have a site with paginated search result pages. I've set them to noindex/follow, and I've placed a rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated pages aren't visible to the user (I'm not technically selling products, just showing different images), and at the bottom of the first/main result page I've added a text link that says "click here to load more"; once clicked, it loads more images onto the page via Ajax. Is this a proper strategy?
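For reference, a minimal sketch of the setup described above, on a hypothetical page 2 at /search/page2 (all URLs are placeholders, not from the thread):

```html
<!-- Hypothetical <head> of https://www.example.com/search/page2 -->
<head>
  <title>Search results - page 2</title>
  <!-- Keep the page out of the index, but let robots follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- Canonical pointing back to the first/main search result page -->
  <link rel="canonical" href="https://www.example.com/search/">
</head>
```

Note that pairing noindex with a canonical that points at a different URL can send Google mixed signals, which is part of what the answers in this thread push back on.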
Also, for a site that does sell products, would simply noindexing/following the search results/paginated pages and placing the canonical tag on the paginated pages pointing back to the main search result page suffice?
I would love feedback on whether this is a proper method/strategy to keep Google happy.
Side question: when robots go through a page that is noindexed/followed, do they take into consideration the text on those pages, page titles, meta tags, etc., or do they only look at the actual links within that page and pass link juice through them all?
-
Firstly, read http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284 for the basics of addressing this problem. It was noted in the other response, but it's key that you approach it this way. It's a common issue, but an easily fixable one.
On your other note, robots read everything on the page, content included. They may not index any of it (since it's on a NOINDEX page), but they absolutely read and crawl everything. And yes, naturally they follow the links on a FOLLOW page. They won't on a NOFOLLOW page, and will look elsewhere for links to follow.
Hope this answered your question. Let me know if not.
-
Can someone respond to the questions on my post? Thanks.
-
Use rel=next/prev, and if you're worried about pages 2-N coming up in the SERPs, optionally add a noindex meta tag to those pages as well.
http://searchengineland.com/google-provides-new-options-for-paginated-content-92906
http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284
http://searchengineland.com/implementing-pagination-attributes-correctly-for-google-114970
http://www.youtube.com/watch?v=njn8uXTWiGg
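A minimal sketch of what that markup looks like on a hypothetical page 2 of /search/ (URLs are placeholders; the noindex line is the optional part):

```html
<!-- Hypothetical <head> of https://www.example.com/search/page2 using rel=next/prev -->
<head>
  <title>Search results - page 2</title>
  <!-- Point to the previous and next pages in the paginated series -->
  <link rel="prev" href="https://www.example.com/search/">
  <link rel="next" href="https://www.example.com/search/page3">
  <!-- Optional: keep pages 2-N out of the SERPs while still letting links be followed -->
  <meta name="robots" content="noindex, follow">
</head>
```

Page 1 would carry only a rel="next" tag, and the last page only a rel="prev" tag.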
As for why you would not want to use canonical here: it works, but it's not the proper use of the tag.
http://searchengineland.com/pagination-strategies-in-the-real-world-81204
-
Related Questions
-
Help! I need help with building a backlink campaign. Need best practices please.
Hello everyone. I am stuck. I need some good advice on how to build a white-hat backlinking campaign, and on the strategy behind it. Thanks!
White Hat / Black Hat SEO | RyanEly1986
-
Hreflang/Canonical Inquiry for Website with 29 different languages
Hello! So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc). Each subdomain has the exact same content for each page, completely translated into its respective language. I currently do not have any hreflang/canonical tags set up.
I was recently told that this is the correct way to set these tags up: for each subdomain (es.example.com/blah-blah for this example), I need to place an hreflang tag pointing to the page the subdomain is on (es.example.com/blah-blah), plus hreflang tags for each of the other 28 subdomains that have that page (it.example.com/blah-blah, etc). In addition, I need to place a canonical tag pointing to the main www. version of the website. So I would have 29 hreflang tags, plus a canonical tag.
When I brought this to a friend's attention, he said that placing the canonical tag to the main www. version would cause the subdomains to drop out of the SERPs in their respective country search engines, which I obviously wouldn't want. I've tried to read articles about this, but I always end up hitting a wall and confusing myself further. Can anyone help? Thanks!
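For illustration only (a hedged sketch using the example subdomains from the question, not a confirmed answer from this thread): the commonly documented pattern is reciprocal hreflang tags plus a self-referencing canonical on each language version, rather than a canonical pointing at the www version:

```html
<!-- Hypothetical <head> of es.example.com/blah-blah -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blah-blah">
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah">
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah">
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah">
<!-- ...one line per remaining language version, identical on every subdomain... -->
<!-- Self-referencing canonical; pointing it at www.example.com instead would
     risk consolidating the subdomains out of their local SERPs -->
<link rel="canonical" href="https://es.example.com/blah-blah">
```

Each language version carries the same full set of hreflang tags, so the annotations are reciprocal.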
White Hat / Black Hat SEO | juicyresults
-
Recovering from Black Hat/Negative SEO with a twist
Hey everyone, This is a first for me. I'm wondering if anyone has experienced a similar situation and, if so, what the best course of action was for you.
Scenario: In the process of designing a new site for a client, we discovered that his previous site, although having decent PageRank and traffic, had been hacked. The site was built on WordPress, so it's likely there was a vulnerability somewhere that allowed someone to create loads of dynamic pages: www.domain.com/?id=102, ?id=103, ?id=104 and so on. These dynamic pages ended up being malware with a trojan horse our servers recognized and subsequently blocked access to. We have since helped them remedy the vulnerability and remove the malware that was creating these crappy dynamic pages. Another automated program appears to have been recently blasting spam links (mostly comment spam and directory links) at these dynamically created pages at an incredibly rapid rate, and is still actively doing so. Right now we're looking at a small business website with a touch over 500k low-quality spammy links pointing to malware pages from the previously compromised site.
Important: As of right now, there's been no manual penalty on the site, nor a "This Site May Have Been Compromised" marker in the organic search results. We were able to discover this before things got too bad for them.
Next steps? The concern is that when the Penguin refresh occurs, Google is going to notice all these garbage links pointing to those malware pages and potentially slap a penalty on the site. The main questions I have are:
1. Should we report this proactively to the web spam team using the guidelines here? (https://www.google.com/webmasters/tools/spamreport?hl=en&pli=1)
2. Should we request a malware review as recommended within the same guidelines, keeping in mind the site hasn't been given a 'hacked' snippet in the search results? (https://support.google.com/webmasters/topic/4598410?hl=en&ref_topic=4596795)
3. Is submitting a massive disavow links file right now, including the 490k-something domains, the only way we can escape the wrath of Google when these links are discovered?
4. Is it too hopeful to imagine their algorithm will detect the negative-SEO nature of these links and not give them any credit?
Would love some input or examples from anyone who can help, thanks in advance!
White Hat / Black Hat SEO | Etna
-
Title and Meta Description Best Practice
Hi guys, I need some creative input on this. I'm working on a Hyundai dealership's website, and I want it to rank well for "used cars" in its local market. I also need it to rank well for "Hyundai Dealer" in four cities. Can you pick apart a dummy meta title and description I put together? In the example, "Metropolis" will be the home city. Title: "Hyundai Dealer serving Metropolis, Gotham City, Star City & Red City, NY | Used Cars Red City, NY" Description: "Visit Bob's Hyundai in Metropolis, NY. We're a new & used car dealer near Gotham City, Star City & Red City, NY." Be brutally honest, and if you can, let me know what else I can do beyond these tags to achieve this objective. Thanks a bunch!
White Hat / Black Hat SEO | oomdomarketing
-
Why website isn't showing on results?
Hello Moz! Just got a quick question - we have a client who, for some reason, just isn't showing up in the search results. It's not a new domain, and it hasn't been penalised (nor has any reason for a penalty). All the content is fresh and there are no bad backlinks to the site. It is a new website and has been indexed by Google, but even for branded search terms it just doesn't show up anywhere on page 1 (I think page 4). Any help or advice is greatly appreciated, as it's doing my head in. We are using www.google.com.au. Kindest Regards
White Hat / Black Hat SEO | kymodo
-
On page SEO? (This is good! I promise)
I have been doing some research on on-site optimization and I hit a dead end; I need some help with on-page SEO. These three I get for the most part (if you would like to add anything, please do):
Title optimization - needs to be unique, with keywords included, under 90 characters
Meta description - needs to be unique, with keywords included, under 150 characters
Meta keywords - all keywords
Questions begin here:
H1 headings - Should this be the first thing the spider crawls? Should they be unique? Is there a penalty for having this content the same on every page? (H1s are under the logo at the top of every one of my site's pages)
H2-H6 headings - Should they be unique? Is there a penalty for having this content the same on every page?
Bold text - Does this matter for SEO?
Italic text - Does this matter for SEO?
Link anchor text - These are the same on most pages. However, most of these links are part of the navigation; does this matter for SEO? Is this duplicate content? How does the search engine analyze this data?
Image alt attributes - I have the share image buttons on my site (Facebook, Twitter, etc...) and they have the same alt attributes on each page. Does this matter for SEO?
Body text - I found a competitor site that's ranking #1 for a key term. This competitor has 11,106 words in their body with the keyword mentioned 29 times (0.8%). They placed all this text in a small scroll-down at the bottom of their page. It's strange how they included it. Please review the attached image. The competitor URL is http://www(dot)1804design(dot)com/ w6AiM.png
White Hat / Black Hat SEO | SEODinosaur
-
User comments with page content or as a separate page?
With the latest Google updates both cracking down on useless pages and rewarding high-quality content, would it be beneficial to include user-posted comments on the same page as the content, or on a separate page? A separate page might be worthwhile once there are enough comments, especially as extra pages add extra PageRank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
White Hat / Black Hat SEO | Peter264