What's the best way to remove search-indexed pages on Magento?
-
A new client (aqmp.com.br) called me yesterday and told me that since they moved to Magento, their monthly sales revenue has dropped by more than US$20,000...
I've just checked Webmaster Tools and discovered that the number of crawled pages went from 3,260 to 75,000 since the Magento site launched... Magento is creating lots of pages with query strings for search and filters. Example:
- http://aqmp.com.br/acessorios/lencos.html
- http://aqmp.com.br/acessorios/lencos.html?mode=grid
- http://aqmp.com.br/acessorios/lencos.html?dir=desc&order=name
Is adding a directive to robots.txt the best way to remove these unnecessary pages from the search engine?
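For context, the kind of robots.txt rule people usually try first for URLs like these looks like the sketch below. The Disallow patterns are assumptions based on the example URLs above (Googlebot supports the `*` wildcard); adjust them to the parameters your store actually emits:

```
User-agent: *
Disallow: /*?dir=
Disallow: /*?mode=
Disallow: /*?order=
Disallow: /catalogsearch/
```

Note, though, that blocking crawling does not by itself remove URLs that are already in the index.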
-
I have tried using robots.txt rules and they didn't do anything - furthermore, if you check out this video by Google themselves, you will find that these are treated as a "hint/suggestion" as opposed to a solid directive.
http://www.youtube.com/watch?v=DiEYcBZ36po
Rel canonical is also a hint.
But Meta Noindex,follow is a solid directive which they have to pay attention to.
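For reference, that directive is just a standard robots meta tag in the page's <head> - a minimal example:

```html
<meta name="robots" content="noindex,follow" />
```

Google must still be able to crawl the page to see the tag, so don't simultaneously block the same URL in robots.txt.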
Hope that helps - been there, done that, got the t-shirt, through a lot of pain and frustration!
-
What do you think about Google URL parameters? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
-
Hi Ian,
You are right in that Yoast Meta Robots can be cranky - I installed it and had to play around with it to get it working.
However, it does offer a very nice feature that I think is worth it - you can apply various combinations of meta robots directives to product pages individually - so this adds more value than just being able to NOINDEX reviews, wishlists, and similar pages. But install it on your dev site before trying it live.
So my solution uses both Yoast and my own custom code - you check the URL for any query strings, such as ?manufacturer, and apply different logic according to what you wish to be indexed or not.
Feel free to PM me.
-
Hi,
Can you expand on this and point me in the right direction if possible, please, BJS1976? I have the same problem as originally described by 'SEO Martin'.
I have obviously seen that the Yoast_MetaRobots plugin is recommended by others when searching for a solution to noindexing the non-content pages (search results, filters, etc.). However, I am very reluctant to install it, as many people who have tried said it broke their sites.
If there is another way of implementing the noindex,follow meta tag, I would be very grateful to know how, as like you I am really struggling with this one.
Many Thanks
-
Hi,
I am quite familiar with Magento and struggling with the SEO of this ecommerce mammoth!
As far as I am aware, you should implement the meta tag "NOINDEX, FOLLOW" on those pages that you do not want indexed. Since your pages are already in the index, this is the way to go - blocking them in robots.txt does not get pages out of the index once they are in there.
I suggest you apply some query-string logic to your template - you will find the file here:
app/design/frontend/default/YOURTEMPLATE/template/page/html/head.phtml
That way, you can apply the appropriate meta robots tag depending on the page content.
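As a rough sketch of that query-string logic (the parameter names here are examples covering Magento's common sort/view parameters, not a definitive list - adapt them to your store):

```php
<?php
// Sketch: emit noindex,follow when the request carries sorting or
// view parameters that only create duplicate versions of a category.
// The parameter names below are examples - match them to your store.
$duplicateParams = array('dir', 'order', 'mode', 'limit');
$noindex = false;
foreach ($duplicateParams as $param) {
    if (isset($_GET[$param])) {
        $noindex = true;
        break;
    }
}
?>
<?php if ($noindex): ?>
<meta name="robots" content="noindex,follow" />
<?php endif; ?>
```

Pages without those parameters keep their normal meta robots handling, so the clean category URLs stay indexable.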
Hope this helps you and let's stay in touch about Magento! (PM me)
Related Questions
-
Does redirecting a duplicate page NOT in Google's index pass link juice? (External links not showing in Search Console)
Hello! We have a powerful page that has been selected by Google as a duplicate page of another page on the site. The duplicate is not indexed by Google, and the referring domains pointing towards that page aren’t recognized by Google in the search console (when looking at the links report). My question is - if we 301 redirect the duplicate page towards the one that Google has selected as canonical, will the link juice be passed to the new page? Thanks!
Intermediate & Advanced SEO | Lewald10
-
Sitemap Indexed Pages, Google Glitch or Problem With Site?
Hello, I have a quick question about our Sitemap Web Pages Indexed status in Google Search Console. Because of the drastic drop I can't tell if this is a glitch or a serious issue. When you look at the attached image you can see that under Sitemaps, Web Pages Indexed dropped suddenly on 3/12/17 from 6,029 to 540. Our Index Status shows 7K+ indexed. Other than product updates/additions and homepage layout updates there have been no significant changes to this website. If it helps, we are operating on the Volusion platform. Thanks for your help! -Ryan
Intermediate & Advanced SEO | rrhansen0
-
Is there a way to rel = canonical only part of a page?
Hi guys: I'm doing SEO for a boat accessories store. They sell marine AC systems - many of them - and while the part number, number of BTUs, voltage, and accessories change between models, the description stays exactly the same across many of them. People often search Google by model number, and I worry that if I add rel=canonical, the result for the specific model they're looking for won't come up - just the page everything points to (and people do this much more than entering a site and then searching by product model nowadays; it's easier). Excuse my ignorance on this stuff - I'm good with link building and content creation, but the behind-the-scenes aspects, not so much. Can I rel=canonical only part of the page on the repeat models (the long description), so people can still search by model number and reach the model they are looking for? Am I misunderstanding something here about rel=canonical? (Interesting thing: I rank very high for these pages with tons of repeat descriptions, number one in many places... but I wonder if Google applies a sort of site-wide penalty for the repeated content... then again, wouldn't ranking number 1 for these pages mean nothing's wrong? Thanks)
Intermediate & Advanced SEO | DavidCiti1
-
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages. Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure whether this will happen. We thought about requesting in GWMT for Google to remove the non-secure pages; however, we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
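For what it's worth, the blanket http-to-https 301 is usually done at the server level; a sketch for Apache with mod_rewrite (an assumption about the hosting stack - test on a staging copy first):

```apache
RewriteEngine On
# Redirect any non-https request to the same path over https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With redirects like this plus the existing canonicals in place, Google will typically drop the http URLs on its own as it recrawls, without a manual removal request.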
Best practice to prevent pages from being indexed?
Generally speaking, is it better to use robots.txt or a noindex meta tag to prevent duplicate pages from being indexed?
Intermediate & Advanced SEO | TheaterMania0
-
What is Best Way to Scale RCS Content?
SEO has really moved away from the nitty-gritty analysis of backlinking factors, link wheels, and the like, and has shifted to a more holistic marketing approach. That approach is best described around Moz as "Real Company S#it". RCS is a great way to think about what we really do, because it is so much more than just SEO or just social media. However, our clients and business owners do want to see results and want them quantified in some way. The way most of our clients understand SEO is that by ranking high on specific terms or online avenues they have a better chance of generating traffic/sales/revenue. They understand this more in the light of traditional marketing, where you pay for a TV ad and then measure how much revenue that ad generated. In the light of RCS and the need to target a large number of keywords for a given client, how do most PROs handle this situation, where you have a large number of keywords to target, but with RCS? Many I've asked tend to use the traditional approach of creating a single content piece geared towards a given target keyword. However, that approach can get daunting if you have, say, 25 keywords that a small business wants to target. In this case, isn't it really a matter of scaling down client expectations? What if the client wants all of the keywords and has the budget? Do you just ramp up your RCS content creation efforts? It seems that you can overdo it and quickly run out of RCS content to produce.
Intermediate & Advanced SEO | AaronHenry0
-
Why are so many pages indexed?
We recently launched a new website and it doesn't consist of that many pages. When you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straightforward robots.txt problem? We addressed that a while ago and the number of results isn't going down. It's very possible that we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages "un-indexed"?
Intermediate & Advanced SEO | MichaelWeisbaum0
-
Removing pages from index
Hello, I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example: www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in the design, and I am working on fixing it (301ing nonexistent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal? Also, should I 301, 404 or 410 these pages? Any help would be appreciated. Thanks, Alex
Intermediate & Advanced SEO | AlexGop