Disallowed "Search" results with robots.txt and Sessions dropped
-
Hi,
I've started working on our website and I've found millions of "Search" URLs which I don't think should be getting crawled and indexed (e.g. .../search/?q=brown&prefn1=brand&prefv1=C.P. COMPANY|AERIN|NIKE|Vintage Playing Cards|BIALETTI|EMMA PAKE|QUILTS OF DENMARK|JOHN ATKINSON|STANCE|ISABEL MARANT ÉTOILE|AMIRI|CLOON KEEN|SAMSONITE|MCQ|DANSE LENTE|GAYNOR|EZCARAY|ARGOSY|BIANCA|CRAFTHOUSE|ETON). I tried to disallow them in the robots.txt file, but our Sessions dropped about 10% and our Average Position in Search Console dropped 4-5 positions over a week. It looks like over 50 million URLs have been blocked; they all follow the pattern above and none of them bring any traffic to the site.
I've allowed them again, and we're starting to recover. We've been fixing problems with getting the site crawled properly (sitemaps weren't added correctly, products were blocked from spiders on category pages, canonical pages were blocked from crawlers in robots.txt), and I'm thinking Google was doing us a favour and using these pages to crawl the product pages, as it was the best (or only) way of reaching them.
Should I be blocking these "Search" URLs, or is there a better way of going about it? I can't see any value in these pages, except that Google uses them to crawl the site.
-
If you have a site with at least 30k URLs, looking at only 300 keywords won't reflect the general status of the whole site. If you're chasing a 10% loss in traffic, I'd start with the pages that lost the most traffic, then analyse whether they lost rankings or have some other issue.
Another way to find where the traffic loss is coming from is Search Console: look at keywords that aren't in the top 300. There might be a lot to analyse.
It's not a big deal to have a lot of pages blocked in robots.txt when the right pages are blocked. Keep in mind that GSC will flag those pages with warnings because they were previously indexed and are now blocked; that's just how its flags are set up.
Hope it helps.
Best luck.
Gaston
-
If you have a general site which happens to have a search facility, blocking search results is quite usual. If your site is all 'about' searching (e.g. Compare The Market, that sort of thing), then the value-add of your site is how it helps people find things. In THAT type of situation, you absolutely do NOT want to block all your search URLs.
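For the first case, a minimal robots.txt sketch would look like this (assuming the internal search lives under /search/, as in the example URL above; the path is an assumption, so adjust it to your own site):

# Block all internal search result pages
User-agent: *
Disallow: /search/

Google also honours the * wildcard in robots.txt, so a pattern such as Disallow: /*?q= can catch search URLs served from other paths.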
Also, don't rule out seasonality. Traffic naturally goes up and down, especially at this time of year when everyone is on holiday. How many people spend their holidays buying stuff or doing business stuff online? They're all at the beach - mate!
-
Hi Gaston,
The "search/" pages were getting a small amount of traffic, and a tiny bit of revenue, but I definitely don't think they need to be indexed or that they're important to users. We're down mainly in "Sale" and "Brand" pages, and I've heard the Sale in general across the store isn't going well, but I don't think I can go back to management with that excuse.
I think my sitemaps are sorted now; I've broken them down into 6 x 5,000-URL files, and all the canonical tags seem to be fine and pointing to these URLs. I am a bit concerned that URLs "blocked by robots.txt" shot up from 12M to 73M, although all the URLs Search Console is showing me look like they should be blocked!
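For reference, a canonical tag pointing at one of those product URLs looks like this (the URL below is just a placeholder):

<link rel="canonical" href="https://www.example.com/product/brown-leather-sofa" />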
We're also tracking nearly 300 keywords, and they've actually shown good improvements over the same period. Finding it hard to explain!
-
Hi Frankie,
My guess is that the traffic you lost was the traffic being driven by those /search pages.
The questions you should be asking are:
- Are those /search pages getting traffic?
- Are they important to users?
- After being disallowed, which pages lost traffic?
As a general rule, Google doesn't want to crawl or index internal search pages unless they have some value to users.
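Worth noting as a general point: a robots.txt disallow only stops crawling; it doesn't remove URLs that are already in the index. If the goal is to actively deindex those search pages, the usual route is a meta robots tag on them, and the pages must stay crawlable (not disallowed) long enough for Google to see it:

<meta name="robots" content="noindex" />

Once the pages have dropped out of the index, the robots.txt disallow can be added to save crawl budget.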
On another matter, the crawlability of your product pages can easily be solved with a sitemap file. If you're worried about its size, remember that a single sitemap can contain up to 50k URLs, and you can create several sitemaps and list them in a sitemap index.
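A skeleton sitemap index looks like this (the file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>

Each child sitemap can hold up to 50,000 URLs, and the index file is what you submit in Search Console.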
More info about that here: Split up your large sitemaps - Google Search Console Help.
Hope it helps.
Best luck,
Gaston