Use Internal Search pages as Landing Pages?
-
Hi all
Just a general discussion question about internal search pages and using them for SEO. I've been planning to apply "noindex, follow" to them, but a lot of the search pages are actually driving significant traffic and revenue.
I have over 9,000 search pages indexed that I was going to remove, but after reading this article (https://www.oncrawl.com/technical-seo/seo-internal-search-results/) I was wondering if any of you have had success using these pages for SEO, for example with auto-generated content. Any success stories about using "noindex, follow" would be welcome too.
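For anyone following along, "noindex, follow" is the standard robots meta tag placed in each search page's `<head>` (nothing platform-specific about it) — a minimal sketch:

```
<!-- Tells search engines: drop this page from the index,
     but still follow the links on it. -->
<meta name="robots" content="noindex, follow">
```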
Thanks!
-
I also want to know how to increase the authority of my website. I have taken the Moz course to learn SEO so that I can offer SEO services. Can you give me more guidance?
-
Hi Guillaume_L
Your idea looks like a good one! Just keep the number of URL parameters to one or two, as outlined in the article you shared, or you'll end up with millions or billions of URLs. Limiting yourself to one or two parameters also helps keep your page titles and descriptions shorter. I'd suggest noindexing any pages that have more than two parameters selected.
It sounds like you'll be doing something similar to what Airbnb is doing, so I think you should be good to go!
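As a rough sketch of that "more than two parameters" rule (the URLs and the exact threshold here are just illustrative assumptions, not from the thread):

```python
from urllib.parse import urlparse, parse_qs

def should_noindex(url: str, max_params: int = 2) -> bool:
    """Flag a faceted/search URL for noindex once it carries
    more than `max_params` query parameters."""
    params = parse_qs(urlparse(url).query)
    return len(params) > max_params

# Two parameters -> keep indexable
print(should_noindex("https://example.com/search?colour=red&size=m"))           # False
# Three parameters -> noindex
print(should_noindex("https://example.com/search?colour=red&size=m&sort=asc"))  # True
```

In practice the same check would live in whatever template layer decides which robots meta tag to render on the page.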
-
I actually found this link interesting: https://thecontentworks.uk/dynamic-pages-seo-friendly/.
It seems to me like generating dynamic content delivers a better user experience, and creates less risk of duplicate content/errors. It would be odd if Google penalized me for this.
-
Hi Guys,
I came across this topic while searching for an SEO-friendly solution for a site I am currently building, and I'd love your insights, as it is directly related to having internal search results indexed on Google.
I am currently building an industry-specific small business directory (e.g. plumbers) in WordPress. The way I initially set it up, when the user lands on the home page (e.g. "Find Plumbers in your area"), they select the location they are interested in from a dropdown (e.g. Miami) and the site returns a list of plumbers that serve that location. I set up the results page as an archive that displays results matching the search query, in this case a location. The URLs of the search results look like www.company.com/plumber/?_location=Miami.
This said, I am not expecting people to find my home page on Google, but rather a specific location, which is an internal search result.
While everyone seems to agree that internal search results are not supposed to be indexed, my alternative would be to build a page for every location, which would create hundreds of new pages with duplicate content (I have a FAQ about how to select the best plumber on the same page below the results).
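One middle-ground option (purely a sketch; the URL structure here is hypothetical, not a recommendation from this thread) is to rewrite the query-parameter search URL into a clean, static-looking path per location, so each location gets one crawlable URL instead of an internal search result:

```python
from urllib.parse import urlparse, parse_qs

def location_page_url(search_url: str) -> str:
    """Map a ?_location= search URL to a clean per-location path,
    e.g. /plumber/?_location=Miami -> /plumber/miami/."""
    parsed = urlparse(search_url)
    location = parse_qs(parsed.query).get("_location", [""])[0]
    slug = location.strip().lower().replace(" ", "-")
    return f"{parsed.scheme}://{parsed.netloc}{parsed.path.rstrip('/')}/{slug}/"

print(location_page_url("https://www.company.com/plumber/?_location=Miami"))
# https://www.company.com/plumber/miami/
```

The FAQ block would then need to vary (or be canonicalized) so the per-location pages aren't near-duplicates of each other.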
I also looked around and Yelp has a similar approach for the location results (ex: https://www.yelp.ca/search?cflt=plumbing&find_loc=Toronto%2C+ON)
Any thoughts on this use case?
Thanks
-
That's a great question. If you have pages that are generating revenue and ranking really well, I'd be hesitant to remove them from the index. As the article mentions, Wayfair generates a huge amount of search traffic through these auto-generated internal search pages. If yours are considered high quality and are ranking well in Google, I would probably recommend leaving them alone.
If you want to trim some of these down, I'd use Google Analytics to find ones that aren't generating organic traffic/revenue. You could consider adding the "noindex" tag to those.
In general, it's best practice to keep internal search pages out of Google, as they can contribute to a large amount of index bloat. However, I wouldn't remove any that you see are performing well.
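That triage could be sketched like this (the analytics export format and the session threshold are assumptions for illustration only):

```python
# Hypothetical export from Google Analytics: (URL, organic sessions, revenue)
search_pages = [
    ("/search?q=garden-furniture", 1200, 4300.00),
    ("/search?q=asdfgh",              0,    0.00),
    ("/search?q=lamp-shades",         3,    0.00),
]

MIN_SESSIONS = 10  # arbitrary cut-off for this sketch

# Keep anything earning traffic or revenue; queue the rest for a noindex tag.
to_noindex = [url for url, sessions, revenue in search_pages
              if sessions < MIN_SESSIONS and revenue == 0]

print(to_noindex)  # ['/search?q=asdfgh', '/search?q=lamp-shades']
```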
I'd be happy to take a look if you have any other questions!
-
I don't think you should be looking at those purely from an SEO perspective. Why? If people are landing there through search engines, it means they found the result in the SERP useful, and that is more important than the SEO itself. Plenty of sites invest in SEO but don't manage to attract interest from people searching Google or other search engines. So keep and develop your internal search pages.
Related Questions
-
When writing a URL, is it better to use my country's language or English?
We speak Persian, and everyone searches in Persian on Google. But I read in some sources that the URL should be in English. Please tell me which language to use when writing URLs.
Technical SEO | ghesta
For example, here are the two options:
1) https://ghesta.ir/blog/how-to-become-rich/
2) https://ghesta.ir/blog/چگونه-پولدار-شویم/
-
How to de-index a page with a search string of the form domain.com/?"spam"
The site in question was hacked years ago. All the security scans come up clean, but SEO crawlers like Semrush and Ahrefs still show it as an indexed page. I can even click through on it, and it takes me to the homepage with no 301. Where is the page, and how do I deindex it? domain.com/?spam — there are multiple instances of this. Screenshot: http://www.clipular.com/c/5579083284217856.png?k=Q173VG9pkRrxBl0b5prNqIozPZI
Technical SEO | Miamirealestatetrendsguy
-
John Mueller says don't use Schema as it's not working yet, but I get markup conflicts using Google's markup tool
I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he advises a member not to use Schema.org, as it's not working quite yet, but to use Google's own markup tool, the Structured Data Markup Helper. Fine; this I have done, and one of the tags I've used is "author". However, if you run the page through Google's Structured Data Testing Tool in GWT, you get the following: Error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before? If so, what action did you take to rectify it and make it work? As it stands, I'm considering just removing this tag altogether. Thanks, David
Technical SEO | David-E-Carey
-
Why do they rank the home page?
We are trying to rank for the keyword "motorcycle parts". We have moved up to page 2 over the past couple of months; however, Google is ranking our home page, not our http://www.rockymountainatvmc.com/s/49/61/Motorcycle-Parts page, which is the one for motorcycle parts. We are working on internal linking to help send the right signals. Any other thoughts? (We have new content written to put in as well; we just have to wait for an issue to be fixed before we can add it.)
Technical SEO | DoRM
-
Should I nofollow search results pages
I have a customer site where you can search for the products they sell. The URL format is domainname/search/keywords/, where "keywords" is whatever the user searched for. This means the number of pages can be limitless, as the client has over 7,500 products. Should I nofollow these pages, or should I rel=canonical them to the main search page instead?
Technical SEO | spiralsites
-
Internal search : rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking whether they should be indexed or not; I know they should not, according to Google's guidelines, and they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics! I thought of blocking them in robots.txt. This solution would not keep them out of the index, but the pages appearing in Google SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose a bit of traffic too. The last idea I had was to use a rel=canonical tag pointing to the original search page (which is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (I've never tried it, so I'm not sure.) Of course I did some research on the subject, but each source recommended only one of the three methods! One even recommended noindex plus a robots.txt block, which is pointless, because the robots.txt block would stop Google from ever seeing the noindex. Can somebody tell me which option is the best way to keep this traffic? Thanks a million
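For reference, the robots.txt and rel=canonical options being compared look roughly like this (the /search/ path is illustrative):

```
# robots.txt — blocks crawling, but URLs already in the index can stay there
User-agent: *
Disallow: /search/
```

```
<!-- rel=canonical, placed in the <head> of each results page -->
<link rel="canonical" href="https://www.example.com/search/">
```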
Technical SEO | JohannCR
-
Should search pages be disallowed in robots.txt?
The SEOmoz crawler picks up "search" pages on a site as having duplicate page titles, which of course they do. Does that mean I should put a "Disallow: /search" line in my robots.txt? When I put the URLs into Google, they aren't coming up in any SERPs, so I would assume everything's OK. I try to resolve the SEOmoz crawl errors as much as possible; that's why I'm asking. Any thoughts would be helpful. Thanks!
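A minimal robots.txt for that (assuming the search URLs really do all live under /search) would be:

```
User-agent: *
Disallow: /search
```

Worth noting that Disallow only blocks crawling; URLs that are already indexed can remain in the index, they just show without a snippet.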
Technical SEO | MichaelWeisbaum
-
Local SEO for service industry - one landing page for every town...in every county...in every state?
I'm starting a second locally based service site. Initially I'm going to target a couple of counties and move on from there as the business grows. For my first site, I set up a page for each town: [service] + [town] + [state] + [zip]. I am afraid this could get out of control, though, if I don't have unique content on each page. For the last site I simply copied the page and replaced the town name in each, as well as the picture, picture title, and image name, to make it look more unique for users but not necessarily for Google. I had pretty good results, but I want this next site to be done properly. Should I only target a few of the major markets to begin with? What about long-tail searches for smaller towns that currently bring in a good amount of business? I am concerned about having "too many" long-tail pages for each town, which would essentially become a listing of every town and county in the state if I were to maintain the pace I want to. Also, I would need a good number of backlinks to each specific town page URL if I wanted to do well in each of those markets, right? Is this where the fine line between niche term and broad search lies? Is there any happy medium?
Technical SEO | kabledesigns