For large sites, best practices for pages hidden behind internal search?
-
If a website has 1M+ pages, with most of them hidden behind an internal search, what's the best way to get those pages included in a search engine's index?
Does a direct clickpath to those pages need to exist from the homepage or other major hub pages on the site?
Is submitting an XML sitemap enough?
-
Hello Vlevit,
You could do several things. I recommend giving Google your product feed, which should accomplish your goals. Another possible solution would be to make those search pages noindex,follow so they don't end up getting indexed, but Google can still use them for discovery.
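For illustration, here is a rough Python sketch of the kind of server-side logic that could decide which robots directive each page type gets. The page-type labels and function names are just placeholders, not anything specific to your platform:

```python
# Rough sketch: pick a robots meta directive per page type.
# Page-type labels below are placeholders for whatever your platform uses.

def robots_directive(page_type: str) -> str:
    """Return the content value for the robots meta tag."""
    if page_type == "internal_search":
        # Keep internal search result pages out of the index, but let
        # crawlers follow their links so product pages can still be found.
        return "noindex, follow"
    # Product, category, and content pages stay indexable.
    return "index, follow"

def robots_meta_tag(page_type: str) -> str:
    return f'<meta name="robots" content="{robots_directive(page_type)}">'

if __name__ == "__main__":
    print(robots_meta_tag("internal_search"))  # noindex, follow
    print(robots_meta_tag("product"))          # index, follow
```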
Thanks for explaining the situation.
Below is more on submitting product feeds. It is for Google Product Search, but I would imagine the "link" field, where you put the URL of your product detail page, will help those pages get indexed in the standard results:
http://support.google.com/merchants/bin/answer.py?hl=en&answer=188494#USEverett
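To make the feed idea concrete, here is a minimal sketch of a tab-delimited product feed in which the "link" column points at each product detail page. The attribute names follow the Merchant Center feed documentation, but the sample product, URLs, and output file name are made up, so check the current spec before submitting anything:

```python
import csv

# Minimal sketch of a tab-delimited product feed. The "link" column holds
# the URL of the product detail page you want crawled and indexed.
# The sample product and file name are made up for illustration.
products = [
    {
        "id": "SKU-0001",
        "title": "Green Widget",
        "description": "A sample green widget.",
        "link": "https://www.example.com/products/green-widget",
        "image_link": "https://www.example.com/images/green-widget.jpg",
        "price": "19.99 USD",
        "availability": "in stock",
        "condition": "new",
    },
]

fields = ["id", "title", "description", "link", "image_link",
          "price", "availability", "condition"]

with open("product_feed.txt", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(products)
```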
-
Everett, thanks for your reply. I understand the problems of showing internal search pages. I'm not looking to have internal search results indexed, just the pages that the results link to. We're in eCommerce.
I was under the impression that there was a clever way to have the individual product pages indexed without establishing a direct click path, but best practices recommend otherwise.
Question answered. Thanks all for your help.
-
Hello Vlevit,
If you can be more specific we may be able to be of more help. Google doesn't want you to show internal search result pages, but if this is a different type of situation there may be an exception. Are these search result pages, product pages, category pages, content pages? Is it an eCommerce site, a community, a content site?
Generally speaking, 1M+ pages with no links going into them and content that is either sparse/thin or partially/fully duplicated on other similar pages (like a search for widgets and a search for green widgets showing overlapping content) is exactly the type of thing that will get you into hot water and can affect even the rankings of your home page.
Do you feel like your question has been answered or would you like to be more specific about your site and goals?
Cheers,
Everett
-
This is what I was assuming, but was wondering if there was a clever way around creating direct click paths to those pages, while still maintaining their importance to the site. Thanks for the info.
-
Make sure they are part of the actual structure of your website, not just part of search. Meaning, you have to have links pointing at them. You will also want to make sure that those pages have value.
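One way to check this at scale is to crawl your own site from the homepage and record how many clicks away each page is; product pages that never show up, or only at a very deep level, are the ones missing from the site structure. A rough sketch (it assumes the third-party requests library, and the start URL, depth limit, and page limit are placeholders to tune for your own site):

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests


class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def click_depths(start_url, max_depth=4, max_pages=1000):
    """Breadth-first crawl recording how many clicks each internal page
    is from start_url. Limits are placeholders; tune them for your site."""
    site = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href).split("#")[0]
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths


if __name__ == "__main__":
    for url, depth in click_depths("https://www.example.com/").items():
        print(depth, url)
```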
-
Hi vlevit,
The best practice would be to have a direct click path from the index page, something like: index -> category (filter) -> subcategory (filter) -> page/product. But in some cases XML sitemaps can also help with indexing.
But beware of overly large XML sitemaps: create more than one sitemap and group them where possible.
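As a rough sketch of the splitting idea: chunk your URL list into several sitemap files (the sitemap protocol allows up to 50,000 URLs per file) and tie them together with a sitemap index. The file names, base URL, and sample product URLs below are placeholders:

```python
# Rough sketch: split a large URL list into several sitemap files and
# reference them from a single sitemap index. Real URLs may need XML escaping.

URLS_PER_SITEMAP = 50000  # protocol limit per sitemap file
BASE = "https://www.example.com"  # placeholder base URL

def write_sitemaps(urls, urls_per_file=URLS_PER_SITEMAP):
    sitemap_files = []
    for i in range(0, len(urls), urls_per_file):
        name = f"sitemap-{i // urls_per_file + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + urls_per_file]:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
        sitemap_files.append(name)

    # The index file is the one you submit; it points at each child sitemap.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_files:
            f.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    product_urls = [f"{BASE}/products/{n}" for n in range(1, 120001)]  # made-up URLs
    write_sitemaps(product_urls)
```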
A few very good resources can be found under the next links:
http://www.seomoz.org/ugc/solving-new-content-indexation-issues-for-large-b2b-websites
http://www.seomoz.org/qa/view/29009/sitemaps-management-for-big-sites-tens-of-millions-of-pages
I hope it helps,
Istvan
Related Questions
-
Best way to deal with over 1000 pages of duplicate content?
Hi, Using the Moz tools I have over 1,000 pages of duplicate content, which is a bit of an issue! 95% of the issues arise from our news and news archive, as it's been going for some time now. We upload around 5 full articles a day. The articles have a standalone page but can only be reached via a master archive. The master archive sits in a top-level section of the site and shows snippets of the articles, which, if a user clicks on them, take them to the full article page. When a news article is added the snippets move onto the next page, and move through the pages as new articles are added. The problem is that the standalone articles can only be reached via the snippet on the master page, and Google is stating this is duplicate content, as the snippet is a duplicate of the article. What is the best way to solve this issue? From what I have read, using a 'Meta NoIndex' seems to be the answer (not that I know what that is). From what I have read you can only use a canonical tag on a page-by-page basis, so that's going to take too long. Thanks, Ben
Technical SEO | benjmoz
-
Deindexed site - is it best to start over?
A potential client's website has been deindexed from Google. We'd be completely redesigning his site with all new content. Would it be best to purchase a new URL and redirect the old deindexed site to the new one, or to try to stick with the old domain?
Technical SEO | WillWatrous
-
Best practices for switching site languages around
Hi folks. The site in question is at http://bit.ly/UDV186 It is split into English and Spanish versions, each at root/en and root/es respectively. The home page is in Spanish. We're trying to rank the site for English keywords, so we want to switch the homepage to English and make the Spanish version secondary. What are the best practices for this? Can we just literally swap the two versions around onto the existing URLs, i.e. take the English text and put it onto the home page? Provided all links point to the correct page, would that be fine? Are there any other best practice considerations to take into account? Thanks in advance.
Technical SEO | MattBarker
-
Best META Fields to Include on New Site
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using: My questions to the community are: whether or not I've added all pertinent information, and if there's anything I'm overlooking
Technical SEO | NiallSmith
-
Do sites really need a 404 page?
We have people posting broken links to our site. Is this losing us link juice, as they link to 404 pages? We could redirect to the homepage or just render the homepage content; in both cases we can still display a clear page-not-found message. Is this legal (white hat)?
Technical SEO | ed123456
-
What are the best free press release sites to gain free links?
Hi, I am trying to find some good free press release sites that allow you to include a link with your press release to help drive traffic to your site, but all the ones that I have found do not allow links within them. The only ones I can find where you can have links in them are paid ones. Does anyone use press release sites to gain links to their sites? If so, could you let me know which ones they are and how important you feel they are?
Technical SEO | ClaireH-184886
-
Can search engines penalize my site if I block IPs from some countries?
I have spotted that some countries in South America generate lots of traffic on my site, and I don't want to sell my service there. Can I be penalized for blocking IPs from certain countries? Thanks!
Technical SEO | Xopie
-
Xenu Alternative for Large Sites
We're launching a new site and we're trying to crawl it to check for any problems. It's millions of pages and Xenu seems to start encountering errors as the numbers mount past 500,000. Does anyone know of an alternative, free or paid, that could handle the size better?
Technical SEO | eLocalusa