What to do when your home page is an index for a series of pages
-
I have created an index stack. My home page is http://www.southernwhitewater.com
The home page is the index itself and the 1st page http://www.southernwhitewater.com/nz-adventure-tours-whitewater-river-rafting-hunting-fishing
My home page (if you look at it through the MozBar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer.
What is the best way to optimize all these pages individually and still have customers arrive at the top of the page to a picture? Should I use rel=canonical?
Any help would be great!!
-
Hi Rod,
That should have fixed your duplicate content problem.
-
Hi Don, I have made those changes now.
I still have an index for my home page, but it is: a banner, 2 text pages, and a gallery page. If you look at the index through the MozBar you can see that the H1 & H2 titles contain information from the whole index, as if it were one long single page.
Rod B
-
Thanks Moosa
Yes, I have just put this up and went a little OTT on the URLs. I will strip them back shortly.
No, the index ("index stack") refers to the individual pages stacked down a single page with no submenu, as opposed to the site index in the footer. The footer shows how to get to the individual pages by themselves, the same way you would in a drop-down menu.
-
Thanks Don
Good answer, and I appreciate the detail. The creators have assured me that the individual content will not be seen by Google as duplicate, but I am not entirely convinced. I have come to a similar conclusion to yours and will be pulling the content off the home page and linking across to an index of the services pages. I have a client who is very particular about the look, but most will understand the trade-off between plenty of above-the-fold content and keeping the opening image clean, picturesque, and minimalistic.
I will let you know when it's finished. I would like to hear from other people who stack pages in an index as opposed to using a drop-down menu or sub-navigation. This design seems to be quite good for tablets, where scrolling is easy and loading pages is slower (another trade-off between desktops and tablets).
I will be testing a change from one configuration to the other on some of my other sites to see whether performance improves. Nothing like an A/B split to put the mind at rest.
-
Hi Rod,
I will try to address the two questions I see here.
1. Is an "index stack" bad for your site?
2. How do I optimize my site?

1. Index stack: okay, to be completely honest I have never heard the term before. However, with the link you provided I was able to understand what you mean. In this case you have taken the content of your sub-pages and placed it directly on your home page. Is this bad? YES!
The reason this is bad is that it creates duplicate content. The page that should be ranking for X content is now competing with your home page for those rankings. Your home page should be about your business and services, what you do, and how you do it. The sub-pages are your reinforcement; they support the content stated on your homepage.
Imagine a beautiful hotel. The outside is adorned in the finest materials of the time, well maintained and awe-inspiring. An equally impressive doorway beckons you to come in. As you do, you find yourself in the lobby. Well-placed, pristine signs detail some of the hotel's offerings: private gyms, hot tubs, saunas, massage, 5-star dining, dance halls. Judging by the impressive lobby, you have no doubt these offerings will be spectacular.
In this example the lobby is your homepage: an impressive list of the offerings guests will find inside. What you don't see is a middle-aged man sprawled out in the lobby getting a massage, which is what your index stack is doing.
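To put a rough number on how much of a sub-page's copy is repeated on the home page, a simple text-similarity check can help. This is a minimal sketch using only Python's standard library; the `TextExtractor` class, the function names, and any URLs you pass in are illustrative assumptions, not part of any Moz tool.

```python
# Rough duplicate-content check: compare the visible text of two pages.
from difflib import SequenceMatcher
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

def page_text(url):
    """Fetch a page and return its visible text as one string."""
    parser = TextExtractor()
    parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    return " ".join(parser.parts)

def similarity(url_a, url_b):
    """Ratio near 1.0 means the pages share most of their copy."""
    return SequenceMatcher(None, page_text(url_a), page_text(url_b)).ratio()
```

A high ratio between the home page and a sub-page is exactly the situation Don describes: the sub-page's content is sitting in the lobby.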
2. How to optimize (fix it). To fix this you'll want to structure your home page to be about your business and services: high-level overviews of what you offer, with strong brand placement.
The purpose of the homepage is to make it clear what you do, how you do it, why people love it, and in general more about your brand. The purpose of your inner pages is to detail the particular service or product completely, with calls to action to convert interest into money.
I would remove the internal page content from the homepage and focus again on an overview. Essentially, cut out everything between the Hunting & Gathering Club and the call to action "There's a Southern Whitewater Experience for everyone". Beef up the links to the internal pages by making them look more attractive and obvious.
There are other things that may need to be addressed as well, but I hope this answers your primary question.
Don
-
Your question is kind of confusing to me, but if you are asking about having all internal links on the home page: this is fine. I took a quick look at your website, and from an indexing point of view it looks just fine to me!
However, your URLs are very much over-optimized; it will not take more than a second for someone to realize that you are pushing SEO here. I don't think this is a very good idea, so my advice is to clean up your URLs and make them SEO-friendly instead of inserting all your keywords into them.
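To illustrate Moosa's point about stripping the URLs back: a keyword-stuffed slug like the one in the question can be trimmed to its leading words. This is a hypothetical sketch; `clean_slug` and the word cutoff are made up for illustration, and in practice you would choose the new slug by hand and 301 the old URL to it.

```python
import re

def clean_slug(slug, max_words=5):
    """Trim a keyword-stuffed URL slug down to its leading words.

    Splits on hyphens/underscores, lowercases, and keeps only the
    first few words so the slug describes the page without stuffing
    every service keyword into it.
    """
    words = [w for w in re.split(r"[-_]+", slug.lower()) if w]
    return "-".join(words[:max_words])

# The long slug from the question, trimmed to its first three words:
print(clean_slug("nz-adventure-tours-whitewater-river-rafting-hunting-fishing", 3))
# -> nz-adventure-tours
```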
If you are asking something different, I would love it if you could elaborate on your question a bit more.
Hope this helps!
Related Questions
-
Do uncrawled but indexed pages affect seo?
It's a well-known fact that too much thin content can hurt your SEO, but what about when you disallow Google from crawling some places and it indexes some of them anyway (no title, no description, just the link)? I am building a Shopify store, and it's impossible to change the robots.txt using Shopify; they disallow, for example, the cart: Disallow: /cart. But all my pages link there, so Google has the uncrawled cart in its index, along with many other uncrawled URLs. Can this hurt my SEO, or is trying to remove that from their index just a waste of time? -I can't change anything in the robots.txt -I could try to nofollow those internal links. What do you think?
Intermediate & Advanced SEO | | cuarto7150 -
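The Disallow: /cart situation described above can be reproduced with Python's standard urllib.robotparser, which applies the same rules a well-behaved crawler would; the store URL below is a placeholder. One detail worth noting: because the page can't be crawled, Google never sees any noindex tag on it, which is exactly why a blocked-but-linked URL can sit in the index as a bare link.

```python
# Simulate the Shopify robots.txt: /cart is disallowed for all bots.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the * group: the cart is blocked from crawling,
# but the URL itself can still be indexed if other pages link to it.
print(rp.can_fetch("Googlebot", "https://example-store.com/cart"))      # False
print(rp.can_fetch("Googlebot", "https://example-store.com/products"))  # True
```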
301 migration - Indexed Pages rising on old site
Hello, We did a 301 redirect from site A to site B back in March. I check the index count daily using the query "site:sitename". The past couple of days, the indexed page count for the old domain (the one that was 301 redirected) has been rising, which is really concerning. After the 301 redirect back in March 2016, the indexed count went from 400k pages down to 78k. However, over the past 3 days it went from 78k to 89,500, and I'm worried the number will continue to rise. My question: how would you investigate this issue? Would you use Screaming Frog to look at the redirects? Or is this a unique scenario that calls for other steps/procedures?
Intermediate & Advanced SEO | | ggpaul5620 -
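One way to start investigating a migration like the one above is to spot-check a sample of old URLs and confirm each still returns a single 301 pointing at the new domain. A minimal standard-library sketch; the `NoRedirect` handler and any URLs you pass in are illustrative, not a specific tool's API.

```python
# Spot-check redirect status codes without following the redirect,
# so we can see the raw 301/302 and its Location header.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Return (status_code, location_header_or_None) for a URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None          # no redirect (e.g. 200)
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# Usage (placeholder URL): a healthy migration should give
#   check_redirect("http://old-site.example/some-page")
# -> (301, "http://new-site.example/some-page")
```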
Why does Google rank a product page rather than a category page?
Hi, everybody! In the Moz ranking tool account for one of our clients (the client sells sports equipment), there is a trend where more and more of their landing pages are product pages instead of category pages. The optimal landing page for the term "sleeping bag" is of course the sleeping bag category page, but Google is sending searchers to a product page for a specific sleeping bag. What could be the critical factors that make the product page more relevant than the category page as the landing page?
Intermediate & Advanced SEO | | Inevo0 -
Old pages STILL indexed...
Our new website has been live for around 3 months and the URL structure has completely changed. We weren't able to dynamically create 301 redirects for over 5,000 of our products because of how different the URLs were, so we've been redirecting them as and when. 3 months on, and we're still getting hundreds of 404 errors daily in our Webmaster Tools account. I've checked the server logs and it looks like Bingbot still wants to crawl our old /product/ URLs. Also, if I perform a "site:example.co.uk/product" search on Google or Bing, lots of results are still returned, indicating that both still haven't dropped them from their index. Should I ignore the 404 errors and continue to wait for them to drop off, or should I just block /product/ in my robots.txt? After 3 months I'd have thought they'd have naturally dropped off by now! I'm half-debating this:

User-agent: *
Disallow: /some-directory-for-all/*

User-agent: Bingbot
User-agent: MSNBot
Disallow: /product/

Sitemap: http://www.example.co.uk/sitemap.xml

Intermediate & Advanced SEO | | LiamMcArthur0
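If you do go the robots.txt route drafted in the question, you can sanity-check the per-agent rules with Python's urllib.robotparser before deploying (a hedged sketch; note that Python's parser ignores `*` wildcards in paths, so only the plain-prefix /product/ rule is meaningfully testable here). One caveat worth weighing: blocking /product/ stops the bots from recrawling those URLs and seeing the 404s, so the old pages may linger in the index even longer rather than dropping out.

```python
# Verify which bots the proposed /product/ block actually applies to.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Bingbot
User-agent: MSNBot
Disallow: /product/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Bingbot is named in the group, so it is blocked; Googlebot matches no
# group and is therefore still allowed to crawl the old URLs.
print(rp.can_fetch("Bingbot", "http://www.example.co.uk/product/old-item"))    # False
print(rp.can_fetch("Googlebot", "http://www.example.co.uk/product/old-item"))  # True
```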
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
Hi guys, About us: We are running an AngularJS SPA for property search. Being an SPA, an entirely JavaScript application, it has proven to be an SEO nightmare, as you can imagine. We are currently implementing the _escaped_fragment_ approach and serving a pre-rendered version using PhantomJS. Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.

The problem: When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index. Not lower in the rankings, but totally dropped. Even the Google cache returns a 404.

The questions:
1.) Could this be because of serving an "_escaped_fragment_" version to the bots (bearing in mind it is identical to the user-visible one)?
2.) Could using an API to get our results lead to it being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop?
3.) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way?

Thank you very much! Pavel Velinov, SEO at Plentific.com
Intermediate & Advanced SEO | | emre.kazan1
Removing pages from index
My client is running 4 websites on the ModX CMS, using the same database for all the sites. Roger has discovered that one of the sites has 2,050 302 redirects pointing to the client's other sites. The sitemap for the site in question includes 860 pages. Google Webmaster Tools has indexed 540 pages. Roger has discovered 5,200 pages, and a site: query on Google reveals 7,200 pages. Diving into the SERP results, many of the indexed pages point to the other 3 sites. I believe there is a configuration problem with this site, because the other sites do not show a huge volume of redirects when crawled. My concern is: how can we remove from Google's index the 2,050 pages that are redirecting to the other sites via a 302 redirect?
Intermediate & Advanced SEO | | tinbum0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, because of:

- Google's guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages (probably) stealing rankings from our real landing pages
- The Webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results

I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | | HrThomsen0 -
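A detail worth stressing about the question above: a meta noindex only works if Google can crawl the page to see it, so combining it with a robots.txt Disallow defeats the tag. A small illustrative checker for the tag (the `MetaRobotsParser` class and `is_noindexed` helper are made up for this sketch):

```python
# Detect a meta robots noindex directive in a page's HTML.
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            content = a.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_noindexed(html):
    """True if the HTML carries a meta robots noindex directive."""
    parser = MetaRobotsParser()
    parser.feed(html)
    return "noindex" in parser.directives

print(is_noindexed('<head><meta name="robots" content="noindex,follow"></head>'))  # True
```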
Rankings for Home vs. Internal Pages - Potential 301?
Hi everyone: A site I'm working with was until recently ranking on page 1 for its primary keyword. Over the last month, they've dropped to page 4. One thing we've noticed is that the page that ranks is an internal page (http://www.example.com/keyword-string), and at this point everything ranking above us is ranking on the root domain (http://www.competitor.com). We've eliminated Panda, penalties, and any other obvious causes for the drop in rankings. We have similar or better PageRank, external links, domain trust, etc. compared to the sites still ranking on page 1, so we think this may be part of our problem. Has anyone else dealt with this? What did you do to change it, and how did it work? We're considering eliminating the existing internal page and 301'ing it to the home page. The keyword in question is the core of the business, so this is a natural change, but we're loath to lose years of investment in promoting the internal page. Also, the site was originally optimized with the primary keyword throughout (it appears in META tags and headers on multiple pages). How important is it to clear these out to make Google see the home page as most relevant? Thanks!!
Intermediate & Advanced SEO | | kdcomms0