Importance of an optimized home page (index)
-
I'm helping a client redesign their website, and they want a home page that's primarily graphics and/or Flash (or jQuery). If they are able to optimize all of their key sub-pages, what is the harm in terms of SEO?
-
No, that's not correct. Think about the way link juice flows as well. Remember that it is the professionals who say neither of these practices is good. If you want a strong SEO position, you need to avoid them.
-
Just from my own sites...
Most profitable can be one of three things....
A) Highest sales volume
B) Best profit margin
C) Items that are easy to obtain, inventory and deliver (low labor)
If I had a client, I would talk to her and ask what she can sell a lot of and what she enjoys moving.
Where you can realistically win SERP positions, based on competition levels, is also a part of this.
-
If I'm hearing you correctly, you are saying that, while not recommended, it's possible to have a good SEO program for deep pages without optimizing the home page. Is that accurate?
-
Thank you, EGOL. How do you determine which keywords are most profitable? Is it something that the client figures out over time, or is there a way to determine "profitability" before the SEO process begins?
-
As EGOL and iNet state, using Flash for a homepage isn't a great idea.
You will need links outside of the Flash to ensure crawlers find deeper pages.
If you do optimise sub-level pages over the homepage, then a fancy homepage Flash piece becomes inert, as most visitors are going to land on sub-pages, bypassing the homepage.
Include Flash by all means, but make it a page element, not the entire page.
jQuery, on the other hand, is awesome and, if done correctly, will not hinder your SEO.
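A minimal sketch of that layout (hypothetical markup, not any particular client's code): the Flash piece sits on the page as one element, while the heading, copy and navigation stay as plain, crawlable HTML.

```html
<!-- Hypothetical example: Flash as one element on an otherwise plain HTML page -->
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Example Widgets | Hand-made widgets delivered nationwide</title>
  <meta name="description" content="Short, keyword-focused description of the business.">
</head>
<body>
  <h1>Hand-made widgets delivered nationwide</h1>

  <!-- The Flash piece is embedded as a single element, not the whole page -->
  <object type="application/x-shockwave-flash" data="/media/showcase.swf" width="960" height="300">
    <!-- Fallback for crawlers and visitors without the plug-in -->
    <img src="/media/showcase.jpg" alt="Featured widget collection">
  </object>

  <!-- Plain HTML links so crawlers (and visitors) can reach the deeper pages -->
  <ul>
    <li><a href="/widgets/blue/">Blue widgets</a></li>
    <li><a href="/widgets/red/">Red widgets</a></li>
    <li><a href="/about/">About us</a></li>
  </ul>

  <p>Optimised introductory copy about the company and its key products.</p>
</body>
</html>
```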
-
Google will tell you not to build sites that are image-heavy or that bury content where it cannot easily be crawled or read.
Steer well clear of both of these approaches, as neither will benefit your customer. Also remember that Google uses page load speed as a ranking metric.
Regards,
Andy
-
For many sites the homepage brings in more traffic than any other page. Also the homepage is usually the strongest page on a site and that gives it the ability to attack the most difficult (and often most profitable) keywords. Not using the homepage would be like not using your best sword.
Maybe there is a way that you can include Flash elements on the page and still have space left for optimized text?
Or, better, have a homepage that has a number of enticing links that will pull the visitor deeper into the site and engage them with more information.
-
Related Questions
-
An informational product page AND a shop page (for same brand)
Hi all, This is my first foray into e-commerce SEO. I'm working with a new client who sells upscale eBikes online. Since his products are expensive, he wants to have informational pages about the brands he sells, e.g. www.example.com/brand. However, these brands are also category pages for his online shop, e.g. www.example.com/shop/brand. I'm worried about keyword cannibalization and about adding an extra step/click to get to the shop (right now the navigational menu takes you to the information page, and from there you have to click through to the shop). I'm pretty sure it would make more sense to have ONE killer shopping page that includes all the brand information, but I want to be 100% sure before I advise him to take this big step. Thoughts?
Technical SEO | | MouthyPR1 -
Home has DA 50 but Subpages have Page Authority of 1
Hello, we already asked this, but there was no answer. We would be happy for any information. How could it be that our subpages all have a PA of 1 when the home page has a DA of 50? Technical specifics: the mega menu opens on click only; category pages don't exist (home/i-do-not-exist-as-page-category/PA-1-subpage); all subpages have a high number of links to resources (over 200); the subpages are crawled and have been online for some time. What would be the most obvious cause of the low PA? Would the external link profile be the main reason? Thanks in advance. I would be happy to answer your questions.
Technical SEO | | brainfruit0 -
Indexed pages
Just started a site audit and trying to determine the number of pages on a client site and whether there are more pages being indexed than actually exist. I've used four tools and got four very different answers...
Google Search Console: 237 indexed pages
Google search using the site: command: 468 results
Moz site crawl: 1,013 unique URLs
Screaming Frog: 183 page titles, 187 URIs (note this is a free licence, but it should cut off at 500)
Can anyone shed any light on why they differ so much? And where lies the truth?
Technical SEO | | muzzmoz1 -
Need Help On Proper Steps to Take To De-Index Our Search Results Pages
So, I have finally decided to remove our search results pages from Google. This is a big dealio, but our traffic has been declining consistently since 2012 and it's the only thing I can think of. The reason they got indexed is that back in 2012 we put linked tags on our product pages, but they linked to our search results pages, so over time we had hundreds of thousands of search results pages indexed. By tag pages I mean keywords like Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies, each linked to our search results pages, i.e. http://oursite.com/Search.html?text=Kitten-Doggies. I really think these indexed pages are causing much of our traffic problems, as there are many more search pages indexed than actual product pages. So, my question is: should I go ahead and remove the links/tags on the product pages first? Or, if I remove those, will Google then not be able to re-crawl all of the search results pages it has indexed? Or, if those links are gone, will it notice that they are gone and therefore remove the search results pages they were previously pointing to? In other words, should I remove the links/tags from the product pages (or at least cut them down to the top 8 or so) and add noindex, nofollow to all the search results pages at the same time? Or should I first noindex, nofollow ALL the search results pages and leave the tags on the product pages, to give Google a chance to follow those tags back to all of the search results pages so it can see the noindex, nofollow on them? Otherwise, will Google not be able to find these pages? Can someone comment on which would be the best, safest, and fastest route? Thanks so much for any help you might offer me!! Craig
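For reference, a hedged sketch of what the noindex step usually looks like (hypothetical template code, assuming the search results pages share one template). The pages need to stay crawlable (not blocked in robots.txt) until Google has revisited them and seen the tag.

```html
<!-- Hypothetical template snippet: a robots meta tag on every search results
     page (e.g. /Search.html?text=Kitten-Doggies). "noindex" asks Google to
     drop the page from its index once it is recrawled; pairing it with
     "follow" or "nofollow" is a separate choice. -->
<head>
  <title>Search results for "Kitten-Doggies"</title>
  <meta name="robots" content="noindex, follow">
</head>
```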
Technical SEO | | TheCraig0 -
Why are only a few of our pages being indexed
Recently rebuilt a site for an auctioneers; however, it has a problem in that none of the lots and auctions are being indexed by Google on the new site, only pages like About, FAQ, Home and Contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so no crawling issue stands out. I've also set the "URL Parameters", to no effect. I built a sitemap with all the lots in and pushed it to Google, which then crawled them all (massive spike in crawl rate for a couple of days), yet it is still indexing just a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | | Blue-shark0 -
Getting home page content at top of what robots see
When I click on the text-only cache of nlpca(dot)com on the home page http://webcache.googleusercontent.com/search?q=cache:UIJER7OJFzYJ:www.nlpca.com/&hl=en&gl=us&strip=1 our H1 and body content are at the very bottom. How do we get the h1 and content at the top of what the robots see? Thanks!
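One common way to approach this, sketched with hypothetical markup (not nlpca.com's actual code): put the main content block first in the HTML source, so a text-only crawler meets the H1 and body copy before the navigation, and let CSS handle where the sidebar sits visually.

```html
<!-- Hypothetical source order: content first, sidebar second -->
<body>
  <div id="content">  <!-- first in the source: what robots should see first -->
    <h1>Primary heading for the page</h1>
    <p>Main body copy here...</p>
  </div>

  <div id="sidebar">  <!-- later in the source; CSS (e.g. floats or flexbox order)
                           repositions it visually without changing the HTML order -->
    ...navigation links, widgets...
  </div>
</body>
```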
Technical SEO | | BobGW0 -
Advice on importing content please to keep page fresh
Hi, I am working on a site at the moment, http://www.cheapflightsgatwick.com, which is a travel news and holiday news site, but I am trying to find out what the best reference content is for the following section: http://www.cheapflightsgatwick.com/humberside-airport. What I am thinking of doing to keep Google coming back to my section of the site is to import content; what I mean is, to use Google for keywords such as Humberside Airport and have those stories appear on this page as well as writing my own content. I was thinking of importing content because it would help keep the page fresh without my original content becoming too old, but I am concerned about this and not sure if it is the right thing to do. I am concerned that if I do this my rankings would drop, because I am importing content and, for people to read the rest of the story, they will have to leave my site. I am also worried that Google will deduct points for duplicate content. Can anyone let me know what I should do: whether I should just stick to original content and not import it, or import it? And if I should import it using Google News, how would I do this? Many thanks
Technical SEO | | ClaireH-1848860 -
GWT indexing wrong pages
Hi SEOmoz, I have a listings site. In one part of the page, I have 3 comboboxes, for state, county and city. On the change event, the JavaScript redirects the user to the page of the selected location. Parameters are passed via GET, and my URL is rewritten via .htaccess. Example: http://www.site.com/state/county/city.html The problem is, there are a LOT (more than 10k) of 404 errors. It is happening because the crawler is trying to index the pages, sometimes WITHOUT a parameter, like http://www.site.com/state//city.html I don't know how to stop it, and I don't want to remove the feature, since it gets a lot of clicks from users. What should I do?
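One pattern sometimes used here, purely as an illustration (hypothetical URLs and markup, not the actual site): keep the comboboxes for visitors, but also expose the same destinations as plain <a> links, so crawlers get complete URLs instead of half-built ones like /state//city.html.

```html
<!-- Hypothetical markup: dropdown navigation for users, plain links for crawlers -->
<select id="city-select" onchange="if (this.value) location.href = this.value;">
  <option value="">Choose a city…</option>
  <option value="/new-york/kings/brooklyn.html">Brooklyn</option>
  <option value="/new-york/queens/flushing.html">Flushing</option>
</select>

<!-- Crawlable equivalents of the dropdown destinations -->
<ul>
  <li><a href="/new-york/kings/brooklyn.html">Brooklyn</a></li>
  <li><a href="/new-york/queens/flushing.html">Flushing</a></li>
</ul>
```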
Technical SEO | | elias990