Does the home page carry more SEO benefit than other pages?
-
Hi,
I would like to include my keywords in the URL, and they are under 50 characters.
Is there anything in the algorithm that tells engines to give more importance to the homepage?
-
For most businesses, the homepage is the page that gets linked to the most. Because it has the most backlinks leading to it, it's often the page that shows up most in the results.
For example, we run a business that sells garden rooms in South West England, and there are many variations of the same building that we build; they're called "Garden Bars", "Summerhouses", and "Garden Gyms", and we have different pages for every single product we sell.
Still, it's the home page which most often appears on Google, simply because we have so many do-follow quality links leading to that page. The organic SEO is improved for that page, so it gets more organic visitors.
-
Hi there,
In most cases, your home page has more SEO value due to the number of external links pointing to it. It is normal for other websites to link to your home page instead of your inner pages. I would definitely include your keywords in the inner URL, but you should not make the URL very long.
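To make that concrete, a keyword slug might be built like this. This is a rough, hypothetical sketch; the 50-character cap echoes the limit mentioned in the question, not an official rule, and the example phrase is taken from the answer above.

```javascript
// Hypothetical sketch: turn a keyword phrase into a short URL slug.
// The 50-character cap mirrors the question, not any official limit.
function toSlug(keywords, maxLength = 50) {
  return keywords
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "") // strip punctuation
    .trim()
    .replace(/\s+/g, "-")         // spaces -> hyphens
    .slice(0, maxLength)
    .replace(/-+$/, "");          // no dangling hyphen after the cut
}

console.log(toSlug("Garden Bars in South West England"));
// -> "garden-bars-in-south-west-england"
```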
Ross
Related Questions
-
[SEO] Star Ratings -> Review -> Category Page
Hello there, Basically, if you put non-natural star ratings on the category page, like in the attached images, you will get a manual ban from Google, right? (I know it for sure, because I had clients in this situation.) The real question is: if I put a form that allows users to write a review about the category products on the category page, for REAL, will Google still ban? Any advice? Any example? (One possible markup sketch is below.) With respect, Andrei
White Hat / Black Hat SEO | | Shanaki -
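For reference, genuine user reviews are usually exposed to search engines as schema.org structured data. Below is a minimal, hypothetical sketch (the product name and review data are invented) of building AggregateRating JSON-LD from real submitted reviews. Note that Google's review guidelines expect the rating to describe a specific item, which is one reason star ratings stamped across category pages attract manual actions.

```javascript
// Hypothetical sketch: emit AggregateRating JSON-LD computed from real
// user-submitted reviews. Product name and review data are invented;
// property names follow schema.org.
const reviews = [
  { author: "Jane", rating: 4 },
  { author: "Tom", rating: 5 },
];

const average =
  reviews.reduce((sum, r) => sum + r.rating, 0) / reviews.length;

const structuredData = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Product", // placeholder
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: average.toFixed(1),
    reviewCount: reviews.length,
  },
};

// Inject the JSON-LD block into the page head.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(structuredData);
document.head.appendChild(script);
```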
Active Rain and SEO
I have been an Active Rain member for a long time. When I check my website, I cannot find any links from Active Rain. I just updated my Active Rain profile and upgraded to their paid subscription. Can you tell me if this blog is creating a followed link back to my website at www.RealEstatemarketLeaders.com? The blog on Active Rain is here: http://activerain.trulia.com/blogsview/4529309/hud-homes-for-sale-in-tri-cities-wa (one way to check is sketched below).
White Hat / Black Hat SEO | | Brandon_Patton -
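One quick way to answer this yourself is to fetch the post and look at the rel attribute on any anchor pointing at your domain. A rough sketch follows (Node 18+ for the global fetch; the URLs come straight from the question, and a real crawler would use an HTML parser rather than a regex).

```javascript
// Hypothetical sketch: fetch the Active Rain post and report whether
// links to the target domain carry rel="nofollow". Node 18+ (global fetch).
const postUrl =
  "http://activerain.trulia.com/blogsview/4529309/hud-homes-for-sale-in-tri-cities-wa";
const targetDomain = "realestatemarketleaders.com";

async function checkLinks() {
  const html = await (await fetch(postUrl)).text();

  // Crude anchor scan; an HTML parser would be more robust.
  const anchors = html.match(/<a\b[^>]*>/gi) || [];
  for (const tag of anchors) {
    if (tag.toLowerCase().includes(targetDomain)) {
      const nofollow = /rel=["'][^"']*nofollow/i.test(tag);
      console.log(nofollow ? "nofollow:" : "followed:", tag);
    }
  }
}

checkLinks().catch(console.error);
```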
Is Yahoo! Directory still a beneficial SEO tactic?
For obvious reasons, we have submitted our clients to high-authority directories such as Yahoo! Directory and Business.com. However, with all of the algorithm updates lately, we've tried to cut back on the paid directories that we submit our clients to. Having said that, my question is: is Yahoo! Directory still a beneficial SEO tactic? Or are paid directories, with the exception of BBB.com, a bad SEO tactic?
White Hat / Black Hat SEO | | MountainMedia -
How/why is this page allowed to get away with this?
I was doing some research on a competitor's backlinks in Open Site Explorer and I noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited that page and found that this page, carrying a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page - 618 links. Zero nofollow tags. PR7. On top of that, there's a link at the top right corner that says "Want to Join?" which shows requirements to get your link on that page. One of these is to create a reciprocal link from your site back to theirs.
I'm one of those white-hat SEOs who actually listens to Matt Cutts and the more recent stuff from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and if you're going to do it, use a nofollow tag. I've read that pages, or directories, such as these are being penalized by Google, and possibly the websites with links to the page could be penalized as well. I've read that exact websites such as these are getting deindexed in bunches over the past couple of years.
My real question is: how is this page allowed to get away with this? And how is it rewarded with such high PageRank? There's zero content aside from 618 links, all followed. Is this just a case of "Google just hasn't gotten around to finding and penalizing this site yet," or am I just naive enough to actually listen to and believe anything that comes out of Matt Cutts' videos?
White Hat / Black Hat SEO | | Millermore -
Linking my pages
Hello everybody, I have a small dilemma and I am not sure what to do. I am (my company) the owner of 10 e-commerce websites. On every site I have a link to the other 9 sites, and I am using an exact keyword (not the shop name). Since the web stores are big and have over 1,000 pages, this means that all my sites have a lot of inbound links (compared with my competition). I am worried that linking them all together could be bad from Google's point of view. Can this cause a problem for me? Should I change it? Regards, Marko
White Hat / Black Hat SEO | | Spletnafuzija -
Purchasing Expired Domains for SEO Value?
While doing competitive research for a client, I stumbled on a "site developed by" footer link for a fairly established business that points to an expired domain. I'm inclined to notify the business in question that the link is expired, BUT I was curious to get some thoughts on whether purchasing this domain and redirecting it to my site or another would be a good purely "SEO tactic," as it would seemingly pass "juice" (a redirect sketch is below). Thanks, Dave
White Hat / Black Hat SEO | | DavidGadarian -
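If you did buy it, the mechanics are simple; whether the "juice" actually survives the redirect is the debatable part, especially after the Penguin updates discussed elsewhere in this thread list. A hypothetical sketch using Express, with placeholder domain names:

```javascript
// Hypothetical sketch: 301-redirect every path on an acquired expired
// domain to the matching path on your own site. Placeholder domains.
const express = require("express");
const app = express();

app.use((req, res) => {
  // 301 = permanent; preserving path and query keeps deep links intact.
  res.redirect(301, "https://www.example.com" + req.originalUrl);
});

app.listen(80, () => console.log("redirect server up"));
```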
Press Releases and SEO in 2013
Mozers, A few questions for the community:
1. Distributing a press release through a service like 24-7pressrelease.com - is it a serious duplicate content issue when an identical press release is distributed to multiple sites with no canonical markup (as far as I can tell)? All of the backlinks in the press release are either nofollow or redirects.
2. If there IS a duplicate content issue, will the website be affected negatively given the numerous Panda and Penguin refreshes?
3. Why SHOULDN'T a company issue a press release to multiple sites if it actually has something legitimate to announce and the readership of a given site is the target demographic? For example, why shouldn't a company that manufactures nutritional health supplements issue the same press release to Healthy Living, Lifestyle, Health News, etc., with a link to the site? I understand it's a method that can be exploited for SEO purposes, but can't all SEO methods be taken to an extreme?
Seems to me that if this press release scenario triggers the duplicate content and/or link spam penalty(ies), I'd consider it a slight deficiency of Google's search algorithm. Any insight would be much appreciated (a canonical-markup sketch is below). Thanks.
White Hat / Black Hat SEO | | b4004040 -
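On the canonical point: when you control the page, or a distribution partner will honor it, cross-domain rel="canonical" is the usual remedy for syndicated copies; Google also accepts it as an HTTP Link header. A hypothetical Express sketch (route and URL are placeholders):

```javascript
// Hypothetical sketch: point crawlers at the original press release via
// the HTTP Link header (rel="canonical" in the HTML head also works).
// The URL and route are placeholders.
const express = require("express");
const app = express();

app.get("/press/announcement", (req, res) => {
  res.set(
    "Link",
    '<https://www.example.com/press/announcement>; rel="canonical"'
  );
  res.send("<html><body>Syndicated press release body</body></html>");
});

app.listen(3000);
```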
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that takes the presentation layer away from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".
Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would be happening on the back end of our site is that we would be detecting the user-agent of all traffic, and once we found a search bot, we would serve our web pages server-side instead of client-side so the bots can index our site (a rough sketch of this is below). Server-side and client-side will be identical content and there will be NO black-hat cloaking going on. The content will be identical. But this technique is cloaking, right?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | | Bodybuilding.com
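For the mechanics being described, here is a minimal, hypothetical sketch (Express plus the dustjs-linkedin package; the template, route, and data are invented placeholders). It only shows the user-agent switch; whether that counts as cloaking when the rendered content is identical is exactly the judgment call the question raises.

```javascript
// Hypothetical sketch: render the same Dust template server-side for
// crawlers and ship data for client-side rendering to everyone else.
// Express + dustjs-linkedin; template, route, and data are placeholders.
const express = require("express");
const dust = require("dustjs-linkedin");

// Compile and register a template once at startup.
dust.loadSource(dust.compile("<h1>{title}</h1><p>{body}</p>", "product"));

const app = express();
const BOT_PATTERN = /googlebot|bingbot|slurp/i; // crude crawler check

app.get("/product", (req, res) => {
  const data = { title: "Example Product", body: "Product copy here." };

  if (BOT_PATTERN.test(req.headers["user-agent"] || "")) {
    // Crawler: render the Dust template on the server.
    dust.render("product", data, (err, html) => {
      if (err) return res.status(500).end();
      res.send(html);
    });
  } else {
    // Browser: send the data and let dust.js render the same template
    // client-side (client bootstrap omitted), so the content matches.
    res.send(
      `<div id="app"></div><script>var DATA=${JSON.stringify(data)};</script>`
    );
  }
});

app.listen(3000);
```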