Linking Back to the Same Page
-
What are other SEOs' opinions on linking a page to itself with the very keyword you are targeting? Let's take an example like trampolines.
Say we have an online shop selling trampolines. Would you consider it a good or bad thing to link the keyword "trampolines" from the homepage back to the homepage, almost creating a loop?
Some SEOs say yes, some say no.
-
I don't see any reason why you would do this unless you are trying to manipulate the search engines. I would expect that Google and Bing would eventually find a way to ignore these links if they haven't done so already.
-
Some SEOs say yes, as they believe that any anchor-text linking is beneficial, but like Acorn-Internet said, value to the user should be #1: if there is value to a user, then there is value to search engines (most of the time).
I would suggest against it, as Google seems to be on a quality rampage right now, so questionable tactics that have negligible SEO benefit (such as this) should be avoided, in my opinion.
w00t!
-
The key question to ask yourself is always: 'Will this help my users/customers?'
I do know some people who try this, but in my opinion there's no benefit.
Related Questions
-
Can Javascript Links Be Used to Reduce Links per Page?
We are trying to reduce the number of links per page, so for the low-value footer links we are considering coding them as JavaScript links. We realize Google can read JavaScript, but the goal is to reduce the level of importance assigned to those internal links. Would this be a valid approach? So the question is: would converting low-value footer links to JS like the example below help reduce the number of links per page in Google's eyes, even though we're reasonably sure they can read JavaScript?
<a href="javascript:void(0);" data-footer-link="/about/about">About Us</a>
On-Page Optimization | Jay-T
-
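As a sketch of the idea (the data attribute name and click handler below are illustrative assumptions, not a known implementation), the footer link would carry no crawlable href and a small script would handle navigation:

```html
<!-- Hypothetical sketch: the anchor has no crawlable destination;
     a script turns the data attribute into navigation on click. -->
<a href="javascript:void(0);" data-footer-link="/about/about">About Us</a>
<script>
  document.querySelectorAll('a[data-footer-link]').forEach(function (link) {
    link.addEventListener('click', function () {
      window.location.href = link.getAttribute('data-footer-link');
    });
  });
</script>
```

Bear in mind that crawlers which execute JavaScript may still discover these destinations, so there is no guarantee they are weighted differently.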
Too many on page links - created by filters
I have an ecommerce site, and SEOmoz's "Crawl Diagnostics Summary" points out that I have too many hyperlinks on most of my pages. The most recent change that could be the culprit is the creation of a number of product filters. Each filter I put on a page creates a hyperlink off that page. For example, there's a filter for manufacturers; under it there are 8 new filter links, thus 8 new hyperlinks. On one category, 60 new links are created by filters. I feel these filters have made the user experience on the site better, BUT they have dramatically increased the number of links off the page. I know keeping it under 100 is a rule of thumb, but at the same time there must be some validity to trying to limit them. Do you have any recommendation on how I can "have my cake and eat it too"? Thanks for any help!
On-Page Optimization | jake372
-
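One commonly suggested compromise (a sketch only; the category path and parameter names are made up) is to keep the filter links for users but mark them nofollow:

```html
<!-- Illustrative filter links; URLs and brand names are hypothetical -->
<a href="/trampolines?brand=acme" rel="nofollow">Acme</a>
<a href="/trampolines?brand=bounceco" rel="nofollow">BounceCo</a>
```

Another common option is a canonical tag on the filtered URLs pointing back at the main category page, so the filter variations don't compete as separate pages.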
On page link question, creating an additional 'county' layer between states and zips/cities
Question: We have a large site that has a page for each of the 50 states. Each of these pages has unique content, but following the content is a MASSIVE amount of links, one for every zip AND city in that state. I am also in the process of creating unique content for each of these cities and zips. HOWEVER, I was wondering: would it make sense to create an additional 'county' layer between the states and the zips/cities? Would the additional 'depth' of the links bring down the overall rank of the long-tail city and zip pages, or would the fact that the counties knock the on-page link count down from a thousand or so to a manageable 50-100 substantially improve the overall quality and ranking of the site? To illustrate: currently I have state -> city and zip pages (1,200+ links on each state page); what I want to do is state -> county (5-300 counties on each state page) -> city + zip (maybe 50-100 links on each county page). What do you guys think? Am I incurring some kind of automatic penalty for having 1,000+ links on a page?
On-Page Optimization | ilyaelbert
-
What are all those meta name= and link rel= on the cnn home page source?
I usually use the description, title, and keywords tags. I keep seeing meta name="classification" or "distribution", and also link rel="stylesheet" and "pingback", etc. Please tell me how important these are for SEO. It would be great to be pointed to the right page. Also, is there a WordPress plugin to just fill these in and have them populated on the front end? Thank you
On-Page Optimization | waspmobile
-
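For reference, the tags being asked about look roughly like this (the values are illustrative):

```html
<meta name="classification" content="news">   <!-- legacy tag, largely ignored by major search engines -->
<meta name="distribution" content="global">   <!-- legacy tag, largely ignored by major search engines -->
<link rel="stylesheet" href="/css/site.css">  <!-- loads CSS; no direct SEO role -->
<link rel="pingback" href="https://example.com/xmlrpc.php"> <!-- blog pingback endpoint (WordPress adds this) -->
```

The classification and distribution metas date from early search engines and carry no known ranking weight; the stylesheet and pingback links are functional, not SEO signals.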
No index parts of a page?
A little bit of an odd question this, but how would one go about getting Google not to index certain content on a page? I'm developing an online store for a client, and for a few of the products they will be stocking, they will be using the manufacturers' specs and descriptions. These descriptions and specs, therefore, will not be unique, as they will also be used by a number of other websites. The title tag, on-page h1, etc. will be fine for the SEO of the actual pages (with backlinks, of course), so the impact of Google not counting the description should be slight. I'm sure this can be done, but for the life of me I cannot remember how. Thanks, Carl
On-Page Optimization | Grumpy_Carl
-
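One workaround sometimes suggested for this (a sketch, assuming you can serve the manufacturer copy from its own URL; the paths are hypothetical) is to load the duplicated description in an iframe whose source page carries a noindex:

```html
<!-- On the product page: the unique title, h1, and copy stay in the
     HTML; the boilerplate manufacturer spec is pulled in via iframe. -->
<iframe src="/specs/trampoline-model-x.html" title="Manufacturer specs"></iframe>

<!-- In the head of /specs/trampoline-model-x.html itself: -->
<meta name="robots" content="noindex">
```

There is no official directive for excluding a fragment of a single page from Google's web index, so approaches like this are workarounds rather than guarantees.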
Ranking for specific pages
Hi, let's say my website is abc.com and my targeted keyword for the index page is abc. Internal pages are like abc.com/apple.htm and abc.com/banana.htm. The targeted keywords for apple.htm are fresh apples and buy apples, and for banana.htm, fresh banana and buy banana. How do I define these keywords in the campaign? Please suggest. Thanks.
On-Page Optimization | younus
-
Third party pages
Suppose you are using a third party tool such as an affiliate program. Typically, all the files are organized under one subdirectory. In addition, you may have little or no ability to modify any of the files in terms of SEO. Would you recommend hiding the entire subdirectory with a noindex? Best,
Christopher
On-Page Optimization | ChristopherGlaeser
-
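A minimal sketch, assuming the affiliate files live under /affiliate/ (the path is made up), would block crawling of the whole subdirectory via robots.txt:

```
# robots.txt at the site root
User-agent: *
Disallow: /affiliate/
```

Note that robots.txt only blocks crawling; if the URLs are already known to search engines, a noindex sent via an X-Robots-Tag response header for that directory (configured at the server level, since the files themselves can't be edited) is what actually keeps them out of the index.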
Prevent link juice to flow on low-value pages
Hello there! Most websites have links to low-value pages in their main navigation (header or footer), thus available through every other page. I'm especially thinking of "Conditions of Use" or "Privacy Notice" pages, which have no value for SEO. What I would like is to prevent link juice from flowing into those pages, but still keep the links for visitors. What is the best way to achieve this? Put a rel="nofollow" attribute on those links? Put a "robots" meta tag containing "noindex,nofollow" on those pages? Put a "Disallow" for those pages in the robots.txt file? Use JavaScript links that crawlers won't be able to follow?
On-Page Optimization | jonigunneweg
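To make the first two of those options concrete (the paths are illustrative):

```html
<!-- Option 1: nofollow on the link itself -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag in the head of the low-value page -->
<meta name="robots" content="noindex,nofollow">
```

The robots.txt option would be a Disallow line for each such path. Be aware that nofollow does not redirect the link equity elsewhere; since Google's 2009 change, the equity a nofollowed link would have carried is simply not passed, so this is not an effective sculpting technique.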