Links to category pages unnatural?
-
If people are linking to your site, it would seem natural that the vast majority of those links would point to the homepage, a product page, or an article/content page.
Let's say you have 100 links pointing to your site, and 40 of them are pointing to category pages. Would this seem unnatural?
Does Google or other search engines have a way of determining this as a factor in ascertaining whether the links are natural or not?
Is there a rule of thumb when it comes to the pages that are linked to on your site?
-
Out of 100 links, 40 pointing to category pages looks OK to me.
Google is against manipulative link building practices and polices them (see the Penguin update), so if you want to do link building and not get caught, you need to build a very natural link profile:
- **Anchors:** Do not use the exact keyword you want to rank for as the anchor text; instead use longer phrases that include your keyword.
- **Variety:** Do not get only "keyword" links; URL and brand anchors are also important.
- **Nofollow:** It's very important to include nofollow links in your link profile.
- **Footer/sidebar:** Avoid links from footers and sidebars, especially if they are nofollow.
- **Variety of domains:** Get your links from different domains.
- **Avoid links from penalized websites:** It's also very important to check that the domains linking to your site are healthy.

For more complete information, check out this article: http://moz.com/ugc/category/link-building
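The checklist above can be sketched as a quick audit over a backlink export. This is a minimal illustration, not a real tool: the data format and the example links are made up, and real exports (e.g. from a link index) will have more fields.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical backlink export: (source_url, anchor_text, is_nofollow)
backlinks = [
    ("https://blog-a.example/post", "blue widgets", False),
    ("https://blog-b.example/review", "https://mysite.example", False),  # URL anchor
    ("https://forum-c.example/thread", "MySite", True),                  # brand anchor, nofollow
    ("https://blog-a.example/other", "blue widgets", False),
]

def audit(links, target_keyword):
    """Summarize a link profile along the dimensions in the checklist above."""
    anchors = Counter(anchor.lower() for _, anchor, _ in links)
    domains = {urlparse(src).netloc for src, _, _ in links}
    exact = anchors[target_keyword.lower()]
    nofollow = sum(1 for _, _, nf in links if nf)
    return {
        "exact_match_share": exact / len(links),    # high = over-optimized anchors
        "nofollow_share": nofollow / len(links),    # zero can look unnatural
        "unique_domain_share": len(domains) / len(links),  # low = few referring domains
    }

report = audit(backlinks, "blue widgets")
print(report)
```

There are no official thresholds for any of these ratios; the point is simply to spot a profile where one dimension (e.g. exact-match anchors) dominates.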
-
It doesn't really matter which pages of your site links point to, including category pages. There are a few things to watch out for regarding unnatural inbound links; they're listed below in order of priority, but any of them will cause you issues if you abuse it.
- The main one to watch out for is an over-optimized anchor text profile.
- The second is the type of inbound links you have. Are they all the same type, such as directory links or blog comments?
- The third is your inbound link velocity. Is that category page getting new inbound links at 5x the velocity of everyone else's sites in your niche?
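As a rough illustration of the velocity point, here is a minimal sketch comparing a page's recent rate of new inbound links to a niche baseline. All the numbers are made up, and "niche average" is something you would have to estimate from competitor data yourself.

```python
# Hypothetical monthly counts of NEW referring links for one page
new_links_per_month = [3, 4, 2, 5, 30, 42]  # sudden spike in the last two months

# Assumed niche baseline: average new links per month for comparable pages
niche_avg_per_month = 4.0

def velocity_ratio(monthly_counts, baseline, window=2):
    """Ratio of the page's recent link velocity to the niche baseline."""
    recent = monthly_counts[-window:]
    return (sum(recent) / len(recent)) / baseline

ratio = velocity_ratio(new_links_per_month, niche_avg_per_month)
print(f"velocity ratio: {ratio:.1f}x")
```

A ratio well above the niche norm (like the 5x mentioned above) is the kind of pattern that stands out, unless there's an obvious explanation such as a product launch.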
If all your links are from diverse and (reasonably) reputable sources, and your inbound anchor text profile isn't over-optimized, then you could have hundreds of links pointing to your category pages without an issue. If a big brand suddenly released a whole new range of merchandise, their category page(s) would acquire thousands of links without an issue.
Feel free to build links to any page you want to rank; just make sure they're good links. I also hope your category page's on-site SEO is up to scratch. Having all of your metadata, heading tags, and content in good shape will go a long way, and it'll help you get the most power out of your inbound links too. Hope this helps a little.