Too many on-page links
-
I'm having trouble interpreting this data. It says several of my blog pages have too many on-page links, some as high as 140, and there is no example of a blog post that they are referring to. What am I missing? I never post more than a handful (5-7) of links in our 600-1,000-word blogs. When I drill down, it doesn't give me much information except "Found over 41 years ago" off to the right. When I click on the "too many on page links" URL, it provides a long list of website pages that are renamed with the blog name. Huh?
A lot of this stuff isn't very intuitive, SEOMoz.
-
Tried to set up my campaign to look at the root domain, but it said that my website is set up as a subdomain and automatically converted this step of the process to "subdomain".
I believe this would be due to either the root domain being redirected to the subdomain, or there not being any content on the root domain address. If another mozzer does not offer a definitive answer, try contacting the SEOmoz help desk for more information.
When you say "choose" one version of my site, do you mean as far as what SEOMoz crawls? Or are you suggesting I make a change to the site itself?
The site itself. You have one website, and its content should only be available via one address. Think of it this way: you can create a website at the address "mysite.com". Next, you create another site at the URL "www.mysite.com" which is an exact duplicate of the "mysite.com" site. This is exactly what you have done. You could even repeat the process further and create subdomains such as "www1.mysite.com". Each subdomain is a duplicate of the main domain and causes confusion for users and search engines alike. Resolve this confusion: choose ONE way to present your site and remain consistent.
The blog is, I believe, set up as a subdomain (with www.blogs.aerohive.com) and it is hosted by a third-party.
The URL "blogs.aerohive.com" is indeed hosted elsewhere, but the URL "www.blogs.aerohive.com" is hosted at the same location as your main site. It is a mirror of your main site. Remove this subdomain.
I am trying to understand what was set up correctly or incorrectly within SEOmoz, and what I can fix there versus what I can fix on my website
Presently there are two major issues which need to be resolved. Both issues are with your website itself, not the SEOmoz tools. If you have managed hosting, the easiest step is to call or open a ticket with your hosting provider and make two requests:
1. Add a 301 redirect from all non-www traffic to its www equivalent
2. Delete the www.blogs.aerohive.com subdomain
You should be able to copy the above two requests and paste them into a ticket. Your hosting provider should completely understand what actions are necessary.
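For reference, request 1 can be sketched as an Apache .htaccess rule. This is a minimal sketch assuming the host runs Apache with mod_rewrite enabled; your hosting provider may implement the redirect differently, and they will know the right approach for their platform:

```apache
RewriteEngine On
# Request 1: 301 redirect any non-www request to its www equivalent
RewriteCond %{HTTP_HOST} ^aerohive\.com$ [NC]
RewriteRule ^(.*)$ http://www.aerohive.com/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what tells search engines to consolidate the two versions of the URL.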
-
Thanks Sha,
There seems to be a consensus on what the issue is, although I asked some extra questions in my reply to Ryan above. Kind of confusing. The blog should be set up as a subdomain, right?
Amanda
-
Thanks Brian - there seems to be a consensus on what the issue is, although I asked some extra questions in my reply to Ryan above. Kind of confusing.
Amanda
-
This was very helpful (as were the other two replies) so I am hoping you can help clarify how I can get this addressed. (going with list format to simplify):
1- I tried to set up my campaign to look at the root domain, but it said that my website is set up as a subdomain and automatically converted this step of the process to "subdomain".
2- SEOMoz Pro did tell me at some point in the setup process about what you mentioned above, which is that there are two versions of my site. This may have been at the setup point where it converted root domain to subdomain, though I can't recall.
3- When you say "choose" one version of my site, do you mean as far as what SEOMoz crawls? Or are you suggesting I make a change to the site itself?
4- The blog is, I believe, set up as a subdomain (with www.blogs.aerohive.com) and it is hosted by a third-party. I set up the dedicated blog campaign as a subdomain, but had some difficulty with it as well. Not sure if that was correctly set up or not.
Basically I am trying to understand what was set up correctly or incorrectly within SEOmoz, and what I can fix there versus what I can fix on my website, so that I can better analyze it here (and obviously get search engine approval as the ultimate goal).
Thanks,
Amanda
-
Hi Amanda,
I don't think the problem is with the tool. The tool is simply reporting what the crawler sees.
Since you have said "an actual blog page, which I am attempting to analyze as a dedicated campaign for my blog" I am thinking that you have set up a separate campaign for what you think of as the "actual blog" and designated the campaign as a subdomain?
If this is the case, then presumably you also have a campaign set up for what you think of as the main site. So, as Ryan and Brian mentioned, you have two copies of all of your blog content, but it exists in two different locations on your server, so it is being seen by two different campaigns with different URLs.
The 100 links per page is a recommended rule of thumb to protect usability and avoid degrading the value of individual links on the page. So, as both Ryan and Brian advised, it is much more important right now to deal with the duplication issues on your site.
Hope that helps,
Sha
-
Ouch, yeah, I see. I would definitely get those canonical (www / non-www / www.blog) issues resolved first, then worry about the links.
Brian
-
It seems a subdomain was created for your site and errors were made in the process.
If your main URL is www.aerohive.com and you decide to offer a blog as a subdomain, the recommended name would be blog.aerohive.com. Your main site is located on the "www" subdomain, and the blog would be offered on the "blog" subdomain. It is unnecessary and a bit confusing to keep the www prefix on another subdomain, as happened with http://www.blogs.aerohive.com/.
Your subdomain is currently set up as a mirror of content for your main site. Both URLs you shared are valid URLs with identical pages displayed and a header code of 200 is returned. This duplication should be resolved immediately.
Upon further checking, I just realized your site is set up with a blogs.aerohive.com subdomain, and it does show unique content. Therefore the necessary step is to remove the www.blogs.aerohive.com subdomain.
Another major SEO issue involving duplication is that your site is also available in both the www and non-www forms of the URL. http://aerohive.com and http://www.aerohive.com are both valid URLs which display content and return 200 header response codes. This issue should also be fixed immediately. Your site has backlinks to both versions of the URL.
Choose one version of your URL, the www or non-www version, then be consistent. 301 redirect the unused version of the URL to the main URL. This step will consolidate your backlinks and improve your ranking. Next, review your entire site to ensure all links consistently use the chosen URL format. Also check your social sites and signatures to ensure they are updated as well.
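The "review your entire site" step can be partly automated. Below is a rough sketch in Python that flags internal links whose host does not match the version you chose to keep; the canonical host and the example links are illustrative, not pulled from your actual site:

```python
from urllib.parse import urlparse

CANONICAL_HOST = "www.aerohive.com"  # the one version of the URL you chose to keep

def inconsistent_links(links):
    """Return absolute links whose host differs from the chosen canonical host.

    Relative links (no host) are left alone, since they inherit the
    host of the page they appear on.
    """
    flagged = []
    for url in links:
        host = urlparse(url).netloc
        if host and host != CANONICAL_HOST:
            flagged.append(url)
    return flagged

# Hypothetical links harvested from a page:
links = [
    "http://www.aerohive.com/products",   # consistent
    "http://aerohive.com/blog",           # non-www: should be updated
    "http://www.blogs.aerohive.com/post", # mirrored subdomain: should be removed
    "/contact",                           # relative: fine
]
print(inconsistent_links(links))
```

Running a check like this over your templates and content helps catch stray non-www links before search engines do.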
Once the above major issues are resolved, we can then loop back to the linking issue. Relatively speaking, it is a much less important issue.
-
Hi Brian,
I think you are right, but I think there is something wrong with the way this tool is analyzing my pages. I explained what I think is going on in Ryan's thread above. Like maybe this is an analysis of the website page (which I have analyzed in a separate campaign) vs. an actual blog page, which I am attempting to analyze as a dedicated campaign for my blog.
amanda
-
I think what's happening (and I don't understand this part especially, although I alluded to it originally) is this: I am analyzing my blog, and the "too many links" warning is under my blog campaign, but when it names a "page" it is actually a website page renamed with the blog title. I will post an example, but it will probably come up as a bad link for you.
1- I think that perhaps I have this campaign set up wrong, so it isn't analyzing the pages correctly, or not the right pages.
-
The issue here is likely that your blog template and plugins combined are creating a situation where there are lots of links on the page to begin with (navs, footer, archives, etc.). As Ryan said, all of those other links on the page that are not in your post (page) count too, so when you add a few more inline in your post (or page) you are pushed over the top.
Depending on what type of CMS you use, getting the number of links on the page down is probably going to involve some kind of change (template/plugins) to remove some site-wide links you don't really need. Once you do that, you should see that 140 number go away (or at least drop way down).
Side note, it was reported elsewhere that the "Found over 41 years ago" thing is a bug, you should open a help ticket and let the staff here know you were affected by it.
Brian
-
If you can offer a link to the page, we can be a lot more helpful.
Based on what you shared, all I can offer is that the software looks at the <a> tags on the page to locate links, and apparently 140 were located. That count includes all the links on the page: the site's navigation links, footer links, sidebar links, image links, social sharing links, etc.
You may also use the "Highlight Links or Text" icon located on the MozBar to locate the various links on a given page.
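To see why the number is so much higher than the handful of inline links you added, here is a rough sketch of how a crawler counts links, using Python's standard html.parser. The HTML snippet is made up, but it shows that navigation and footer links count just like the links in your post body:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that carry an href, roughly as a crawler would."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Every anchor with an href is a link, wherever it appears on the page.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = """
<nav><a href="/">Home</a> <a href="/blog">Blog</a></nav>
<p>See <a href="/post">this post</a> for details.</p>
<footer><a href="/contact">Contact</a></footer>
"""
counter = LinkCounter()
counter.feed(html)
print(counter.count)  # prints 4: nav, inline, and footer links all count
```

On a real blog template, the nav, sidebar, archive, and footer sections can easily contribute 100+ anchors before any inline links are added, which explains the 140 figure.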