Too many on-page links
-
I'm having trouble interpreting this data. It says several of my blog pages have too many on-page links, some as high as 140, and there is no example of a blog post that they are referring to. What am I missing? I never post more than a handful (5-7) of links in our 600-1,000 word blogs. When I drill down, it doesn't give me very much information except "Found over 41 years ago" off to the right. When I click on the "too many on-page links" URL, it provides a long list of website pages that are renamed with the blog name. Huh?
A lot of this stuff isn't very intuitive, SEOmoz.
-
Tried to set up my campaign to look at the root domain, but it said that my website is set up as a subdomain and automatically converted this step of the process to "subdomain".
I believe this would be due to either the root domain being redirected to the subdomain, or there not being any content on the root domain address. If another mozzer does not offer a definitive answer, try contacting the SEOmoz help desk for more information.
When you say "choose" one version of my site, do you mean as far as what SEOMoz crawls? Or are you suggesting I make a change to the site itself?
The site itself. You have one website, and its content should only be available via one address. Think of it this way: you can create a website at the address "mysite.com". Next, you create another site at the URL "www.mysite.com" which is an exact duplicate of the "mysite.com" site. This is exactly what you have done. You can even repeat the process further and create subdomains such as "www1.mysite.com". Each subdomain is a duplicate of the main domain and causes confusion for users and search engines alike. Resolve this confusion: choose ONE way to present your site and remain consistent.
The blog is, I believe, set up as a subdomain (with www.blogs.aerohive.com) and it is hosted by a third-party.
The URL "blogs.aerohive.com" is indeed hosted elsewhere, but the URL "www.blogs.aerohive.com" is hosted at the same location as your main site. It is a mirror of your main site. Remove this subdomain.
I am trying to understand what was set up correctly or incorrectly within SEOmoz, what I can fix in the tool, and what I can fix on my website
Presently there are two major issues which need to be resolved. Both issues are with your website itself, not the SEOmoz tools. If you have managed hosting, the easiest step is to call or open a ticket with your hosting provider and make two requests:
1. Add a 301 redirect from all non-www traffic to its www equivalent
2. Delete the www.blogs.aerohive.com subdomain
You should be able to copy the above two requests and paste them into a ticket. Your hosting provider should completely understand what actions are necessary.
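For reference, if the site runs on Apache and you can edit the .htaccess file yourself, the non-www to www redirect typically looks something like the sketch below. This is only an illustration (the hostname is from this thread, and your host may prefer to configure the redirect at the server or DNS level instead):

```apache
# Sketch: 301 redirect all non-www requests to the www equivalent.
# Assumes Apache with mod_rewrite enabled; adjust the hostname to your own.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^aerohive\.com$ [NC]
RewriteRule ^(.*)$ http://www.aerohive.com/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent, which is what tells search engines to consolidate the two versions of each URL.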
-
Thanks Sha,
There seems to be a consensus on what the issue is, although I asked some extra questions in my reply above to Ryan. Kind of confusing. The blog should be set up as a subdomain, right?
Amanda
-
Thanks Brian - there seems to be a consensus on what the issue is, although I asked some extra questions in my reply above to Ryan. Kind of confusing.
Amanda
-
This was very helpful (as were the other two replies), so I am hoping you can help clarify how I can get this addressed (going with list format to simplify):
1- I tried to set up my campaign to look at the root domain, but it said that my website is set up as a subdomain and automatically converted this step of the process to "subdomain".
2- SEOmoz Pro did tell me at some point in the setup process what you mentioned above: that there are two versions of my site. This may have been at the setup point where it converted root domain to subdomain, though I can't recall.
3- When you say "choose" one version of my site, do you mean as far as what SEOMoz crawls? Or are you suggesting I make a change to the site itself?
4- The blog is, I believe, set up as a subdomain (www.blogs.aerohive.com) and it is hosted by a third party. I set up the dedicated blog campaign as a subdomain, but had some difficulty with it as well. Not sure if that was correctly set up or not.
Basically I am trying to understand what was set up correctly or incorrectly within SEOmoz, what I can fix in the tool, and what I can fix on my website so that I can better analyze it here (and obviously get search engine approval as the ultimate goal).
Thanks,
Amanda
-
Hi Amanda,
I don't think the problem is with the tool. The tool is simply reporting what the crawler sees.
Since you have said "an actual blog page, which I am attempting to analyze as a dedicated campaign for my blog" I am thinking that you have set up a separate campaign for what you think of as the "actual blog" and designated the campaign as a subdomain?
If this is the case, then presumably you also have a campaign set up for what you think of as the main site. So, as Ryan and Brian mentioned, you have two copies of all the content in your blog; it exists in two different locations on your server, so it is being seen by two different campaigns with different URLs.
The 100 links per page is a recommended rule of thumb to protect usability and avoid degrading the value of individual links on the page. So, as both Ryan and Brian advised, it is much more important right now to deal with the duplication issues on your site.
Hope that helps,
Sha
-
Ouch, yeah, I see. I would definitely get those canonical (www / non-www / www.blog) issues resolved first, then worry about the links.
Brian
-
It seems a subdomain was created for your site and errors were made in the process.
If your main URL is www.aerohive.com and you decide to offer a blog as a subdomain, the recommended name would be blog.aerohive.com. Your main site is located on the "www" subdomain and the blog would be offered on the "blog" subdomain. It is unnecessary and a bit confusing to keep the www prefix with another subdomain, as happened with http://www.blogs.aerohive.com/.
Your subdomain is currently set up as a mirror of your main site's content. Both URLs you shared are valid, display identical pages, and return a 200 header code. This duplication should be resolved immediately.
Upon further checking, I just realized your site is set up with a blogs.aerohive.com subdomain, and it does show unique content. Therefore the necessary step is to remove the www.blogs.aerohive.com subdomain.
Another major SEO issue involving duplication is that your site is also available in both the www and non-www form of the URL. http://aerohive.com and http://www.aerohive.com are both valid URLs which display content and return 200 header response codes. This issue should also be fixed immediately. Your site has backlinks to both versions of the URL.
Choose one version of your URL, the www or non-www version, then be consistent. 301 redirect the unused version of the URL to the main URL. This step will consolidate your backlinks and improve your ranking. Next, review your entire site to ensure all links consistently use the chosen URL format. Also check your social sites and signatures to ensure they are updated as well.
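As an illustration of "choose one version and be consistent", here is a small Python sketch of the kind of normalization pass you might run over your internal links or templates. The helper and hostnames are hypothetical examples for this thread, not part of any Moz tool:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical canonical host: substitute whichever version you choose.
CANONICAL_HOST = "www.aerohive.com"
# Hostnames that should all collapse to the canonical one.
ALIASES = {"aerohive.com", "www.aerohive.com"}

def canonicalize(url: str) -> str:
    """Rewrite a URL so its host is the single chosen canonical host."""
    parts = urlsplit(url)
    if parts.hostname in ALIASES:
        netloc = CANONICAL_HOST
        if parts.port:  # preserve an explicit port if one was given
            netloc += f":{parts.port}"
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)

print(canonicalize("http://aerohive.com/products"))
# http://www.aerohive.com/products
```

Note that a true subdomain with its own content, such as blogs.aerohive.com, is deliberately left out of the alias set; only the mirrored variants of the main site get collapsed.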
Once the above major issues are resolved, we can then loop back to the linking issue. Relatively speaking, it is a much less important issue.
-
Hi Brian,
I think you are right, but I think there is something wrong with the way this tool is analyzing my pages. I explained what I think is going on in Ryan's thread above. Like maybe this is an analysis of the website page (which I have analyzed in a separate campaign) vs. an actual blog page, which I am attempting to analyze as a dedicated campaign for my blog.
Amanda
-
I think what's happening (I don't fully understand this part, although I alluded to it originally) is this: I am analyzing my blog, and the "too many links" warning is under my blog campaign, but when it names a "page" it is actually a website page renamed with the blog title. I will post an example, but it will come up as a bad link for you, I think.
1- I think that perhaps I have this campaign set up wrong, so it isn't analyzing the pages correctly, or not the right pages.
-
The issue here is likely that your blog template and plugins combined are creating a situation where there are lots of links on the page to begin with (navs, footers, archives, etc.). As Ryan said, all of those other links on the page that are not in your post (page) count too, so when you add a few more inline in your post (or page) you are pushed over the top.
Depending on what type of CMS you use, getting the number of links on the page down is probably going to involve some kind of change (template/plugins) to remove some site-wide links you don't really need. Once you do that, you should see that 140 number go away (or at least drop way down).
Side note: it was reported elsewhere that the "Found over 41 years ago" thing is a bug; you should open a help ticket and let the staff here know you were affected by it.
Brian
-
If you can offer a link to the page, we can be a lot more helpful.
Based on what you shared, all I can offer is that the software looks at the <a> tags on the page to locate links, and apparently 140 were located. That count includes all the links on the page: the site's navigation links, footer links, sidebar links, image links, social sharing links, etc.
You may also use the "Highlight Links or Text" icon located on the MozBar to locate various links on a given page.
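To see how a crawler arrives at a number like 140, here is a minimal Python sketch using the standard library's HTML parser to count every anchor tag that carries an href, navigation, footer, and sidebar links included. This is an approximation of what any crawler does, not Moz's actual code:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count every <a> tag that carries an href, the way a crawler would."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href attribute are actual links.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# A post with only two inline links still picks up every template link.
sample = """
<nav><a href="/">Home</a><a href="/blog">Blog</a><a href="/about">About</a></nav>
<article><p>See <a href="/post-1">this post</a> and <a href="/post-2">that one</a>.</p></article>
<footer><a href="/contact">Contact</a></footer>
"""
print(count_links(sample))  # 6
```

Run against a real blog post, a template with heavy navigation, archive widgets, and tag clouds can easily push this number past 100 before the post's own inline links are counted at all.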