Too many on-page links for WP blog page
-
Hello,
I have set my WP blog to a page, so new posts go to that page, making it the blog. On an SEOmoz campaign crawl, it says there are too many links on one page. Does this mean that, as I post my blog posts to this page, the search engines are seeing it as a single page full of links instead of as blog posts?
I worry that if I continue to add more posts (which obviously I want to), the link count will increase more and more, meaning the links will be discounted due to there being too many.
What can I do to rectify this?
Many thanks in advance
-
Ah, ok, I get it... sorry, I am a little slow!
Thanks for your help, and I will apply that method to remove the category links.
-
No, there are indeed over 100 <a> tags on the page. I guess I wasn't very clear... sorry for the confusion.
You have 10 blog post summaries listed on the page. Each post summary has a few links, plus there are many other nav links on the page, bringing you up over the 100 mark. If you keep adding posts, that count should stay roughly the same: when you add more posts, the older ones are automatically pushed through to "Older Posts", keeping the 10 latest summaries on that page.
As I mentioned earlier, one method to reduce the <a> tag count on the page would be to remove the category links. They don't seem necessary since you are only using the one blog category.
-
On the SEOmoz crawl, it says that the blog category has 106 links on it, so I assumed that was because of the page the posts are harvested on. Is that wrong? If it's only 10 or so, I can handle that!
-
How many posts do you want to have on a page? If your count is around 10, you should be in good shape... and you can control how many of these posts appear on your first (summary) page. It looks like you're at 10 now; any additional posts are viewed through your "view older posts" link at the bottom.
Agreed, though: if you're showing 15+ post summaries on your list, then you'll certainly be over in link count.
-
Ok, thanks, that sounds like a good idea. My worry, though, is that as I develop the blog and add more posts, this time next year (for example) I will surely have at least double the number of links? I don't see any way around it, really, just because I am using the page to harvest the posts.
-
You have 10 stubs on the page... that's a decent number and will keep your page's link count reasonable.
What about the idea of removing the "Filed under BLOG" links? If you're just running the single blog category, you don't really need those links. That will save you 10 links right off the bat, which can be reserved for links in the short descriptions.
The 100-link limit is a good rule of thumb, but you shouldn't be penalized if you go a little over.
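If you want to check that count yourself rather than wait for the next crawl, a minimal sketch like the one below tallies the <a> tags on a page using only the Python standard library. The sample HTML is a made-up placeholder; you would feed it the fetched source of your own blog page instead.

```python
# Minimal on-page link audit: count <a> tags, which is the metric
# behind "too many links on one page" warnings. Placeholder HTML
# below stands in for your blog page's fetched source.
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Counts every opening <a> tag encountered while parsing."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.count += 1


def count_links(html: str) -> int:
    """Return the number of <a> tags in an HTML string."""
    parser = LinkCounter()
    parser.feed(html)
    return parser.count


sample = '<nav><a href="/">Home</a></nav><p><a href="/post-1">Post 1</a></p>'
print(count_links(sample))  # 2
```

Run against your real page, this gives you the same raw figure the crawler reports, so you can see how much removing the 10 category links actually moves the total.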
-
Good idea... sorry! It's here.
Thanks
-
Do you have a link to check out?