What to do when the sidebar is causing a "Too Many On-Page Links" error
-
I have been going through all the errors and warnings from my weekly SEO Moz scans. One thing I'm seeing a bit of is "Too Many On-Page Links". I've only seen a few, but as in the case of this one: http://blog.mexpro.com/5-kid-friendly-cancun-mexico-resorts
there are only two links on the page (the image and the read more). So I think the sidebar links are causing the error. I feel my tags are important to help readers find information they may be looking for. Is there a better method to present tags than the WordPress tag cloud? Should I exclude the tags, at the risk of making things more difficult for my users?
Thanks for your help.
-
Don't stack 301s, i.e. don't redirect to another redirect. Update all your current redirects to point to the final pages. In general, you shouldn't worry about having many redirects.
I haven't watched the webinar but if you are currently doing well in rankings, I wouldn't change the permalink structure. I also think that WP automatically sets up redirects when you change the permalink structure (from my experience, would like someone to confirm though).
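The "don't stack" advice can be applied mechanically if you keep your redirects in a map of old URL → new URL. Here's a minimal sketch (the URLs are made up) that resolves every entry to its final destination, so no 301 ever points at another 301:

```python
def flatten_redirects(redirects):
    """Resolve each source URL to its final destination so no 301
    points at another 301. `redirects` maps old URL -> new URL."""
    flat = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        while dest in redirects:          # follow the chain to its end
            if redirects[dest] in seen:
                raise ValueError(f"redirect loop at {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {"/a": "/b", "/b": "/c", "/old-post": "/new-post"}
print(flatten_redirects(chain))  # /a and /b both point straight to /c
```

Deploying the flattened map means each legacy URL answers with a single 301 straight to the final page instead of a chain.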
-
My only issue with this is that I will have to do another 301 redirect. I have moved my blog once already and have performed many 301s. I also want to redirect all my category pages, per the WP webinar given a couple of weeks ago, to this permalink structure: %year%/%category%/%postname%, so then I will have another stack of redirects. I'm just concerned about all the redirects. Any advice?
-
Is there a better method to present tags than the wordpress tag cloud?
Yes, categories. If anything, the current tag cloud is annoying to use (different sizes, ugly text, repetitive tags). Organize them into several categories that describe the topics accurately and cause minimal overlap.
Also want to mention that your navigation dropdown links are causing the majority of the problem (each section has several links under it). Once again, think about consolidating these links if possible.
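For reference, the warning counts every anchor tag the crawler sees, sidebar and navigation dropdowns included, not just the visible article links. A rough sketch (the HTML snippet is made up) of counting links the way such a check might:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links on a page, e.g. to audit it against the
    rough link-count threshold a 'Too Many On-Page Links' check uses."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:                       # only real links, not named anchors
                self.links.append(href)

demo_html = """
<nav><a href="/blog">Blog</a><a href="/tags/travel">travel</a></nav>
<article><a href="/post/cancun">Read more</a></article>
"""
counter = LinkCounter()
counter.feed(demo_html)
print(len(counter.links))  # 3
```

Run something like this against the rendered page source and the sidebar/dropdown contribution to the total becomes obvious.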
Related Questions
-
"No Information Available" Error for Homepage in Google
Hi Everyone, Been racking my brain around this one. Not sure why it is happening. Basically Google is showing the "www" version of the homepage, when 99% of the site is "non-www". It also says "No Information Available". I have tried submitting it through GSC, but it is telling me it is blocked through the Robots.txt file. I don't see anything in there that would block it. Any ideas? shorturl.at/bkpyG I would like to get it to change to the regular "non-www" and actually be able to show information.
Intermediate & Advanced SEO | vetofunk
-
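Since GSC reports the page as blocked by robots.txt, one way to sanity-check that claim is to evaluate the file's rules offline. Python's standard urllib.robotparser can do this without fetching anything; the rules below are illustrative, not the asker's actual file:

```python
from urllib import robotparser

# Build a parser from robots.txt lines directly (no network request),
# then ask whether a given crawler may fetch a given URL.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "https://www.example.com/"))           # True
print(rp.can_fetch("Googlebot", "https://www.example.com/private/x"))  # False
```

Pasting in the site's real robots.txt and testing both the www and non-www homepage URLs would show quickly whether the rules themselves are the problem or whether GSC is reporting something stale.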
How would you link build to this page?
Hi Guys, I'm looking to build links to a commercial page similar to this: https://apolloblinds.com.au/venetian-blinds/ How would you even create quality links (not against Google TOS) to a commercial page like that? Any ideas would be very much appreciated. Cheers.
Intermediate & Advanced SEO | spyaccounts14
-
Should I remove pages to concentrate link juice?
So our site is database-powered and used to have up to 50K pages in Google's index 3 years ago. After a re-design that number was brought down to about 12K currently. Legacy URLs that now generate 404s have mostly been redirected to appropriate pages (some 13K 301 redirects currently). Trafficked content accounts for about 2K URLs in the end, so my question is, in the context of concentrating link juice on the most valuable pages, should I:
1. remove non-important / least-trafficked pages from the site and just have them return 404
2. no-index non-important / least-trafficked pages but still have them visible
3. 1 or 2 above, plus remove them from the index via Webmaster Tools
4. none of the above, but rather something else?
Thanks for any insights/advice!
Intermediate & Advanced SEO | StratosJets
-
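The options in that question amount to a triage over traffic data. Everything in this sketch (the thresholds, the URLs, the mapping of buckets to actions) is a hypothetical illustration of how such a triage could be scripted, not a recommendation from the thread:

```python
def bucket_pages(pages, visits_threshold=10):
    """Assign an action to each URL based on monthly visits.
    `pages` maps URL -> monthly visits; thresholds are arbitrary."""
    actions = {}
    for url, visits in pages.items():
        if visits == 0:
            actions[url] = "410-gone"   # no traffic: let it drop out of the index
        elif visits < visits_threshold:
            actions[url] = "noindex"    # keep for users, remove from the index
        else:
            actions[url] = "keep"
    return actions

demo = {"/old-widget": 0, "/niche-guide": 3, "/top-seller": 500}
print(bucket_pages(demo))
```

Exporting landing-page traffic from analytics and feeding it through something like this gives a first-pass keep/noindex/remove list that can then be reviewed by hand.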
Error reports showing pages that don't exist on website
I have a website that is showing lots of errors (pages that cannot be found) in Google Webmaster Tools. I went through the errors and redirected the pages I could. There are a bunch of remaining pages that are not really pages, which is why they show errors. What's strange is that some of the URLs point to feeds that were never created. I went into Google Webmaster Tools and looked at the Remove URL tool. I am using it, but I am confused about whether I should select the "remove page from search results and cache" option or the "remove directory" option. I am confused about the directory one. I don't want to accidentally delete core pages of the site from the search engines. Can anybody shed some light on this or recommend which I should select? Thank you, Wendy
Intermediate & Advanced SEO | SOM24
-
Webmaster Tools - Structured Data 100% drop. Many people with same issue, nobody seems to understand what might have caused it.
WMT shows a significant drop in structured data markup on June 7th, with a steep climb back by June 21st. Now the same thing has happened on August 9th, with no signs of recovery, and we've lost 45% of our search traffic. There are many people with the same problem, and nobody seems to know what caused it. Here are a few links to some forums: #1 Google Groups, #2 Google Groups, #3 Google Groups, #4 70% drop on GWT on June 7 Google SEO News and Discussion forum at WebmasterWorld. On our end we see a 100% drop in breadcrumbs and a 100% drop in hCards, leading to the 45% search traffic drop. Any ideas why this might have happened and how to fix it?
Intermediate & Advanced SEO | PhilippGreitsch
-
Fetch as GoogleBot "Unreachable Page"
Hi, We are suddenly getting an "Unreachable Page" error when any page of our site is accessed as Googlebot from Webmaster Tools. There are no DNS errors shown in "Crawl Errors". We have two web servers, web1 and web2, which are controlled by a software load balancer, HAProxy. The same network configuration has been working for over a year, and we never had any Googlebot errors before the 21st of this month. We tried to check whether there could be an error in the sitemap, .htaccess, or robots.txt by excluding the load balancer and pointing DNS to web1 and web2 directly; Googlebot was able to access the pages properly and there was no error. But when the load balancer was made active again by pointing the DNS to it, the "unreachable page" error started appearing again. The website is properly accessible from a browser, and there are no DNS errors either, as shown by "Crawl Errors". Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, even removed the firewall, but no success. Is there any way to get more details about the error instead of just "Unreachable Page"? Regards, shaz
Intermediate & Advanced SEO | shaz_lhr
-
Duplicate page content and Duplicate page title errors
Hi, I'm new to SEOmoz and to this forum. I've started a new campaign on my site and got back loads of errors. Most of them are duplicate page content and duplicate page title errors. I know I have some duplicate titles, but I don't have any duplicate content. I'm not a web developer and not an expert, but I have the impression that the crawler is following all my internal links (in fact I also have plenty of warnings saying "Too many on-page links"). Do you think this is the cause of my errors? Should I implement nofollow on all internal links? I'm working with Joomla. Thanks a lot for your help, Marco
Intermediate & Advanced SEO | marcodublin
-
Links to Facebook pages
I would like to ask if anyone has any knowledge regarding linking to a company's Facebook page. I have built a few links to a client's Facebook page in an effort to have it rank better in SERPs. I just learned that, unlike Twitter and LinkedIn, it is apparently not possible to link directly to Facebook pages. At least it is not possible from a search engine's perspective. If you follow any Facebook page link while you are not logged into Facebook, you are redirected to the Facebook home page. I can't think of any way around this obstacle. I'd love some clever solution, such as providing a URL which includes a basic dummy Facebook login, but there is nothing I am aware of that achieves this result. Does anyone have any ideas on this topic?
Intermediate & Advanced SEO | RyanKent