Optimized site-wide internal links in footer - a problem?
-
Hello all - I am looking at a website with 8 heavily keyword-optimized site-wide links in the footer. Yes, there are only 8, but it looks a bit spammy and I'm tempted to remove them. I imagine there's some possibility of a Google penalty too? What would your advice be? Thanks, Luke
-
Thanks Michael - you're dead right in your approach there - I'm amazed how many have got it so wrong by writing for Googlebot rather than the actual site users. Found this interesting re: internal links - plenty of discussion on the issue but definitely a lack of clarity: http://www.seroundtable.com/google-internal-links-anchor-text-16864.html
-
If they are internal links, you can do what you like with them. However, the key is user experience. Have you taken a look at your analytics to see how popular the links are and how visitors behave with them? At a guess, if they look keyword-engineered they can be off-putting.
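One way to answer the "how popular are they" question is to fire an analytics event whenever a footer link is clicked. A minimal sketch, assuming the site runs GA4 via gtag.js; the event name and parameters are illustrative, not anything confirmed in this thread:

```typescript
// gtag is provided globally by the GA snippet; declared here so the
// sketch is self-contained.
declare function gtag(...args: unknown[]): void;

// Attach a click handler to every footer link and record which one was
// used, so footer engagement can be compared with the main navigation.
document.querySelectorAll<HTMLAnchorElement>('footer a').forEach((link) => {
  link.addEventListener('click', () => {
    gtag('event', 'footer_link_click', {
      link_text: link.textContent?.trim() ?? '',
      link_url: link.href,
    });
  });
});
```

If those events barely register compared with header-navigation clicks, that's a decent user-experience argument for trimming the footer links, independent of any penalty risk.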
Related Questions
-
GSC Performance completely dropped off, but Google Analytics is steady. Why can't GSC track my site anymore?
Hey everyone! I'm having a weird issue that I've never experienced before. For one of my clients, GSC shows a complete drop-off in the Performance section. All of the data shows that everything fell flat, or almost completely flat. But in Google Analytics, we have steady results - no huge drop-off in traffic, etc. Do any of you know why GSC would all of a sudden be unable to crawl our site, or track this data? Let me know what you think! Thanks!
Algorithm Updates | TaylorAtVelox
-
How can I tell if my site was impacted by the March 2019 Core Update?
I am trying to determine if my website was impacted by the March 2019 Core Update. Based on the various articles I have been reading, I do not believe my niche (software) was impacted. I see a very small tick up in Search Console and Google Analytics, but it is well within the normal range. Where else should I be looking to see if we were impacted? Thank you!
Algorithm Updates | NikCall
-
On one site, a 3rd party is asking visitors to give feedback via a pop-up that covers 30-50% of the bottom of the screen, depending on screen size. Under the intrusive interstitial guidelines, is the 3rd party or the site in danger of getting penalized?
I am wondering whether the intrusive interstitial penalty affects all kinds of pop-ups regardless of their nature, e.g. if a third party is asking for feedback through a discreet pop-up that appears from the bottom of the screen and covers at most 50% of it. Is the site, or the third party asking for the feedback, subject to the intrusive interstitial penalty? Also, does the fact that on some screens the pop-up covers 30% and on others 50% play any role?
Algorithm Updates | deels-SEO
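As a side note on the 30-50% figure: the share of the viewport the pop-up actually covers can be measured in the browser, which makes it easier to compare devices against whatever threshold you consider risky. A minimal sketch; the #feedback-popup selector is a hypothetical stand-in for the third party's widget:

```typescript
// Hypothetical selector; the real widget's markup would differ.
const POPUP_SELECTOR = '#feedback-popup';

// Returns the fraction of the viewport (0..1) the element covers,
// counting only the portion that is actually on screen.
function viewportCoverage(selector: string): number {
  const el = document.querySelector<HTMLElement>(selector);
  if (!el) return 0;
  const rect = el.getBoundingClientRect();
  const visibleHeight =
    Math.max(0, Math.min(rect.bottom, window.innerHeight) - Math.max(rect.top, 0));
  const visibleWidth =
    Math.max(0, Math.min(rect.right, window.innerWidth) - Math.max(rect.left, 0));
  return (visibleHeight * visibleWidth) / (window.innerHeight * window.innerWidth);
}

console.log(`Pop-up covers ${(viewportCoverage(POPUP_SELECTOR) * 100).toFixed(0)}% of the viewport`);
```
-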
Drastic Drop in Link Juice
Hi - back in December we shifted my web domain from gourmetdirect.com to gourmetdirect.co.nz as part of a site-wide revamp. Everything was going along fine until recently, when my linking domains plummeted and external links fell from approximately 6,000 to 600. We still have the .com live for loads of dysfunctional reasons. Can anyone help? I have gone from a top ranker to a no-show and my contractors are all shaking their heads.
Algorithm Updates | GourmetDirect
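The usual remedy in this situation is to 301-redirect the old .com, path for path, to the .co.nz instead of leaving both domains live, so inbound link equity consolidates on one site. A hedged sketch of what that might look like; the Node/Express front end is an assumption, and only the hostnames come from the question:

```typescript
import express from 'express';

const app = express();

// Permanently redirect every request on the old domain to the same
// path on the new one.
app.use((req, res, next) => {
  if (req.hostname.endsWith('gourmetdirect.com')) {
    return res.redirect(301, `https://www.gourmetdirect.co.nz${req.originalUrl}`);
  }
  next();
});

app.listen(80);
```

The same rule can of course be expressed at the web-server or CDN layer instead; the point is a permanent, path-preserving redirect rather than two live copies of the site.
-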
Clean up of links: what to get rid of?
We have been cleaning up our back office and preparing our .com domain to take all our future traffic, and have got into a debate about how far to go in cleaning up old links. We have never had a penalty on the site as far as we know, but the site was once taken offline by Google, which thought it was a malware site, back in March this year. They put it straight back up and running in 5 hours, but it was very strange, as it is an Amazon Webstore retail site. We are not sure why Google thought this, so just in case, we have been combing through the historical links and have now started to disavow any links we cannot get removed manually - so far just a couple of sites that have no relevance to our retail business.

However, the debate we have been having is around directory listings: should we get rid of these too? Gut reaction is yes, based on the need for quality, relevant links for the end user, but some are passing proper links to relevant sections of our site, albeit in a directory format - DMOZ comes to mind. Any thoughts? Bruce.
Algorithm Updates | BruceA
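For reference, the disavow file Google accepts is a plain UTF-8 .txt with one entry per line: full URLs, or whole domains via the domain: prefix, with # for comments. A short sketch; the hosts below are placeholders, not sites from this thread:

```text
# Low-quality directory we could not get removed manually
domain:spammy-directory.example

# A single irrelevant page, rather than the whole host
http://unrelated-links.example/our-listing.html
```
-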
Redirected old domain to new: how long before the external links show under the new domain?
Before contracting SEO services, my client decided to change his established root domain to one that was more customer-friendly. Since he had no expertise on board, no redirects were set up until 6 months later. I ran stats right before the old domain was redirected and have a report showing roughly 750 external links from 300 root domains. We redirected the old domain to the new domain in mid-January 2012. Those external links are still not showing in Open Site Explorer for the new domain. I've tested it a dozen times, and the old domain definitely points to the new domain. How long should it take for the new domain to pick up those external links? Should I do anything else to help the process along?
Algorithm Updates | smsinc
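While waiting, one thing worth double-checking is that the redirect is a permanent 301 rather than a temporary 302, since a 302 signals the move may be reversed and can delay consolidation. A minimal verification sketch, assuming Node 18+ with its built-in fetch; the URL is a placeholder:

```typescript
// Fetch without following redirects so the raw status code and
// Location header of the old URL can be inspected directly.
async function checkRedirect(url: string): Promise<void> {
  const res = await fetch(url, { redirect: 'manual' });
  const location = res.headers.get('location') ?? '(no Location header)';
  console.log(`${url} -> HTTP ${res.status}, Location: ${location}`);
}

// Expect a 301 pointing at the equivalent page on the new domain.
checkRedirect('http://old-domain.example/some-page/').catch(console.error);
```
-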
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything - just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...

"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them.

The site is very large and utilizes faceted navigation to help visitors sift through results. I have had rel=canonical implemented for many months now, so that each page URL created by the faceted nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawl time on all these pages when I am already doing what they ask and telling them to ignore the filtered URLs and find the content on page X.

So at this point I am thinking about handling these with the robots.txt file, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory
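For context, the rel=canonical approach described boils down to stripping the facet parameters so every filtered variant declares the clean category URL as canonical. Note that blocking the same URLs in robots.txt stops Googlebot from crawling them at all, which also stops it from ever seeing the canonical tags on them; the two mechanisms don't combine. A sketch with hypothetical parameter names:

```typescript
// Hypothetical facet parameters; the real site's filters would differ.
const FACET_PARAMS = ['color', 'size', 'brand', 'sort', 'page'];

// Strip facet parameters so each filtered variant can point its
// rel=canonical at the main category page.
function canonicalFor(pageUrl: string): string {
  const url = new URL(pageUrl);
  for (const param of FACET_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

// Emitted into each variant's <head> as:
//   <link rel="canonical" href="https://shop.example/shoes/">
console.log(canonicalFor('https://shop.example/shoes/?color=red&size=10'));
```
-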
Does Google do domain-level topic modeling? If so, are off-site factors such as search traffic volume taken into account?
80% of my site's organic traffic is coming through a resource that is only somewhat related to the rest of the site. Does Google think the main topic of my site is the terms this resource targets, thus bumping the terms I care about down to a sub-topic level of sorts? If this is the case, would moving the resource onto a subdomain help to solve the problem?
Algorithm Updates | tatermarketing