How authentic is a dynamic footer from a bot's perspective?
-
I have a very meta-level question. I've been working on a dynamic footer for the website http://www.askme.com/ (you can check it at the bottom of the page). If you refresh the page and look at the footer again, you'll see a different combination of links in every section. I'm calling it a dynamic footer here because the values are completely dynamic in this case.
**Why are we doing this?** Every section in the footer has X candidate links, but we can show only 25 links per section, and X can be greater than 25 (say X = 50). So I randomize the list of entries for a section and then pick 25 elements from it, i.e., a random 25 elements from the list every time you refresh the page.
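For concreteness, here is a minimal sketch of the rotation described above, in Python; the link pool and URL pattern are placeholders, not askme.com's actual data:

```python
import random

# Placeholder pool of candidate links for one footer section (X = 50 here).
section_links = [f"/mobiles/store-{i}" for i in range(50)]

def pick_footer_links(links, n=25):
    # A fresh random sample of n links on every page render, which is
    # the behaviour described in the question.
    return random.sample(links, min(n, len(links)))

print(pick_footer_links(section_links))  # a different 25 links on each run
```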
**Benefits from an SEO perspective?** This will help me expose all the URLs to bots (across multiple crawls) and will add a page-freshness element as well.
**What's the problem, if there is one?** I'm wondering how bots will treat this, since at any given time a bot might see us showing it different content than we show users. Will bots consider this cloaking (a black-hat technique)? Or will they not treat it as black hat, given that I'm refreshing the data every single time, even if it's a bot hitting me twice in a row to check what I'm doing?
-
Thank you so much, Sir Alan. I really appreciate the effort you put into compiling this detailed response to my questions. I've noted down all the points, along with how I can handle them better, and will soon come up with a better fat footer.
Nitin
-
You're dealing with multiple considerations and multiple issues in this setup.
First, there's the matter of link distribution. When you link to X pages from a given page, you're telling search engines "we think these are important destination pages". If you change those links every day, or on every refresh, and crawlers encounter those changes, it strains that communication.
This is something that happens naturally on news sites, where the news changes on a regular basis. So it's not invalid or alien for search algorithms to see and deal with, and thus it's not likely their systems would consider this black hat.
The scale and frequency of the changes are more of a concern, because of that constantly shifting link value distribution.
Either X cities are really "top" cities, or they are not.
Next, that link value distribution is further weakened by the sheer volume of links: 25 links per section across three sections is 75 links. Add to that the links at the top of the page, the "scrolling" links in the main content area of the home page, and the actual "footer" links (black background), and link equity is diluted even further (think "spreading too thin" with too many links).
On category pages it's "only" 50 links in two sub-footer sections. Yet the total number of links even on a category page is a concern.
And on category pages, all those links dilute the primary focus of the page. If a category page is "Cell Phone Accessories in Bangalore", then all of the links in the "Top Cities" section dilute the location focus, and all the links in the "Trending Searches" section dilute the non-geo topical focus.
What we end up with here then is an attempt to "link to all the things". This is never a best practice strategy.
Best practice strategies require a refined experience across the board. Consistency of signals, combined with not over-straining link equity distribution, and combined with refined, non-diluted topical focus are the best path to the most success long-term.
So, returning to my earlier point that news sites change the actual links shown when new news comes along: the best news sites do that without constantly changing the primary categories featured, and without diluting the overwhelming majority of links on a single category page with lots of links to other categories. Consistency is critical.
So, where any one or a handful of these issues might not be a critical problem on its own, the cumulative negative impact harms the site's ability to communicate a consistent, quality message.
The combined problem then has to be recognized as exponentially more serious because of the scale at which you are doing this across the entire site.
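As a side note on the frequency concern above: if the rotation is kept at all, one hypothetical way to reduce its frequency, and to keep users and bots seeing the same thing at any given moment, is to seed the sample with the current date so the footer changes once a day rather than on every refresh. A minimal sketch with placeholder data, illustrating the consistency idea rather than anything prescribed in the answer above:

```python
import datetime
import random

section_links = [f"/mobiles/store-{i}" for i in range(50)]  # placeholder pool

def pick_daily_footer_links(links, n=25):
    # Seeding the RNG with today's date means every visitor, human or
    # bot, sees the same n links for a full day; the selection rotates
    # once per day instead of on every request.
    rng = random.Random(datetime.date.today().isoformat())
    return rng.sample(links, min(n, len(links)))
```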
Related Questions
-
Why Is My Website's Rank Still in the Millions?
I am getting enough traffic on my affiliate website about best weed killers, but Moz still shows its rank in the millions. What would be the best strategy to improve the rankings?
White Hat / Black Hat SEO | sarahelen
-
Hiding ad code from bots
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing the data, the company would like to prevent any bots from seeing any ads, and of course that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen, I'm wondering how big a problem it could be if they do this. The change isn't being made to manipulate Googlebot's understanding of the page (the ads don't affect rankings, etc.) and it will have only a very minimal impact on the page overall. So, if they go down this road and hide ads from bots, I'm trying to determine how big a risk this could be. I found some old articles discussing this, with some suggesting it was a problem and others saying it might be okay in some cases (links below), but I couldn't find any recent articles about it. Has anybody seen anything new, or does anyone have a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads?
https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY
White Hat / Black Hat SEO | Matthew_Edgar
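For context, the setup being weighed here amounts to something like the following server-side sketch; Flask, the route, and the bot list are illustrative assumptions, not the client's actual stack:

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Illustrative crawler user-agent fragments; a real deployment would need
# a maintained bot list (this one is an assumption, not exhaustive).
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

PAGE = """<h1>Article</h1>
{% if show_ads %}<div class="ad"><!-- ad markup --></div>{% endif %}
<p>Article body...</p>"""

@app.route("/article")
def article():
    ua = (request.headers.get("User-Agent") or "").lower()
    is_bot = any(sig in ua for sig in BOT_SIGNATURES)
    # Bots, including Googlebot, receive the page without the ad markup;
    # this is the selective serving the question is asking about.
    return render_template_string(PAGE, show_ads=not is_bot)
```
-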
What do you think of this "SEO software" that uses Rand's "proven method"?
I saw an ad on Search Engine Roundtable and the call to action was... "What is the #1 metric that Google uses to rank websites?" I thought, "I gotta know that!". (I usually don't click ads but this one tempted me.) So I clicked in and saw a method "proven by Rand Fishkin" that will "boost the rankings of your website". This company has software that will use Rand's proven method (plus data from another unattributed test to boost the rankings of your website). I am not going to use this software. The video made my BS meter ring. But if you want to see it.... http://crowdsearch.me/special-backdoor/ Rather than use this "software", I would suggest using kickass title tags that deliver the searcher to kickass content. That has worked really well for me for years. Great title tags and great content will produce the same results. The bonus for you is that the great content will give you a real website.
White Hat / Black Hat SEO | EGOL
-
Advanced Outside Perspective Requested to Combat Negative SEO
**Situation:** We are a digital marketing agency that has been doing SEO for 6 years. For many years, we maintained exceptional rankings and online visibility. However, I suppose with great rankings comes great vulnerability. Last year, we became the target of a pretty aggressive and malicious negative SEO campaign from other SEOs in our industry; I'm assuming they're competitors. Overnight, there were 10,000+ links built on various spam domains using anchor text such as: negative marketing services, poor seo, butt crack, kickass... and more (see attached image). The issues we face are: (1) Time investment: an enormous investment of time and energy to contact each web admin for link removal. (2) Hard to keep up: when we think we're getting somewhere, new links come out of the woodwork. (3) Disavow doesn't work: though we've tried to generally avoid the disavow tool, we've had to use it for a few domains; however, it's difficult to say how much effect, if any, it's had on the negative links. As you can imagine, we've seen an enormous drop in organic traffic since this all started. It's unfortunate that SEO has come to this point, but I still see a lot of value in what we do and hope that spammers don't completely ruin it for us one day. Moz Community, I come to you seeking new insight, advice, similar experiences, or anything else that may help! Have any other agencies experienced the same issue? Are there any new ways to combat really aggressive negative SEO link building? Thanks everyone!
White Hat / Black Hat SEO | ByteLaunch
-
Are directory listings still appropriate in 2013? Aren't they old-style SEO and Penguin-worthy?
We have been reviewing our off-page SEO strategy for clients, and as part of that process we are looking at a number of superb infographics on the subject. I see that some current ones still list "directories" as part of their off-page strategy. Aren't these directories mainly there for link-building purposes, providing users no real benefit? I don't think I've ever seen a directory that I would use, apart from SEO research. Surely Google's Penguin algorithm would see directories the same way and give them less value, or even penalise websites that use them to try to boost PageRank? If I were to list my websites in directories, it wouldn't be to share my lovely content with people who use directories to find great sites; it would be to sneakily build PageRank. Am I missing the point? Thanks,
Scott
White Hat / Black Hat SEO | Crumpled_Dog
-
It Shows as "google results" but It's an Incoming Link; Is It Spamming Me?
Hello everyone, I have two issues to share. 1) We have a site (personal-loans.org). In the past few weeks we noticed sites that link to us and send us traffic, but when you visit these sites, all they do is provide "google search" results. Because we were on the first page of those results, we got hits there as well, which leads me to think this is why we are on page 7 now; as of yesterday the ranking was at page 4. These are some of the sites, so you can see for yourself: internetpayadvances.com, fastlivecashadvance.com, assistancemoney.com, scoutcashnow.com, officialpayday.net. Has anyone else seen anything like that? I have many more links like these; those are only 5 of the 9 that had hits yesterday alone, and site traffic went from 250-300 a day to 63. 2) For the same site: it was on the first page of google search results, ranked 4-7, even after the big Penguin changes. What we did notice is that A LOT of unrelated sites, like surfing sites (yes, ocean surfing), and sites with no content AT ALL (all the text was inside an image) ranked 3rd on the payday loans search results (and the rest look just the same, with different content). Google says it wants quality but doesn't do its homework for searches in the second-largest keyword market, such as loans and payday loans; the same goes for cash advance. Please help, I need your advice. Thanks
White Hat / Black Hat SEO | Yonnir
-
"Unnatural Linking" Warning/Penalty - Anyone's company help with overcoming this?
I have a few sites where I didn't manage the quality of my vendors, and now I'm staring at some GWT warnings for unnatural linking. I'm assuming a penalty is coming down the pipe, and unfortunately these aren't my sites, so I'm looking to get on the ball with unwinding anything we can as soon as possible. Does anyone's company have experience with this, or can you pass along a reference to another company that has successfully dealt with these issues? A few items that come to mind include solid and speedy processes for removing offending links, and properly handling the resubmission request.
White Hat / Black Hat SEO | b2bmarketer
-
Why doesn't Google catch different domains with the same content?
I have been slowly working to remove near-duplicate content from my own website for different locales. Meanwhile, Google seems to be doing nothing to combat the duplicate content of one of my competitors, which shows up all over southern California. For example, these two listings appear in the search results:
Your Local #1 Rancho Bernardo Pest Control Experts | 858-352 ... (www.pestcontrolranchobernardo.com/): "Pest Control Rancho Bernardo Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 858-352-7728."
Your Local #1 Oceanside Pest Control Experts | 760-486-2807 ... (www.pestcontrol-oceanside.info/): "Pest Control Oceanside Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 760-486-2807."
The competitor is getting high page-1 listings for massively duplicated content across web domains. Will Google ever catch this black-hat workmanship? Meanwhile, he's sucking up my business. Do the competitor's results also speak to the possibility that Google does in fact rank based on the name of the URL, something that gets debated all the time? Thanks for your insights. Gerry
White Hat / Black Hat SEO | GerryWeitz