Best way to view the global navigation bar from Googlebot's perspective
-
Hi,
Links in the global navigation bar of our website do not show up when we look at the text-only version of the page in Google's cache.
These links use style="display:none;" in the HTML source.
But if I use the "User Agent Switcher" add-on in Firefox and set it to Googlebot, the links in the global nav are displayed.
I am wondering what the best way is to find out whether Google can or cannot see the links.
Thanks for the help!
Supriya.
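A quick way to test this yourself is to request the page twice, once with a normal browser user agent and once with Googlebot's, and compare what comes back. Here is a minimal sketch, assuming Python with the `requests` library; the URL and link text are placeholders:

```python
import requests

URL = "https://www.example.com/"   # placeholder: the page with the hidden nav
NAV_LINK_TEXT = "Products"         # placeholder: text of one global nav link

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    html = requests.get(URL, headers={"User-Agent": ua}, timeout=10).text
    # Does the link markup exist, and is it wrapped in display:none?
    print(f"{name:>9}: link present={NAV_LINK_TEXT in html}, "
          f"display:none present={'display:none' in html.replace(' ', '')}")
```

If the two responses differ, the server is serving Googlebot different markup than browsers get, which would explain why the user-agent switcher test and the cached text-only view disagree.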
-
The global navigation is created by an out-of-the-box widget which uses style="display:none".
If I use "Fetch as Googlebot" on this page, the links show up in the fetched code.
So I believe Googlebot is able to see the links, but "display:none" doesn't make sense from an SEO point of view.
-
What's the reason for the style you are using? Are you trying to hide them for a "benefit"?
If another agent can see it, Google can see it.
Related Questions
-
Best way to handle deletion of a forum subdomain?
Hello All. Our site www.xxxx.com has long had a forum subdomain, forum.xxxx.com. We have decided to sunset the forum. We find that the 'Ask a Question' function on product pages and our social media presence are more effective ways of answering customers' product & project technical Qs. Simply shutting down the forum server is going to return thousands of 404s for forum.xxxx.com, which I can't imagine would be helpful for the SEO of www.xxxx.com, even though my understanding is that subdomains are handled somewhat differently than the main site. We rely tremendously on natural search traffic for www.xxxx.com, so I am loath to make any moves that would hurt us. I was thinking we should just keep the forum server up but return 410s for everything on it, including the roughly 3,000 indexed pages, until they are removed from the index, then shut it down. The IT team also gave the option of simply pointing the URL to our main URL, which sorta scares me because it would then return a 200 and serve the same experience at forum.xxxx.com as at www.xxxx.com, which sounds like a very bad idea. (Yes, we do have canonicals on www.xxxx.com.) In your opinion, what is the best way to handle this matter? Thank You
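If the 410 route is taken, it's worth verifying that the forum server really does return 410s before waiting on the index to clear. A minimal spot-check sketch, assuming Python with `requests`; the sample URLs are placeholders:

```python
import requests

# Placeholder sample of indexed forum URLs to spot-check.
FORUM_URLS = [
    "https://forum.xxxx.com/",
    "https://forum.xxxx.com/threads/example-old-thread",
]

for url in FORUM_URLS:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    print(f"{status}  {'OK' if status == 410 else 'UNEXPECTED'}  {url}")
```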
Intermediate & Advanced SEO | jamestown
-
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, Hope you're doing good. Well, we have a website, up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users and some changes in site architecture, which includes a change in URL structure for existing URLs and the introduction of some new URLs as well. Now, my question is, what's the best way to do an A/B test with the new version? We can't launch it for a portion of users (say, make it live for 50% of the users while the remaining 50% see the old/existing site only) because the URL structure is changed now and bots will get confused if they start landing on different versions. Will this work if I reduce the crawl rate to ZERO during the A/B period? How will this impact us from an SEO perspective? How will those old-to-new 301 URL redirects affect our users? Have you ever faced/handled this kind of scenario? If yes, please share how you handled it along with the impact. If this is something new to you, would love to know your recommendations before taking the final call on this. Note: We're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs which are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible and can't be redirected to a valid URL on the old version.
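Whatever the A/B approach, the 301 map itself is easy to verify programmatically before launch. A minimal sketch, assuming Python with `requests` and a hypothetical old-to-new URL mapping:

```python
import requests

# Hypothetical old-URL -> expected-new-URL map; replace with the real redirect map.
REDIRECT_MAP = {
    "https://www.example.com/old/page-1": "https://www.example.com/new/page-1",
    "https://www.example.com/old/page-2": "https://www.example.com/new/page-2",
}

for old, expected in REDIRECT_MAP.items():
    resp = requests.get(old, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK' if ok else 'CHECK'}  {old} -> {resp.status_code} {location}")
```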
Intermediate & Advanced SEO | _nitman
-
Hreflang targeted website using the root directory's description & title
Hi there, Recently I applied the hreflang tags like so: Unfortunately, the Australian site uses the same description and title as the US site (which was the root directory initially). Am I doing something wrong? Would appreciate any response, thanks!
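Hreflang itself doesn't rewrite titles or descriptions; each locale page has to serve its own. One way to confirm the duplication is to fetch both versions and compare their <title> and meta description. A minimal sketch, assuming Python with `requests`; the URLs are placeholders and the parsing uses deliberately simple regexes:

```python
import re
import requests

# Placeholder URLs for the two locale versions.
PAGES = {"US": "https://www.example.com/", "AU": "https://www.example.com/au/"}

def title_and_description(html):
    """Extract <title> and meta description with simple (non-robust) regexes."""
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.S | re.I)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I)
    return (title.group(1).strip() if title else "",
            desc.group(1) if desc else "")

meta = {loc: title_and_description(requests.get(url, timeout=10).text)
        for loc, url in PAGES.items()}
print(meta)
if meta["US"] == meta["AU"]:
    print("Identical title/description across locales -- that is the problem to fix.")
```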
Intermediate & Advanced SEO | oliverkuchies
-
Are clean mobile URLs necessary?
Adding code to redirect/clean up ugly URLs slows down mobile site performance, so is it necessary if we are already using rel=alternate tags on our desktop/www pages?
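For separate mobile URLs, what Google relies on is the annotation pair: the desktop page declares rel=alternate pointing at the mobile URL, and the mobile page declares rel=canonical pointing back. A minimal check sketch, assuming Python with `requests` and placeholder URLs:

```python
import requests

# Placeholder desktop/mobile URL pair.
DESKTOP = "https://www.example.com/page"
MOBILE = "https://m.example.com/page"

desktop_html = requests.get(DESKTOP, timeout=10).text
mobile_html = requests.get(MOBILE, timeout=10).text

# Desktop should reference the mobile alternate; mobile should
# canonicalize back to the desktop URL.
print("alternate on desktop:", 'rel="alternate"' in desktop_html and MOBILE in desktop_html)
print("canonical on mobile: ", 'rel="canonical"' in mobile_html and DESKTOP in mobile_html)
```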
Intermediate & Advanced SEO | recbrands
-
Why is my site not being indexed?
Hello.... I have to tell you that I feel like a newbie (I am, but now I really feel like it)... We were working with a client until January this year; they kept going on their own until September, when they contacted us again. Someone on the team that handled things while we were gone updated its robots.txt file to Disallow everything, for maybe 3 weeks before we were back in. Additionally, they were working on a different subdomain, the new version of the site, and of course they didn't block the robots on that one. So now the whole site has been duplicated, even its content; the exact same pages existed on the subdomain that was public during the same time the other one was blocked. We came in, changed the robots.txt file on both servers, resubmitted all the sitemaps, shared our URL on Google+... everything the book says... but the site is not getting indexed. It's been 5 weeks now and no response whatsoever. We were highly positioned on several important keywords and now it's gone. I know you guys can help; any advice will be highly appreciated. Thanks, Dan
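A first diagnostic in a case like this is to confirm, programmatically, that neither robots.txt still blocks Googlebot. A minimal sketch using Python's standard-library robots.txt parser, with placeholder hosts:

```python
from urllib.robotparser import RobotFileParser

# Placeholder hosts: the main site and the new-version subdomain.
HOSTS = ["https://www.example.com", "https://new.example.com"]

for host in HOSTS:
    rp = RobotFileParser()
    rp.set_url(f"{host}/robots.txt")
    rp.read()
    print(f"{host}: Googlebot may fetch homepage:",
          rp.can_fetch("Googlebot", f"{host}/"))
```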
Intermediate & Advanced SEO | daniel.alvarez
-
Best way to set up anchor text on parked pages?
Our company is no longer offering a series of products, much to the disappointment of our SEO team since we've spent a long time building up the pages and getting them ranked organically. The pages all have decent page rank and in some cases rank #1 for the primary keyword. We have a sister company that we acquired a year ago and they still offer these products on their website. They are a completely separate company with their own website which existed long before we acquired them, and we have nothing to do with their website. Our team has proposed that rather than take down the URLs on our site for the products we no longer offer, we put up a message saying something like "sorry we don't offer this anymore but you may be interested in this.." and then link to our sister company with anchor text so that they can get some benefit from our SEO efforts if we can't. The question/issue is how we should do that, since there will be a lot of pages from the same domain, about 20 pages, all linking to a few pages on a different domain. Should the anchor text be varied, unbranded, or branded? On the one hand, I think if we change up the anchor text used to link to another page many times from a single domain, that looks strange and transparent to Google. On the other hand, unbranded text would be the better descriptor for users since we are deep linking to the product, not the homepage of the other site.
Intermediate & Advanced SEO | edu-SEO
-
Re-Direct Users But Don't Affect Googlebot
This is a fairly technical question... I have a site which has 4 subdomains, each targeting a specific language. The brand owners don't want German users to see the prices on the French subdomain and are forcing users into a redirect to the relevant subdomain based on their IP address. If a user comes from a different country (i.e. the US), they are forced onto the UK subdomain. The client is insistent on keeping control of who sees what (I know that's a debate in its own right), but these redirects we're implementing to make that happen are really making it difficult to get all the subdomains indexed, as I think Googlebot is also getting redirected and is failing to do its job. Is there a way of redirecting users, but not Googlebot?
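One common pattern, offered here with the caveat that treating Googlebot differently edges toward cloaking, is to exempt verified crawlers from the geo-redirect so each subdomain stays crawlable in place. A sketch of the decision logic only, with hypothetical names:

```python
# Hypothetical decision logic for a geo-redirect that exempts crawlers.
CRAWLER_TOKENS = ("Googlebot", "Bingbot", "DuckDuckBot")

def should_geo_redirect(user_agent, visitor_country, site_country):
    """Redirect mismatched human visitors, never crawlers.

    In production, harden the crawler check with a reverse-DNS lookup
    (genuine Googlebot IPs resolve under googlebot.com / google.com),
    since user-agent strings are trivially spoofed.
    """
    if any(token in user_agent for token in CRAWLER_TOKENS):
        return False  # let crawlers index each subdomain where it lives
    return visitor_country != site_country

# A German visitor on the French subdomain gets redirected...
print(should_geo_redirect("Mozilla/5.0 ...", "DE", "FR"))  # True
# ...but Googlebot does not.
print(should_geo_redirect("Mozilla/5.0 (compatible; Googlebot/2.1)", "DE", "FR"))  # False
```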
Intermediate & Advanced SEO | eventurerob
-
.com ranking over other ccTLDs that were created
We had an ecommerce website that used to function as the website for every other locale we had around the world. For example, the French version was Domain.com/fr_FR/, and a German version in English would be Domain.com/en_DE/. Recently we moved all of our larger international locales to their corresponding ccTLDs, so now we have Domain.fr and Domain.de. (This happened about two months ago.) The problem with this is that we are getting hardly any organic traffic and sales on these new ccTLDs. I am thinking this is because they are new, but I am not positive. If you compare the traffic we used to see on the old domain versus the traffic we see on the new domains, it is a lot less. I am currently going through to make sure that all of the old pages are not up, and the next thing I want to know is: for the old pages, would it be better to use a 301 redirect or a rel=canonical to the new ccTLD to avoid duplicate content and keep those old pages from outranking our new pages? Also, what are some other causes for our traffic being down so much? It just seems that there is a much bigger problem, but I don't know what it could be.
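For a permanent move like this, a 301 is generally the right tool; rel=canonical is only a hint and leaves the old URLs live. The mapping from the old locale paths to the new ccTLD hosts is mechanical, as in this sketch with hypothetical domains:

```python
from typing import Optional

# Hypothetical map from old locale path prefixes to the new ccTLD hosts.
LOCALE_TO_CCTLD = {
    "/fr_FR/": "https://domain.fr/",
    "/en_DE/": "https://domain.de/",
}

def redirect_target(old_path: str) -> Optional[str]:
    """Return the 301 target on the new ccTLD, or None if no mapping applies."""
    for prefix, new_host in LOCALE_TO_CCTLD.items():
        if old_path.startswith(prefix):
            return new_host + old_path[len(prefix):]
    return None

print(redirect_target("/fr_FR/products/widget"))  # https://domain.fr/products/widget
```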
Intermediate & Advanced SEO | DRSearchEngOpt