Best way to link to 1000 city landing pages from the index page in a way that Google follows/crawls these links (without building country pages)?
-
Currently we have direct links to the top 100 country and city landing pages on the index page of our root domain.
I would like to add, for each country on the index page, a "more cities" link that dynamically loads (without reloading the page and without redirecting to another page) a list of links to all cities in that country.
I do not want to dilute the "link juice" flowing to my top 100 country and city landing pages on the index page.
I would still like Google to be able to crawl and follow these links to cities that I load dynamically later. In this particular case, the typical site hierarchy of country pages with links to all cities is not an option.
Any recommendations on how best to implement?
-
Thanks, Dimitrii.
-
It wouldn't matter if the links are in a hidden div or not. As long as they are in the code, it will "dilute".
-
Dimitrii, thanks a lot. Yes, all would be followed links.
I guess my main concern is the dilution of link juice. In several places I have read that optimizing site architecture for "link juice" is supposedly a concept of the past. But from my understanding of how Google values links, it makes sense that linking to 1,000 instead of 100 core landing pages from the index page could affect the rankings of my core landing pages.
My other doubt is whether adding links in a hidden DIV would affect dilution. I guess that answer can only come from somebody who has added lots of links in a hidden DIV and monitored whether or not it had a noticeable impact on rankings.
-
So, even though Google supposedly can crawl AJAX, there are always troubles with it. Check this article: http://searchengineland.com/can-now-trust-google-crawl-ajax-sites-235267
And here is the thing about dilution: if Google can indeed crawl AJAX, the outcome would be the same in terms of dilution, since it would be able to see all the links either way. Here is another question: are all of those links followed?
-
Technically, there would basically be two options:
- JavaScript: a hidden DIV
- AJAX (loading the list only once the user clicks) => here I understand that Google can follow AJAX links, but it does so much more inconsistently, since following AJAX links requires more of Google's resources.
Ideally I would like to use JavaScript (a hidden DIV), but I am concerned about diluting "link juice" away from my top 100 landing pages. Would this be a relevant concern?
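A rough sketch of the hidden-DIV option (the country/city names and URL patterns here are placeholder examples, not a real site structure). The key point is that the links are present in the delivered HTML source, so crawlers can see them, while the list stays out of view until the user clicks:

```html
<a href="/germany/">Germany</a>
<button onclick="document.getElementById('de-cities').hidden = false;">
  more cities
</button>
<!-- Already in the initial HTML, so crawlers find the links in the
     source; the hidden attribute only affects what the user sees. -->
<ul id="de-cities" hidden>
  <li><a href="/germany/berlin/">Berlin</a></li>
  <li><a href="/germany/hamburg/">Hamburg</a></li>
  <!-- ...remaining cities... -->
</ul>
```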
-
Howdy.
So, if you want those links to be crawlable, they have to actually be present in the code of the page. They can be invisible to the user, but they have to be "visible" to crawlers, meaning you can find them in the page source. So, what you can do is have a "see more" button which simply reveals content that is already loaded. Another way is to have a completely separate page, not reachable by users, with all the links on it and a noindex, follow meta robots tag.
I don't see other ways though. Hope this helps.
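The second suggestion (a separate page holding all the links, kept out of the index) could look something like this sketch; the page path and link targets are hypothetical:

```html
<!-- Hypothetical page, e.g. /all-cities.html, not linked in user
     navigation. noindex keeps this page out of the index; follow
     lets bots pass through the links on it. -->
<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex, follow">
  <title>All cities</title>
</head>
<body>
  <ul>
    <li><a href="/germany/berlin/">Berlin</a></li>
    <li><a href="/germany/hamburg/">Hamburg</a></li>
    <!-- ...all 1,000 city pages... -->
  </ul>
</body>
</html>
```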