Does frequency of content updates affect the likelihood that outbound links will be indexed?
-
I have several pages on our website with low PR that themselves link to lots and lots of service/product-specific pages. Since there are so many outbound links, I know the small amount of PR will be spread thin as it is. My question is: if I were to supply fresh content to the top-level pages, and change it often, would that influence whether or not Google indexes the underlying pages? Also, if I supply fresh content to the underlying pages, once Google crawls them, would that guarantee that Google considers them 'important' enough to be indexed?
I guess my real question is: can freshness of content and frequency of updates convince Google that the underlying pages are 'worthy of being indexed', and can producing fresh content on those pages 'keep Google's interest', so to speak, despite their having little if any PageRank?
-
Hello Ilya,
There are several good responses here, and I think some of them would depend on how large your site is and what types of pages these are. Judging by your URL example below, I'm guessing it is real estate related, or at least that you have localized pages in different geographic areas.
You have a few issues here. First, this video might help, but it is somewhat outdated and misleading in some ways. There may not be a set limit (i.e. "we're only going to index 10k pages"), but how much of your site gets indexed, and how often it gets crawled, is based largely on the quality of your site (assuming all other factors are in place, such as sitemaps, crawlable navigation, etc.). And the quality of your site depends on many, many different factors. The two most important for this discussion would probably be the uniqueness/usefulness of the content, and the number of links pointing to the site, to its sections, and to the deep pages themselves.
The more links you can get into those deep pages, the more likely it is that Google will crawl them more often and index them. You said you "can't" get links into those pages. If you can't get links into them, they probably aren't "quality", and therein lies your problem.
If by "can't" you just mean there isn't enough time in the day for you to build links into ALL of these pages, you can still build links into as many as you can. This will get the bots crawling down to that level of your site more often, and make it more likely that this level of your site will be indexed.
Here is another useful link, although it is dated as well:
http://www.seomoz.org/blog/googles-indexation-cap

Having fresh content (with a fresh "last modified" date) usually does, in my experience, entice Googlebot to come back more often. Does that translate into "indexing" more pages? I don't know. But I do know that having better content and more links into those inner pages does translate into more indexation, and not just for the pages linked to externally, but for that entire section/folder/directory of your site.
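As a quick way to sanity-check the "last modified" signal a crawler actually sees, here is a minimal sketch (Python; the URL is only a placeholder) that prints the Last-Modified and ETag headers a page returns:

```python
# Minimal sketch (not from the thread): check the freshness signals a crawler
# would see for a page. The URL below is only a placeholder.
from urllib.request import Request, urlopen

def freshness_headers(url):
    req = Request(url, method="HEAD", headers={"User-Agent": "freshness-check/0.1"})
    with urlopen(req) as resp:
        return {
            "status": resp.status,
            "Last-Modified": resp.headers.get("Last-Modified"),
            "ETag": resp.headers.get("ETag"),
        }

if __name__ == "__main__":
    print(freshness_headers("https://www.example.com/ny/new-york"))
```

If the server never updates Last-Modified (or the sitemap's lastmod), the "fresh content" story is a harder sell to a crawler, whatever the visible page says.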
Consider user-generated content on those pages if you can. A lot of VERY popular review and real estate sites' deep pages would go unindexed without it.
-
We shouldn't confuse a query that deserves freshness (QDF) with enticing Google to recrawl a page or set of pages by giving them fresh content. Maybe I read your response wrong, but those are two different things. QDF would apply, for instance, if you were writing an article right now about the nuclear disaster in Japan; not if you were updating a page from three years ago about how to lose weight after pregnancy, or how to optimize a webpage.
-
From my experience, adding fresh content on a regular basis, even when the pages are rather empty, will make Google crawl your website more and more. As the crawl budget gets bigger, deeper pages will be crawled.
Although I have never worked on a case similar to yours, I would suggest adding fresh content on a regular basis and linking those new pages from the homepage to get them crawled ASAP. Put internal links to the pages you want crawled in those new pages, if they are relevant.
-
Not as much. You may have to engineer some process for feed generation. The idea is to have the content in RSS and help it propagate through things like ping services.
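To make the feed idea concrete, here is a rough sketch of generating a minimal RSS 2.0 feed from a list of recently updated deep pages. The domain, titles, and URLs are placeholders, and how much value ping services still add is debatable:

```python
# Rough sketch: build a minimal RSS 2.0 feed from recently updated deep pages,
# so crawlers (and any ping/aggregation services) can discover fresh URLs.
# All URLs and titles below are placeholders.
from email.utils import format_datetime
from datetime import datetime, timezone
import xml.etree.ElementTree as ET

def build_rss(pages):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Recently updated pages"
    ET.SubElement(channel, "link").text = "https://www.example.com/"
    ET.SubElement(channel, "description").text = "Fresh city-level pages"
    for url, title, updated in pages:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "pubDate").text = format_datetime(updated)
    return ET.tostring(rss, encoding="unicode")

pages = [
    ("https://www.example.com/ny/new-york", "New York listings",
     datetime.now(timezone.utc)),
]
print(build_rss(pages))
```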
-
It can. As Rand has said in the past, results deserve freshness; that is, results seem to always include a few such pages.
-
saibose... do you think a service like Linklicious (link -> RSS) would work?
-
The 100-link figure is more of a guideline than a strict rule. Your first objective should be to enable the page to be indexed. Google's Query Deserves Freshness (QDF) algorithms will eventually pick up your URL; it's a matter of time as long as you are linking to that page from at least one page.
My advice would be to link it from more pages (if possible) and keep the content fresh.
Maybe you can even try the RSS idea as well.
-
I guess it would depend a little on how you're doing it; however, the best way to get Google to crawl your product pages is to get links directly to them from other sites that are crawled often / have authority. I would also suggest creating an XML sitemap and submitting it to Google if you haven't already.
If all your links are coming to your homepage (not uncommon in smaller sites), then Google is usually going to enter your site that way, and if there are a lot of links on the homepage and the site only has a little authority, then it has to prioritise how many and which pages to visit.
Having regular content updates may get Google to change which pages it crawls at any one time, though some of your other pages may then go longer between cache dates.
Ultimately, if your site structure is good enough, then you really need to work on building links to the product pages to regularly 'convince' Google to crawl them, though adding relevant content is one way of doing this.
-
Thank you guys.
Anthony, I am not sure I agree; indexing and crawling are two different things. I guess that is really what I'm getting at here. I can force Google to crawl my whole site daily (or almost daily) with RSS feeds, sitemaps, proper structure, frequent updates, etc. But WILL that freshness of content force Google to say, "hm... despite this page being very insignificant, it might be important enough to go into my index"?
Saibose, unfortunately I'm well beyond the 100-link limit. I am noticing that quite a few of the pages that ARE indexed ARE ranking, since they're well optimized on-page and they target extremely long-tail keyphrases. So my main goal is to convince Google to index these pages, because once I do, they will rank.
What I have done so far:
1. Made sure that the page is easily accessible from at least one page on the website.
2. Created a sitemap (a proper sitemap index and several underlying sitemap files; a rough sketch of generating one is below this list).
3. Submitted the sitemaps and increased the Google crawl rate (I noted Google is crawling around 1,700 pages/day on my site).
4. Made sure that the page is at most three levels deep (site/state/city; we're talking about city-level pages).
5. Created proper URLs (/site/state/city).
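For point 2, here is a rough sketch (not the poster's actual setup) of writing a sitemap index plus a child sitemap of /state/city URLs with lastmod dates; the domain, file names, and URLs are placeholders:

```python
# Rough sketch: write a sitemap index plus one child sitemap of /state/city
# URLs with <lastmod> dates, along the lines of the structure described above.
# Domain, file names, and URLs are placeholders.
from datetime import date
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_child_sitemap(path, urls):
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

def write_sitemap_index(path, sitemap_urls):
    index = ET.Element("sitemapindex", xmlns=NS)
    for loc in sitemap_urls:
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = loc
    ET.ElementTree(index).write(path, encoding="utf-8", xml_declaration=True)

city_urls = [("https://www.example.com/ny/new-york", date.today()),
             ("https://www.example.com/ca/los-angeles", date.today())]
write_child_sitemap("sitemap-cities-1.xml", city_urls)
write_sitemap_index("sitemap_index.xml",
                    ["https://www.example.com/sitemap-cities-1.xml"])
```

In practice the child sitemaps would be split so each stays under the protocol's 50,000-URL / 50 MB limits.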
I think maybe I misspoke. I am not doubting that Google will 'crawl' the page. What I am asking is: if I can't link externally to it, and the internal PageRank passed is very small, will adding fresh content and making Google think that the page gets updated frequently convince Google to index it? Does frequent crawling finally force indexing, or is it possible Google may say, "no matter how often you update this page, it's just NOT important enough for me to index it," if no one links to it from outside your site?
-
I think you are getting at the concept of continually updating the content on a few pages of your site to make sure they are indexed by Google. If the page is not indexed already, that means it likely isn't being crawled by Google at all, so changing the content on the page won't make much of a difference.
Instead, make sure the page you want indexed is easily found within the website's internal linking structure, preferably only a handful of clicks away from the homepage. An even better way to make sure the page is indexed is to get a few external links pointed at it. If you are simply trying to achieve indexation and not expecting the page to rank high in the SERPs, something as easy as bookmarking the page on a few websites and tweeting it once or twice will probably get the job done.
As for your comment on whether or not Google will consider your page 'important' enough to be indexed, I don't think you will have a problem with that as long as you are writing unique content.
-
The problem is very common for content-heavy websites where content lies somewhere way down the hierarchy.
I am considering or assuming a few things here:
1. The webpage you are referring to has already been crawled at least once.
2. It is accessible from at least one link on your homepage.
3. It does not have a huge number of outbound links; that is, no more than around 100 (within and outside your domain).
Your first task should be to get Google to crawl the page(s):
1. Get a tool like GSiteCrawler and crawl your entire website. Create and submit an XML sitemap of your website to Google Webmaster Tools. Create links from your pages that are already indexed to this page (or pages). That way, Googlebot will find its way eventually.
2. Update fresh content on the page. Create an RSS feed of the content updates, refresh it frequently, and serve it up front on the homepage or on an important page of your website (one which ranks well in Google).
All said, you have to wait and watch. There is no way you can forcefully ask Google to crawl your webpage. Also, updating your homepage content (just text, with no links to your deep pages) wouldn't help speed up the process. But it's good practice to keep your homepage content fresh so that Googlebot visits your website regularly and you get Google love.
Hope that answers your question.