Does frequency of content updates affect the likelihood that outbound links will be indexed?
-
I have several pages on our website with low PR that themselves link to lots and lots of service/product-specific pages. Since there are so many outbound links, I know the small amount of PR will be spread thin as it is. My question is: if I were to supply fresh content to the top-level pages and change it often, would that influence whether or not Google indexes the underlying pages? Also, if I supply fresh content to the underlying pages, once Google crawls them, would that guarantee that Google considers them 'important' enough to be indexed?
I guess my real question is: can freshness of content and frequency of updates convince Google that the underlying pages are 'worthy of being indexed', and can producing fresh content on those pages 'keep Google's interest', so to speak, despite their having little if any PageRank?
-
Hello Ilya,
There are several good responses here, and I think some of them would depend on how large your site is and what types of pages they are. Judging by your URL example below, I'm guessing it is real estate related or at least that you have localized pages in different geographic areas.
You have a few issues here. First, this video might help, but it is somewhat outdated and misleading in some ways. There may not be a set limit (i.e. "we're only going to index 10k pages"), but how much of your site gets indexed, and how often it gets crawled, is based largely on the quality of your site (assuming all other factors are in place, such as sitemaps and crawlable navigation, etc.). And the quality of your site depends on many, many different factors. The two most important for this discussion would probably be the uniqueness/usefulness of the content, and the number of links pointing to the site, to sections of the site, and to the deep pages themselves.
The more links you can get into those deep pages, the more likely it is that Google is going to crawl more often, and index those pages. You said you "can't" get links into those pages. If you can't get links into them, they probably aren't "quality" and therein lies your problem.
If by "can't" you just mean there isn't enough time in the day for you to build links into ALL of these pages, you can still build links into as many as you can. This will get the bots crawling down to that level of your site more often, and make it more likely that this level of your site will be indexed.
Here is another useful link, although it is dated as well:
http://www.seomoz.org/blog/googles-indexation-cap
Having fresh content (with a fresh "last modified" date) usually does, in my experience, entice Googlebot to come back more often. Does that translate into "indexing" more pages? I don't know. But I do know that having better content and more links into those inner pages does translate into more indexation, and not just for the pages linked to externally, but for that entire section/folder/directory of your site.
Consider user-generated content on those pages if you can. A lot of VERY popular review and real estate sites' deep pages would go unindexed without it.
-
We shouldn't confuse a query that deserves freshness (QDF) with enticing Google to recrawl a page or set of pages by giving them fresh content. Maybe I read your response wrong, but those are two different things. QDF would apply, for instance, if you were writing an article right now about the nuclear disaster in Japan; not if you were updating a page from three years ago about how to lose weight after pregnancy, or how to optimize a webpage.
-
From my experience, adding fresh content on a regular basis, even when the pages are rather empty, will make Google crawl your website more and more. As the crawl budget gets bigger, deeper pages will be crawled.
Although I never worked on a case similar to yours, I would suggest adding fresh content on a regular basis and linking to those new pages from the homepage to get them crawled ASAP. Put internal links to the pages you want crawled inside those new pages, if they are relevant.
-
Not as much. You may have to engineer some process for feed generation. The idea is to have the content in RSS and help it propagate through things like ping services.
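To make that concrete: "helping it propagate through ping" boils down to sending a GET request that tells a ping service where your fresh feed or sitemap lives. A minimal sketch in Python; the endpoint shown is purely illustrative (ping services differ and have changed over the years), so treat it as an assumption, not a recommendation:

```python
from urllib.parse import urlencode

def build_ping_url(endpoint, feed_url):
    """Construct the notification URL a ping service expects.
    The actual request would then be sent with any HTTP client
    (e.g. urllib.request.urlopen)."""
    return endpoint + "?" + urlencode({"sitemap": feed_url})

# Hypothetical usage -- endpoint and feed URL are illustrative only:
ping = build_ping_url("https://www.google.com/ping",
                      "https://example.com/feeds/updates.xml")
```

The point is simply that "ping" is nothing magical: one URL-encoded GET per fresh feed.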
-
It can. As Rand has said in the past, results deserve freshness; that is, results seem to always include a few such pages.
-
saibose... do you think a service like Linklicious (link -> RSS) would work?
-
The 100 links is more of a guideline than a strict rule. Your 1st objective should be to enable the page to be indexed. Google's Query Deserves Freshness (QDF) algorithms aside, Google will eventually index your URL; it's a matter of time as long as you're linking to that page from at least 1 page.
My advice would be to link it from more pages (if possible) and keep the content fresh.
Maybe you can even try the RSS idea as well.
-
I guess it would depend a little on how you're doing it; however, the best way to get Google to crawl your product pages is to get links directly to them from other sites that are crawled often / have authority. I would also suggest creating an XML sitemap and submitting it to Google if you haven't already.
If all your links are coming to your homepage (not uncommon on smaller sites), then Google's usually going to enter your site that way, and if there are a lot of links on the homepage and the site only has a little authority, then it has to prioritise how many and which pages to visit.
Having regular content updates may get Google to change which pages it crawls at any one time, though some of your other pages may then have longer cache dates.
Ultimately, if your site structure is good enough, then you really need to work on building links to the product pages to regularly 'convince' Google to crawl them, though adding relevant content is one way of doing this.
-
Thank you guys.
Anthony, I am not sure I agree; indexing and crawling are 2 different things. I guess that is really what I'm getting at here. I can force Google to crawl my whole site daily (or almost daily) with RSS feeds, sitemaps, proper structure, frequent updates, etc., but WILL that freshness of content make Google think, "hmm, despite this page being very insignificant, it might be important enough to go into my index"?
Saibose, unfortunately I'm well beyond the 100-link limit... I am noticing that quite a few of the pages that ARE indexed ARE ranking, since they're well optimized on-page and they target extremely long-tail keyphrases. So my main goal is to convince Google to index these pages, because once I do, they will rank.
What I have done so far:
1. Made sure that the page is easily accessible from at least 1 page on the website.
2. Created a sitemap (a proper sitemap index and several underlying sitemap files).
3. Submitted the sitemaps and increased the Google crawl rate (I noted Google is crawling around 1,700 pages/day on my site).
4. Made sure that the page is at most 3 levels deep (site/state/city). (We're talking about city-level pages.)
5. Created proper URLs (/site/state/city).
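As an aside for anyone implementing step 2 above (a sitemap index plus several underlying sitemap files), the generation can be sketched in a few lines; all the URLs and filenames here are hypothetical, and the fresh `<lastmod>` date is the part this whole thread is debating:

```python
from datetime import date

def build_sitemap(urls, lastmod=None):
    """Build one urlset sitemap file; <lastmod> carries the fresh
    modification date that may entice Googlebot to recrawl."""
    lastmod = lastmod or date.today().isoformat()
    entries = "".join(
        f"  <url><loc>{u}</loc><lastmod>{lastmod}</lastmod></url>\n" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</urlset>\n"
    )

def build_sitemap_index(sitemap_urls):
    """Build the top-level sitemap index pointing at the per-section files
    (e.g. one sitemap per state for a site/state/city structure)."""
    entries = "".join(
        f"  <sitemap><loc>{u}</loc></sitemap>\n" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</sitemapindex>\n"
    )
```

Split the deep city pages across the underlying files (the protocol caps each file at 50,000 URLs) and submit only the index to Webmaster Tools.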
I think maybe I misspoke. I am not doubting that Google will 'crawl' the page. What I am asking is: if I can't link externally to it, and the internal PageRank passed is very small, will adding fresh content and making Google think the page gets updated frequently convince Google to index it? Does frequent crawling finally force indexing, or is it possible Google may say, "no matter how often you update this page, it's just NOT important enough for me to index," if no one links to it from outside your site?
-
I think you are getting at the concept of continually updating the content on a few pages of your site to make sure they are indexed by Google. If the page is not indexed already, that means it likely isn't being crawled by Google at all, so changing the content on the page won't make much of a difference.
Instead, make sure the page you want indexed is easily found within the website's internal linking structure, preferably only a handful of clicks away from the homepage. An even better way to make sure the page is indexed is to get a few external links pointed at it. If you are simply trying to achieve indexation and not expecting the page to rank high in the SERPs, something as easy as bookmarking the page on a few bookmarking sites and tweeting it once or twice will probably get the job done.
As for your comment on whether or not Google will consider your page 'important' enough to be indexed, I don't think you will have a problem with that as long as you are writing unique content.
-
The problem is very common for content-heavy websites where content lies somewhere way down the hierarchy.
I am assuming a few things here:
1. The webpage you are referring to has already been crawled at least once.
2. It is accessible from at least one link on your homepage.
3. It does not have a huge number of outbound links; that is, it has around 100 or fewer (within and outside your domain).
Your 1st task should be to get Google to crawl the page(s):
1. Get a tool like GSiteCrawler and crawl your entire website. Create and submit an XML sitemap of your website to Google Webmaster Tools. Create links from your pages that are already indexed to this page (or pages). That way, Googlebot will find its way eventually.
2. Update fresh content on the page. Create an RSS feed of the content updates, refresh it frequently, and serve it up front on the homepage or another important page of your website (one which ranks well in Google).
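For the RSS feed of content updates, a minimal RSS 2.0 document is easy to generate yourself. A hedged sketch, assuming the updates are available as (title, URL, publish-date) tuples; every name and URL below is hypothetical:

```python
from datetime import datetime, timezone
from xml.sax.saxutils import escape

def build_rss(site_title, site_url, items):
    """Build a minimal RSS 2.0 feed from (title, url, published_datetime)
    tuples; <lastBuildDate> and per-item <pubDate> advertise freshness."""
    fmt = "%a, %d %b %Y %H:%M:%S GMT"
    now = datetime.now(timezone.utc).strftime(fmt)
    entries = "".join(
        "  <item>\n"
        f"    <title>{escape(title)}</title>\n"
        f"    <link>{escape(url)}</link>\n"
        f"    <pubDate>{pub.strftime(fmt)}</pubDate>\n"
        "  </item>\n"
        for title, url, pub in items
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n<channel>\n'
        f"  <title>{escape(site_title)}</title>\n"
        f"  <link>{escape(site_url)}</link>\n"
        f"  <lastBuildDate>{now}</lastBuildDate>\n"
        f"{entries}</channel>\n</rss>\n"
    )
```

Regenerate the feed whenever a deep page changes, and link to it from the homepage so the fresh dates are visible on every crawl.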
All said, you have to wait and watch. There is no way you can forcefully ask Google to crawl your webpage. Also, updating your homepage content (just text, with no links to your deep pages) wouldn't help speed up the process. But it's good practice to keep your homepage content fresh so that Googlebot visits your website regularly and you get Google love.
Hope that answers your question.