Internal structure update
-
How often does Google update its view of the internal linking structure of a website?
Thank you,
-
Again, yes and no to having to wait. I wrote an article that was 1st page (and still is) within just 4 hours. That was down to Google+.
With so many factors, it is hard to judge what has made a difference. Also be aware that sometimes a site can get a false bump for a short period of time after a major change, only to settle back again once Google has analysed it all - just be sure it wasn't something like this, as in 12 years I have never seen a change of structure alone make a dramatic difference.
Hope you get everything sorted
-
I agree and I don't agree. The reason is that many months ago I made an internal linking structure change (without changing anything else) on my site, and my ranking skyrocketed!
I then reworked that change to make the structure even better, and since then I have never recovered that ranking - even reverting to the old structure didn't bring it back...
That is why I believe the issue doesn't come from on-page factors. I think it is just a matter of when Google decides to push the data live... (and I guess I was lucky enough to make my change last time right when they pushed that data live...)
In my view they are very slow... and try to make everything very confusing... but I think the key to ranking, once you have figured "everything" out, is to wait and wait and wait...
-
I wouldn't have thought a redirect would be the sole culprit. And don't forget, there are off-page factors as well, so if Google looks at your links, it may be that they need more work.
Don't forget that there are 200+ primary metrics that Google looks at, and each of those metrics has metrics of its own. For example, I listened to Matt Cutts talking about this some time ago, and he hinted that some carry as many as 4,000 individual rules. I'm not saying this is still the case, as it was a few years ago, but with Google working so much into page-quality algorithms, it makes sense that multiple signals will act as triggers.
Hard to go into great detail about the SEO on your site without doing a thorough analysis.
Andy
-
I have checked everything and tried all the on-page factors, and I can guarantee they are all good.
The only thing that was wrong was a redirect from my subpages to my homepage (I changed that a week ago and am waiting to see if it will change anything in terms of ranking). Do you think it will?
So far I have seen a change for my homepage on a highly competitive keyword (in 6 days I went from page 9 to page 6), but not for my subpages, and I am wondering if it will go any further in the next few weeks or if I will be stuck there until Google does its "full update".
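For what it's worth, the usual advice when untangling a redirect like the one described is to send each subpage to its closest equivalent page rather than collapsing everything onto the homepage, which search engines often treat as a soft 404. A minimal sketch of the difference in Python (the paths and mapping here are made up for illustration, not taken from the thread):

```python
# Blanket redirect: every old path collapses to the homepage.
# Subpage relevance and link equity are effectively discarded.
def blanket_redirect(path: str) -> str:
    return "/"

# One-to-one redirects: each old subpage points at its true equivalent.
# These paths are hypothetical examples.
REDIRECT_MAP = {
    "/old/widgets": "/products/widgets",
    "/old/gadgets": "/products/gadgets",
}

def mapped_redirect(path: str) -> str:
    # Fall back to the homepage only when no equivalent page exists.
    return REDIRECT_MAP.get(path, "/")

print(mapped_redirect("/old/widgets"))  # -> /products/widgets
print(blanket_redirect("/old/widgets"))  # -> /
```

In a real deployment the same mapping would live in the server config (e.g. 301 rules), but the principle is the same: preserve the one-to-one relationship wherever a matching page exists.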
-
Well, yes and no. If all that has changed is internal linking, then that might not have been enough to tip the scales for Google. There could be other on-page factors holding this back.
-
I agree, thank you.
However, does it seem bizarre or normal to you that a site's ranking hasn't changed even 6 months after its structure was redone?
-
Correct - they can only do this piecemeal. Especially so on larger sites, where a crawl can take place over a number of days. Even then, not every page is guaranteed to be crawled - certainly not initially.
-
I understand now. What you meant is that they don't push all the data live at once? Is that correct?
-
I know they crawl everything; all I was saying is that they don't do it all in one go.
-
Google crawls everything when they crawl a site. However, how many pages they crawl each day is a different story.
Google then ranks a site according to external and internal structure. External is done "live"; internal is calculated fairly quickly (at least I think), but "pushed live" very rarely...
Does anyone have any information on that (the time frame)?
I am probably asking a question that only someone working for Google can answer... but I am still giving it a try in case someone has been lucky and can answer this multi-million-dollar question.
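The "internal structure" calculation being speculated about here is presumably something along the lines of PageRank computed over a site's internal link graph. A toy sketch of that idea in Python (the pages, links, and damping factor are all invented for illustration; this is not Google's actual algorithm or data):

```python
# Toy internal link graph: each hypothetical page lists the pages it links to.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
pages = list(links)
damping = 0.85  # standard damping factor from the original PageRank paper

# Start with rank spread evenly across pages.
rank = {p: 1.0 / len(pages) for p in pages}

# Power iteration: each page shares its rank across its outlinks.
for _ in range(50):
    new = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new[target] += share
    rank = new

print({p: round(r, 3) for p, r in rank.items()})
```

Because "home" receives links from every other page, it ends up with the largest score - which is why restructuring internal links can shuffle how authority flows through a site even when nothing else changes.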
-
When Google crawls a site, they don't necessarily come along and crawl everything in one go, and they have never released (to my knowledge) any specific time frames relating to this.
However, what I would suggest is share a few pages on Google+. This seems to send a spider along pretty quickly and could instigate a full crawl over a shorter space of time.
Andy