Internal structure update
-
How often does Google re-evaluate the internal linking structure of a website?
Thank you,
-
Again, yes and no to having to wait. I wrote an article that was on the 1st page (and still is) within just 4 hours. That was down to Google+.
With so many factors, it is hard to judge what has made a difference. Also be aware that a site can sometimes get a false bump for a short period after a major change, only to settle back again once Google has analysed it all - just be sure it wasn't something like this, as in 12 years I have never seen a change of structure alone make a dramatic difference.
Hope you get everything sorted.
-
I agree and I don't agree. The reason is that many months ago I made an internal linking structure change (without changing anything else) on my site and my ranking skyrocketed!
I then revised that change to make the structure even better, and since then I have never recovered that ranking - I have even dropped back to my old ranking...
That is why I believe the issue doesn't come from on-page factors; I think it is simply a matter of when Google decides to push the data live... (and I guess I was lucky enough to make my change last time right when they pushed that data live.)
In my opinion they are very slow... and try to make everything very confusing... but I think the key to ranking, once you have figured "everything out", is to wait and wait and wait...
-
I wouldn't have thought a redirect would have been the sole culprit. And don't forget, there are off-page factors as well, so if Google looks at your links, it may be that they need more work.
Don't forget that there are 200+ primary metrics that Google looks at, and each of those metrics has metrics of its own. For example, I listened to Matt Cutts talking about this some time ago, and he hinted that some carry as many as 4,000 individual rules. I'm not saying this is still the case, as that was a few years ago, but with Google working so much into its page quality algorithms, it makes sense that there are going to be multiple signals acting as triggers.
Hard to go into great detail about the SEO on your site without doing a thorough analysis.
Andy
-
I have checked everything and tried all the on-page factors, and I can guarantee they are all good.
The only thing that was wrong was a redirect from my subpages to my homepage (I only changed that a week ago and am waiting to see if it will change anything in terms of ranking). Do you think it will?
So far I have seen a change for my homepage on a highly competitive keyword (in 6 days I went from page 9 to page 6) but not for my subpages, and I am wondering if it will go any further in the next few weeks or if I will be stuck there until Google does its "FULL UPDATE".
-
Well, yes and no. If all that has changed is internal linking, then that might not have been enough to tip the scales for Google. There could be other on-page factors holding this back.
-
I agree, thank you.
However, does it seem bizarre or normal to you that a site's ranking hasn't changed even 6 months after its structure was redone?
-
Correct - they can only do this piecemeal. Especially so on larger sites, where a crawl can take place over a number of days. Even then, not every page is guaranteed to be crawled - certainly not initially.
-
I understand now - what you meant is that they don't push all the data live at once? Is that correct?
-
I know they crawl everything; all I was saying is that they don't do it all in one go.
-
Google crawls everything when they crawl a site. However, how many pages they crawl each day is a different story.
Google then ranks a site according to external and internal structure... external is done "live", internal is calculated fairly quickly (at least I think so) but "pushed live" very rarely...
Does anyone have any information on that (time frame)?
I am probably asking a question that only someone working for Google can answer... but I am still giving it a try in case someone has had luck and can answer this multi-million-dollar question.
-
When Google crawls a site, they don't necessarily come along and crawl everything in one go, and they have never released (to my knowledge) any specific time frames relating to this.
However, what I would suggest is sharing a few pages on Google+. This seems to send a spider along pretty quickly and could instigate a full crawl over a shorter space of time.
Andy
Related Questions
-
What is the best way to structure website URLs?
Hi, can anyone help me understand whether having a category folder in the URL matters or not? How does Google treat a URL? For example, I have the URL www.protoexpress.com/pcb/certification, but I'm not sure whether Google will treat it as a whole or in separate parts. If in separate parts, is it safe to use pcb/pcb-certification, or would that be considered keyword stuffing? Thank you in anticipation.
Intermediate & Advanced SEO | SierraPCB
-
URL change - Sitemap update / redirect
Hi everyone. Recently we performed a massive, hybrid site migration (CMS, URL, and site structure change) without losing any traffic (yay!). Today I found out that our developers and copywriters decided to change some URLs (the pages are the same) without notifying anyone (I'm not going into details why). Anyhow, some URLs in the sitemap changed, so the old URLs don't exist anymore. Here is an example:
OLD (in sitemap, indexed): https://www.domain.com/destinations/massachusetts/dennis-port
NEW: https://www.domain.com/destinations/massachusetts/cape-cod
You should also know that a number of redirects have happened in the past (whole site). Over the last couple of years: HTTP to HTTPS, non-www to www, trailing slash to no trailing slash. Most recently (a month ago): the site migration redirects (URL / site structure change).
I could add the new URLs to the sitemap and resubmit in GSC. My dilemma is what to do with the old URLs. We already have a ton of redirects, and adding another one is not something I'm in favor of because of redirect chains and issues that can affect our SEO efforts. I would suggest changing the original, most recent 301 redirects to point to the new URL (pre-migration URL 301 redirected straight to the newly created URL). The goal is not to send mixed signals to search engines and not to lose visibility. Any advice? Please let me know if you need more clarification. Thank you
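For illustration only, here is a minimal sketch of that approach in an Apache .htaccess file, using the example URLs from the question above; the pre-migration path is a made-up placeholder, and it assumes mod_alias is enabled:
```apache
# Hypothetical pre-migration URL that previously 301ed to /destinations/massachusetts/dennis-port.
# Point it straight at the final destination so crawlers follow a single hop instead of a chain.
Redirect 301 /old-structure/dennis-port https://www.domain.com/destinations/massachusetts/cape-cod

# The recently migrated URL also gets its own 301 to the renamed page.
Redirect 301 /destinations/massachusetts/dennis-port https://www.domain.com/destinations/massachusetts/cape-cod
```
The idea is simply that every legacy URL, however old, ends up pointing directly at the current live URL in one step.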
Intermediate & Advanced SEO | bgvsiteadmin
-
Did We Implement Structured Data Correctly?
Our designer/developer recently implemented structured data on our pages. I'm trying to become more educated on how it works, since I'm the SEO marketing specialist on the team and the one who writes and publishes the majority of our content. I'm aware it's extremely important and needs to be done; I just don't know how to do it yet. The developer was on our team for over a year, and we recently let him go. Now I'm going through all the pages to make sure it's done correctly. I'm using the structured data testing tool to look at the pages and have been playing with the structured data markup helper. I would REALLY appreciate it if one of my fellow Moz fans and family could help me determine whether it's done correctly.
We do not currently have any schema plugins installed that I know of, so I'm not sure how he implemented the schema code. I would like to know what I need to do moving forward for the additional content we publish, as well as how to correctly implement schema if it isn't already.
When I manually look at one of our FAQ pages, I see multiple schema data formats detected. I'm not sure if we're supposed to have multiple or just one: https://www.screencast.com/t/TjHphL7jsI
I also noticed that in the Question schema data for that same page, the accepted answer is empty. I would imagine it should contain the short version of the answer to the question: https://www.screencast.com/t/e6ppXkhXd7QS
Here's a screenshot of our structured data info from Google Search Console: https://www.screencast.com/t/KHj4BGgdrZ4m
HELP please! Our website consists of 25-30 "product" pages:
https://www.medicarefaq.com/medigap/
https://www.medicarefaq.com/medicare-supplement/
https://www.medicarefaq.com/medigap/plan-f/
https://www.medicarefaq.com/medicare-supplement/plan-f/
We currently have about 75 FAQ pages and are adding 4-6 per month. This is what brings in most of our traffic:
https://www.medicarefaq.com/faqs/2018-top-medicare-supplement-insurance-plans/
https://www.medicarefaq.com/faqs/2018-medicare-high-deductible-plan-f-changes
https://www.medicarefaq.com/faqs/medicare-guaranteed-issue-rights
We have 100 state-specific pages (two for each state):
https://www.medicarefaq.com/medicare-supplement/florida/
https://www.medicarefaq.com/medigap/florida/
https://www.medicarefaq.com/medicare-supplement/California/
https://www.medicarefaq.com/medigap/California/
We have 20ish carrier-specific pages:
https://www.medicarefaq.com/medicare-supplement/humana/
https://www.medicarefaq.com/medicare-supplement/mutual-of-omaha/
Then we have about 30 blog pages so far and are publishing new blog posts weekly:
https://www.medicarefaq.com/blog/average-age-retirement-rising/
https://www.medicarefaq.com/blog/social-security-benefit-increase-announced-2018
https://www.medicarefaq.com/blog/new-california-bill-force-drugmakers-explain-price-hikes
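As a point of reference only, here is a minimal sketch of a FAQPage JSON-LD block with the acceptedAnswer populated - the question and answer text are invented placeholders, not the markup actually on the site:
```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is Medicare Supplement Plan F?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Plan F is a Medigap plan that helps cover costs Original Medicare leaves behind, such as deductibles and coinsurance."
      }
    }
  ]
}
```
In valid FAQ markup, each Question carries its answer in acceptedAnswer, so an empty acceptedAnswer on the live pages is worth investigating.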
Intermediate & Advanced SEO | LindsayE
-
301 forwarding old URLs to new URLs - when should you update the sitemap?
Hello Mozzers, If you are amending your URLs - 301ing to new URLs - when in the process should you update your sitemap to reflect the new URLs? I have heard some suggest you should submit a new sitemap alongside the old one to support indexing of the new URLs, but I've no idea whether that advice is valid or not. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Making sense of MLB.com domain structure
Although the subject of subdomains has been discussed quite often on these boards, I have never found a clear answer to something I am pondering. I am about to launch a network of 8 to 10 related sites - all sharing the same concept, layout, etc., but each site possessing unique content. My concept will be somewhat similar to how MLB.com (Major League Baseball) is set up. Each of the 30 teams in the league has its unique content as a subdomain. My goal in the initial research was to find the answer to this question: "Do the subdomains of a network contribute any increased value to the root domain?" As I was trying to find the answer and analyzing how MLB.com did it, I began to notice some structure that made very little sense to me, and I am hoping an expert can explain why they are doing it the way they are. Let me try to illustrate:
Root domain = http://mlb.com (actually redirects to http://mlb.mlb.com/index.jsp). This root domain serves universal content that appeals to all fans of the league and also acts as a portal to the other subdomains from the main navigation.
Subdomain example = http://tampabay.rays.mlb.com/index.jsp
Already there are a couple of questions:
1. Why does MLB.com redirect to http://mlb.mlb.com/index.jsp? Why the mlb subdomain?
2. Why two subdomains for tampabay.rays.mlb.com/index.jsp? Why not just make the subdomain "tampabayrays", "newyorkmets", "newyorkyankees", etc.?
Here is where things get a little more complicated and confusing for me. From the home page, if I click on an article about the San Francisco Giants, I was half expecting to be led to content hosted on the http://sanfrancisco.giants.mlb.com subdomain, but instead the URL was: http://mlb.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb
I can understand the breakdown of this URL:
ymd = year, month, date
content_id = identifies the content
vkey = news_mlb (clicked from the news section found on the mlb subdomain)
c_id = mlb (?)
Now, if I go to the San Francisco Giants page, I see a link to the same exact article, but the URL is this: http://sanfrancisco.giants.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_sf&c_id=sf
It gets even stranger: when I went to the Chicago Cubs subdomain, the URL to the same exact article does not even link to the general mlb.mlb.com content; instead the URL looks like this: http://chicago.cubs.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_mlb&c_id=mlb
When I looked at the header of the http://chicago.cubs.mlb.com URL, I could see the og:url set to http://sanfrancisco.giants.mlb.com/news/article.jsp?ymd=20121030&content_id=40129938&vkey=news_sf&c_id=sf, but I did not see anything relating to rel=canonical. I am sure there is a logical answer to this, as the content management for a site like MLB.com must be a real challenge. However, it seems that they would have some major issues with duplicate content.
So aside from MLB's complex structure, I am also still searching for the answer to my initial question, which is: "Do the subdomains of a network contribute any increased value to the root domain?" For example, does http://tampabay.rays.mlb.com/index.jsp bring value to http://mlb.com? And what if the subdomain is marketed as http://raysbaseball.com and then redirected to the subdomain? Thanks in advance.
Intermediate & Advanced SEO | bluelynxmarketing
-
Handful of internal pages Penguin-penalized. 302 them or let them 404?
We have a site that is for the most part doing great, but the internal pages that received too much link building picked up some Penguin penalties (no warning in WMT, but it's fairly obvious). Has anyone tried letting these pages 404 and just creating new URLs? Or 302 redirecting the old URLs to new ones?
Intermediate & Advanced SEO | iAnalyst.com
-
Best way to migrate to a new URL structure
Hello everyone, We're changing our URL structure from something like this: example.com/index.php?language=English
To something like this: example.com/english/index.php
The change is implemented with mod_rewrite, so all the old URLs still work. We have hundreds of thousands of pages that are currently indexed with the old URL structure. What's the best way to get Google to rapidly update its index and to maintain as much ranking as possible?
1. 301 redirect all the old URLs to the new equivalent format?
2. If we detect that a URL is in the old format, render the page with a canonical tag pointing to the new equivalent format, as well as adding a noindex, nofollow tag?
3. Something else?
Thanks for your input!
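If option 1 is chosen, a minimal sketch of what that 301 could look like in an Apache .htaccess file, using the example URLs above (one rule per language value; it assumes mod_rewrite is enabled and is an illustration rather than the site's actual configuration):
```apache
RewriteEngine On
# Old: /index.php?language=English  ->  New: /english/index.php
RewriteCond %{QUERY_STRING} (?:^|&)language=English(?:&|$) [NC]
# The trailing "?" drops the old query string; R=301 makes the redirect permanent.
RewriteRule ^index\.php$ /english/index.php? [R=301,L]
```
A rule like this would be repeated for each language (or generalised with a RewriteMap in the server config), and the XML sitemap would then list only the new path-style URLs.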
Intermediate & Advanced SEO | anthematic
-
Panda Update - Challenge!
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas. Here are a few details to give you some background. The site is a very nice-looking (2.0) website with good content. Basically, they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place. I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues. The only thing I can come up with is that it is either:
1. Something off-page related to links, or
2. Related to the font descriptions - maybe they are getting copied and pasted from other sites... and they don't look like unique content to Google.
If anyone has ideas or would like more info to help, please send me a message. I greatly appreciate any feedback. Thank you, friends! LHC
Intermediate & Advanced SEO | lhc67