We just fixed a Meta refresh, unified our link profile and now our rankings are going crazy
-
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening.

Here is what happened: We recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted towards one URL, and our internal links were counting towards the other URL. On top of that, our most authoritative URL, because it was subject to the meta refresh, was not passing any of its authority to our other pages.

Here is what happened to our link profile:

Total External Links: Before - 2,757 After - **4,311**
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532

Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking in spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms at the bottom of page 1.

So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third column shows the change. For obvious reasons I haven't included the keywords themselves.

11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22

Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. And the further down a keyword was to begin with, the further it seems to have dropped.

What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!
-
Totally agree,
Have seen this a few times in the past.
Major SEO changes, big drop in rankings for 2-3 weeks. Then rankings gradually return.
@Dana: Keep us posted, I'm curious to see if in a few weeks' time things have improved.
-
Thanks Dr. Pete. I know this is pushing the boundaries of normal Q&A. I appreciate your answer. Yes, one thing at a time I think is a good way to go. I suggested that we try the mod_pagespeed rewrite on the dev site as a first step. I think it would probably be more efficient for us to hire a developer proficient in SEO to handle some of the more technical items. Thanks again!
-
Sorry, I'm not really clear on what the question is - these seem like general IT items unrelated to the SEO problem. The JS rewrites definitely can be tricky and depend completely on the code in question - I can't really provide a general resource.
Not sure how the alias domains tie in, but they definitely need to be part of any redirection scheme. I've used mod_rewrite for pretty large-scale stuff (as do many large sites), but it's possible to write bad/slow rules. It really depends on the scope. I'm not sure if you're talking about 100s or 1000s (10000s, etc.) of pages. Writing the rules for a big site is beyond the scope of any general Q&A. That's something your team is going to have to really dig deep into.
I feel like they might be over-thinking this one issue and trying to fix everything all at once, but I can't say that confidently without understanding the situation. I think it might be better to tackle these things one at a time.
-
Dr. Pete, our IT manager responded to my request. Can you point me in the right direction to research these things? (I am copying and pasting directly from his message): "A few items that I noticed just skimming the forums that we will need to look at a little closer are:
- JavaScript that is self-referencing, as both the tab control and the slide show are self-referencing
- Alias domains, which we have a number of
- HTTPS pages, which for us is all pages, depending on when a person logs in."
I found info in the GW forum about the mod_pagespeed rewrite module and sent that to him.
He responded "We are currently using mod_rewrite to handle a number of things including 301 redirection. My experience with mod_rewrite does have me very cautious, because it is very easy to “blow up” the site. I would want to run this on the dev site for some time with a concerted testing effort to make sure we do not have issues."
Any references you can recommend would be great. Thank you so much!
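For reference, my (very non-developer) reading of the mod_pagespeed docs suggests the dev-site trial would start with something roughly like this; it's only a sketch based on my understanding of the directives, so the specifics are definitely for our IT manager to verify:

```apache
<IfModule pagespeed_module>
    # Enable the module on the dev vhost only while we test
    ModPagespeed on
    # Start with a conservative filter set aimed at the inline JavaScript bloat
    ModPagespeedEnableFilters combine_javascript,defer_javascript
</IfModule>
```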
-
It's just one of those things where you're always going to be wondering if the bloated code is causing problems, and it's going to drive you nuts. Fix it, and worst case, you'll rule out a cause. Some days, that's the best we can do.
-
Agreed. I worked at another company that had a 19-year-old kid split out the JS. I submitted the request. I'll let you know what happens. Thanks again!
-
I can't prove it would cause substantial improvement, but right now it's just in your way, and you'll never know. To me, that kind of clean-up is a no-brainer, because it's no risk. At worst, it cleans up the code, improves caching (and load times as you said), and makes updating easier. At best, you see measurable gains.
As a former developer and dev-team manager, I have to say, too, that it's not a tough fix to split out that JS. It would probably make the dev team's life easier down the road. If they're acting like it's a Herculean task, then either (1) they just don't want to do it, or (2) you need a better dev team.
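To make it concrete, the change is usually nothing more than this kind of move (a simplified sketch, obviously not your actual code; the file name is just a placeholder):

```html
<!-- Before: the tab-control and slideshow code sits inline in every page -->
<script type="text/javascript">
  /* ...a few hundred lines of JavaScript... */
</script>

<!-- After: the same code lives in one external, cacheable file -->
<script type="text/javascript" src="/js/site.js"></script>
```

One file, referenced everywhere and cached by the browser, and the HEAD gets dramatically shorter.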
-
Thanks Dr. Pete. The marketing team has been complaining for years about how far the meta tags, etc. are pushed down in our code. Unfortunately, there hasn't been enough evidence that this is doing us any harm, so it's never been a priority to fix. I believe moving those lines of JS to an external file would, if nothing else, improve our page speed, wouldn't it? If our pages load faster, it could impact our SEO in a positive way.
Thanks again very much for your suggestions!
-
Yeah, the canonical should be ok - I just wanted to make sure you had something in place. One minor thing - I'd get that up on the page - with all the JS, the canonical is down on line 436 of the source code. You'd really be better off getting all that script into external files. It shouldn't make a big ranking difference, but it won't hurt.
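For what it's worth, the top of the HEAD would ideally read something like this (a generic sketch, not your exact markup; the /js/site.js name is just a placeholder), with the canonical visible in the first few lines and the heavy script pulled out:

```html
<head>
  <title>(existing page title)</title>
  <meta name="description" content="(existing page description)" />
  <link rel="canonical" href="http://www.ccisolutions.com/" />
  <!-- external script references come after the important meta data -->
  <script type="text/javascript" src="/js/site.js"></script>
</head>
```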
You do have a dozen pages that share your home-page TITLE and META description. Some seem to be odd near-duplicates, while others probably just have duplicate metadata. Either way, I'd clean that up. Run this query in Google to see them:
site:ccisolutions.com intitle:"Acoustics, Sound, Lighting"
...or check Webmaster Tools (or your SEOmoz campaigns). Again, it probably isn't the culprit, but it's not helping.
I'd really dig to see if anything else is going on. The timing could just be coincidence. I find it really hard to believe that the META refresh change alone harmed you, unless this is just a temporary bounce while Google sorts it out. I definitely would NOT put it back - you risk compounding the problem. People rush to reverse things, assuming that will take them back to where they were, and it rarely does. More than 70% of the time, it just makes a bigger mess.
-
Thanks Dr. Pete. Here's the scoop, and I'm happy to provide the actual URLs so you can have a real view of the source code, etc.
The meta refresh was on our homepage, http://www.ccisolutions.com/, and it redirected to this URL:
http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain
We removed the meta refresh and added <link rel="canonical" href="http://www.ccisolutions.com/" /> to the head of both URLs.
Our IT Manager couldn't get a 301 redirect to work from http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain to http://www.ccisolutions.com, but in another Q&A thread Streamline Metrics mentioned that this really shouldn't matter as long as the canonical tag is properly set up, which I think it is.
What do you think? (and thanks very much!)
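In case it helps, my rough understanding is that the query string is what makes this redirect awkward (a plain Redirect directive only matches the path, not the query string), so the rule would need to be something along these lines; this is an untested sketch of an Apache mod_rewrite approach that our IT manager would have to verify on the dev site first:

```apache
# Untested sketch, assuming Apache with mod_rewrite enabled, in an .htaccess file at the web root
RewriteEngine On
# Only match the dispatcher path when the query string asks for the main page...
RewriteCond %{QUERY_STRING} ^iafAction=showMain$ [NC]
# ...and 301 it to the homepage; the trailing "?" drops the old query string
RewriteRule ^StoreFront/IAFDispatcher$ http://www.ccisolutions.com/? [R=301,L]
```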
-
I tend to agree that it could just be a short-term re-evaluation period, but I do understand that patience is hard to come by in these situations. I have one concern - I assume the META refresh was acting as some kind of redirect to a different URL? When you removed it, did you canonical the other URL somehow? Just removing the refresh wouldn't consolidate the link "juice" of the various URLs, so it could be that you went from one form of fragmentation to another, different form.
That's just speculation, since I don't fully understand the old/new setups. If you can provide some details with fictional URLs, we might be able to dig in deeper.
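(For anyone following along, the kind of homepage META refresh I mean generally looks something like this; purely a generic illustration, not this site's actual markup:)

```html
<!-- Generic illustration only: a homepage that instantly "refreshes" visitors to another URL -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/StoreFront/SomeOtherPage" />
```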
-
Yes Paul. I agree. I have seen wild fluctuations on other sites that went through big changes. I believe this is probably an example of a time when we have to hang in there and ride through "The Dip."
"Time, Patience and Intelligent Work" is my mantra....but I also have to convince my CEO that the $1,000 we just spent fixing the meta refresh was actually a good thing. Rankings sinking like this aren't helping me make my case.
If and when I hear anything from Google, I'll let you and Bryan know.
I'm sure we aren't the only ones who've fixed something technical that repaired a fragmented link profile. It sure would make me feel better to hear someone say "Yes, a similar thing happened to me and now we're rocking it!" LOL - well, you can't blame a girl for dreaming!
-
I'll just add, Dana, that this major a change to the site will often cause massive ranking fluctuations as the crawlers work through the site and consolidate what's going on.
Small comfort, but a week really isn't long enough for things to have settled out to the "new normal". It's a good idea to keep looking for issues, but I'd also hold my breath for another week or two (or three) to see what happens as the dust settles. I know it goes against the grain to wait & see, but in this case I really think it's warranted.
Good luck, and keep breathing
Paul
-
Thanks Bryan. Yes, I took your advice and filed a reconsideration request just now. I spelled out exactly what happened with the whole meta refresh fix. This site has so many technical SEO problems that I am just hoping that it's not a completely different problem being caused by something else. I'll let you know what/if I hear anything.
I'd sure love to hear from any other SEOs out there who've ever been in similar situations!
Thanks again.
-
Like I said, it can be many factors... Perhaps making such a drastic change looks like a spam attack...
Total External Links: Before - 2,757 After - **4,311**
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532

That more than doubled the link count. If you send Google a reconsideration request, they will look at your issue and probably help you solve it.
-
Thanks Bryan. Yes, I checked the link profile last night. Everything looks totally normal. Interestingly, nearly all of the links added to the new total were internal, not external, so I don't think the quality of the links is the issue; if anything, it's the quantity.
I don't think a reconsideration request would be appropriate in this instance because we have not been de-indexed. We are just being hit hard by the algo I think.
If that is the case, I would hope that over the next few weeks, as Google sees our internal links not changing so dramatically, things will settle down.
Any additional thoughts?
-
Perhaps consolidating the links ended up pointing a larger number, or a bigger ratio, of low-quality or non-relevant links at your site... Or maybe the anchor-text profile is now over-optimized; the loss can be due to many reasons. I would recommend checking the new link profile and making sure everything looks natural. If all is well and you are still not ranking, you can send Google a reconsideration request explaining what happened.