We just fixed a Meta refresh, unified our link profile and now our rankings are going crazy
-
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening.
Here is what happened: We recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted towards one URL, and our internal links were counting for the other URL. In addition, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages.
Here is what happened to our link profile:
Total External Links: Before - 2,757 After - 4,311
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532
Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking in spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms at the bottom of page 1. So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third shows the change. For obvious reasons I haven't included the keywords.
11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22
Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. The further down a keyword was to begin with, the further it seems to have dropped.
What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious-link warning. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!
-
Totally agree,
Have seen this a few times in the past.
Major SEO changes, big drop in rankings for 2-3 weeks. Then rankings gradually return.
@Dana: Keep us posted, I'm curious to see if in a few weeks' time things have improved.
-
Thanks Dr. Pete. I know this is pushing the boundaries of normal Q&A. I appreciate your answer. Yes, one thing at a time I think is a good way to go. I suggested that we try the mod_pagespeed rewrite on the dev site as a first step. I think it would probably be more efficient for us to hire a developer proficient in SEO to handle some of the more technical items. Thanks again!
-
Sorry, I'm not really clear on what the question is - these seem like general IT items unrelated to the SEO problem. The JS rewrites definitely can be tricky and depend completely on the code in question - I can't really provide a general resource.
Not sure how the alias domains tie in, but they definitely need to be part of any redirection scheme. I've used mod_rewrite for pretty large-scale stuff (as do many large sites), but it's possible to write bad/slow rules. It really depends on the scope. I'm not sure if you're talking about 100s or 1000s (10000s, etc.) of pages. Writing the rules for a big site is beyond the scope of any general Q&A. That's something your team is going to have to really dig deep into.
I feel like they might be over-thinking this one issue and trying to fix everything all at once, but I can't say that confidently without understanding the situation. I think it might be better to tackle these things one at a time.
-
Dr. Pete, our IT manager responded to my request. Can you point me in the right direction to research these things? (I am copying and pasting directly from his message): "A few items that I noticed just skimming the forums that we will need to look at a little closer are:
- JavaScript that is self-referencing, as both the tab control and the slide show are self-referencing
- Alias domains, which we have a number of
- HTTPS pages, which for us is all pages, depending on when a person logs in."
I found info in the GW forum about the mod_pagespeed rewrite module and sent that to him.
He responded "We are currently using mod_rewrite to handle a number of things including 301 redirection. My experience with mod_rewrite does have me very cautious, because it is very easy to “blow up” the site. I would want to run this on the dev site for some time with a concerted testing effort to make sure we do not have issues."
Any references you can recommend would be great. Thank you so much!
-
It's just one of those things where you're always going to be wondering if the bloated code is causing problems, and it's going to drive you nuts. Fix it, and worst case, you'll rule out a cause. Some days, that's the best we can do.
-
Agreed. I worked at another company that had a 19-year-old kid split out the JS. I submitted the request. I'll let you know what happens. Thanks again!
-
I can't prove it would cause substantial improvement, but right now it's just in your way, and you'll never know. To me, that kind of clean-up is a no-brainer, because it's no risk. At worst, it cleans up the code, improves caching (and load times as you said), and makes updating easier. At best, you see measurable gains.
As a former developer and dev-team manager, I have to say, too, that it's not a tough fix to split out that JS. It would probably make the dev team's life easier down the road. If they're acting like it's a Herculean task, then either (1) they just don't want to do it, or (2) you need a better dev team.
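To put that in concrete terms, the fix is essentially the before/after below. This is a minimal illustrative sketch, not the site's actual code, and the filename is invented:

```html
<!-- Before: inline script bloats the head and pushes TITLE/META/canonical far down -->
<head>
  <script>
    /* hundreds of lines of tab-control and slideshow code inline... */
  </script>
  <title>Page title</title>
</head>

<!-- After: the same code moved to one external, cacheable file; head content moves up -->
<head>
  <title>Page title</title>
  <script src="/js/site.js"></script>
</head>
```

The browser caches the external file across page views, which is where the load-time win comes from.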
-
Thanks Dr. Pete. The marketing team has been complaining for years about how far the meta tags, etc. are pushed down in our code. Unfortunately, there hasn't been enough evidence that this is doing us any harm, so it's never been a priority to fix. I believe moving those lines of JS to an external file would, if nothing else, improve our page speed, wouldn't it? If our pages load faster, it could impact our SEO in a positive way.
Thanks again very much for your suggestions!
-
Yeah, the canonical should be ok - I just wanted to make sure you had something in place. One minor thing - I'd get that up on the page - with all the JS, the canonical is down on line 436 of the source code. You'd really be better off getting all that script into external files. It shouldn't make a big ranking difference, but it won't hurt.
You do have a dozen pages that share your home-page TITLE and META description. Some seem to be odd near-duplicates, while others probably just have duplicate metadata. Either way, I'd clean that up. Run this query in Google to see them:
site:ccisolutions.com intitle:"Acoustics, Sound, Lighting"
...or check Webmaster Tools (or your SEOmoz campaigns). Again, it probably isn't the culprit, but it's not helping.
I'd really dig to see if anything else is going on. The timing could just be coincidence. I find it really hard to believe that the META refresh change alone harmed you, unless this is just a temporary bounce while Google sorts it out. I definitely would NOT put it back - you risk compounding the problem. People rush to reverse things, assuming that will take them back to where they were, and it rarely does. More than 70% of the time, it just makes a bigger mess.
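If your IT manager ever wants to retake a run at that 301, a mod_rewrite rule for the dispatcher URL might look something like the sketch below. This is untested and assumes Apache; the query string is often why a plain Redirect directive fails, since RewriteRule alone doesn't match it:

```apache
# Sketch only: 301 the dispatcher URL (including its query string) to the homepage.
RewriteEngine On
# Match the query string separately - RewriteRule patterns ignore it.
RewriteCond %{QUERY_STRING} ^iafAction=showMain$
# Leading slash is optional so this works in both server config and .htaccess.
# The trailing "?" drops the query string from the redirect target.
RewriteRule ^/?StoreFront/IAFDispatcher$ http://www.ccisolutions.com/? [R=301,L]
```

As he says, test it on dev first; a bad rule really can take the site down.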
-
Thanks Dr. Pete. Here's the scoop, and I'm happy to provide the actual URLs so you can have a real view of the source code, etc.
The meta refresh was on this URL:
it redirected to this URL:
http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain
We removed the meta refresh, and added <link rel="canonical" href="http://www.ccisolutions.com/" /> to the head of both URLs.
Our IT Manager couldn't get a 301 redirect to work from http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain to http://www.ccisolutions.com, but in another Q&A thread Streamline Metrics mentioned that this really shouldn't matter as long as the canonical tag is properly set up, which I think it is.
What do you think? (and thanks very much!)
-
I tend to agree that it could just be a short-term re-evaluation period, but I do understand that patience is hard to come by in these situations. I have one concern - I assume the META refresh was acting as some kind of redirect to a different URL? When you removed it, did you canonical the other URL somehow? Just removing the refresh wouldn't consolidate the link "juice" of the various URLs, so it could be that you went from one form of fragmentation to another, different form.
That's just speculation, since I don't fully understand the old/new setups. If you can provide some details with fictional URLs, we might be able to dig in deeper.
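For anyone following along, the pattern I mean by "META refresh acting as a redirect" looks roughly like this. The sketch is illustrative only (example URLs, not the actual site):

```html
<!-- Illustrative only: a meta refresh standing in for a redirect.
     Visitors bounce to the second URL after 0 seconds, but search
     engines don't consolidate link equity as reliably as with a 301. -->
<head>
  <meta http-equiv="refresh"
        content="0; url=http://www.example.com/StoreFront/Dispatcher?action=showMain" />
</head>
```

Removing that tag stops the bounce, but on its own it doesn't tell Google which of the two URLs is canonical.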
-
Yes Paul. I agree. I have seen wild fluctuations on other sites that went through big changes. I believe this is probably an example of a time when we have to hang in there and ride through "The Dip."
"Time, Patience and Intelligent Work" is my mantra....but I also have to convince my CEO that the $1,000 we just spent fixing the meta refresh was actually a good thing. Rankings sinking like this aren't helping me make my case.
If and when I hear anything from Google, I'll let you and Bryan know.
I'm sure we aren't the only ones who've fixed something technical that fixed a fragmented link profile. It sure would make me feel better to hear someone say, "Yes, a similar thing happened to me and now we're rocking it!" LOL - well, you can't blame a girl for dreaming!
-
I'll just add, Dana, that this major a change to the site will often cause massive ranking fluctuations as the crawlers work through the site and consolidate what's going on.
Small comfort, but a week really isn't long enough for things to have settled out to the "new normal". It's a good idea to keep looking for issues, but I'd also hold my breath for another week or two (or three) to see what happens as the dust settles. I know it goes against the grain to wait & see, but in this case I really think it's warranted.
Good luck, and keep breathing
Paul
-
Thanks Bryan. Yes, I took your advice and filed a reconsideration request just now. I spelled out exactly what happened with the whole meta refresh fix. This site has so many technical SEO problems that I am just hoping that it's not a completely different problem being caused by something else. I'll let you know what/if I hear anything.
I'd sure love to hear from any other SEOs out there who've ever been in similar situations!
Thanks again.
-
Like I said, it can be many factors. Perhaps making such drastic changes looks like a spam attack...
Total External Links: Before - 2,757 After - 4,311
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532
That more than doubled the link count. If you send Google a reconsideration request they will look at your issue and probably help you solve it. -
Thanks Bryan. Yes, I checked the link profile last night. Everything looks totally normal. Interestingly, nearly all of the added links in the new link total were internal, not external, so I don't think the quality of the links is the issue; maybe more so the quantity.
I don't think a reconsideration request would be appropriate in this instance because we have not been de-indexed. We are just being hit hard by the algo I think.
If that is the case, I would hope that over the next few weeks, as Google sees our internal links not changing so dramatically, things will settle down.
Any additional thoughts?
-
Perhaps consolidating the links ended up pointing a larger number, or a bigger ratio, of low-quality or non-relevant links at your site... Or maybe the anchor text profile is now over-optimized; the loss can be due to many reasons. I would recommend checking the new link profile and making sure everything looks natural. If all is well and you are still not ranking, you can send Google a reconsideration request explaining what happened.