We just fixed a meta refresh, unified our link profile, and now our rankings are going crazy
-
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening.

Here is what happened: We recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted toward one URL, and our internal links were counting for the other URL. In addition, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages.

Here is what happened to our link profile:

Total External Links: Before - 2,757 After - 4,311
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532

Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking in spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms at the bottom of page 1.

So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second shows where it is ranking today, and the third shows the change. For obvious reasons I haven't included the keywords.

11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22

Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. The further down a keyword was to begin with, the further it seems to have dropped.

What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!
-
Totally agree,
Have seen this a few times in the past.
Major SEO changes, big drop in rankings for 2-3 weeks. Then rankings gradually return.
@Dana: Keep us posted, I'm curious to see if in a few weeks' time things have improved.
-
Thanks Dr. Pete. I know this is pushing the boundaries of normal Q&A. I appreciate your answer. Yes, one thing at a time I think is a good way to go. I suggested that we try the mod_pagespeed rewrite on the dev site as a first step. I think it would probably be more efficient for us to hire a developer proficient in SEO to handle some of the more technical items. Thanks again!
-
Sorry, I'm not really clear on what the question is - these seem like general IT items unrelated to the SEO problem. The JS rewrites definitely can be tricky and depend completely on the code in question - I can't really provide a general resource.
Not sure how the alias domains tie in, but they definitely need to be part of any redirection scheme. I've used mod_rewrite for pretty large-scale stuff (as do many large sites), but it's possible to write bad/slow rules. It really depends on the scope. I'm not sure if you're talking about 100s or 1000s (10000s, etc.) of pages. Writing the rules for a big site is beyond the scope of any general Q&A. That's something your team is going to have to really dig deep into.
I feel like they might be over-thinking this one issue and trying to fix everything all at once, but I can't say that confidently without understanding the situation. I think it might be better to tackle these things one at a time.
-
Dr. Pete, our IT manager responded to my request. Can you point me in the right direction to research these things? (I am copying and pasting directly from his message):

"A few items that I noticed just skimming the forums that we will need to look at a little closer are:
- JavaScript that is self-referencing, as both the tab control and the slide show are self-referencing
- Alias domains, which we have a number of
- HTTPS pages, which for us is all pages, depending on when a person logs in."
I found info in the GW forum about the mod_pagespeed rewrite module and sent that to him.
He responded "We are currently using mod_rewrite to handle a number of things including 301 redirection. My experience with mod_rewrite does have me very cautious, because it is very easy to “blow up” the site. I would want to run this on the dev site for some time with a concerted testing effort to make sure we do not have issues."
Any references you can recommend would be great. Thank you so much!
-
It's just one of those things where you're always going to be wondering if the bloated code is causing problems, and it's going to drive you nuts. Fix it, and worst case, you'll rule out a cause. Some days, that's the best we can do.
-
Agreed. I worked at another company that had a 19-year-old kid split out the JS. I submitted the request. I'll let you know what happens. Thanks again!
-
I can't prove it would cause substantial improvement, but right now it's just in your way, and you'll never know. To me, that kind of clean-up is a no-brainer, because it's no risk. At worst, it cleans up the code, improves caching (and load times as you said), and makes updating easier. At best, you see measurable gains.
As a former developer and dev-team manager, I have to say, too, that it's not a tough fix to split out that JS. It would probably make the dev team's life easier down the road. If they're acting like it's a Herculean task, then either (1) they just don't want to do it, or (2) you need a better dev team.
-
Thanks Dr. Pete. The marketing team has been complaining for years about how far the meta tags, etc. are pushed down in our code. Unfortunately, there hasn't been enough evidence that this is doing us any harm, so it's never been a priority to fix. I believe moving those lines of JS to an external file would, if nothing else, improve our page speed, wouldn't it? If our pages load faster, it could impact our SEO in a positive way.
Thanks again very much for your suggestions
-
Yeah, the canonical should be ok - I just wanted to make sure you had something in place. One minor thing - I'd get that up on the page - with all the JS, the canonical is down on line 436 of the source code. You'd really be better off getting all that script into external files. It shouldn't make a big ranking difference, but it won't hurt.
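As a rough illustration of the clean-up being suggested (the external file name here is hypothetical), moving inline script out of the head lets the TITLE, META, and canonical tags sit near the top of the source instead of down around line 436:

```html
<!-- Before: hundreds of lines of inline JS push the canonical far down the source -->
<head>
  <script>
    /* slideshow and tab-control code inlined here */
  </script>
  <title>Acoustics, Sound, Lighting</title>
  <link rel="canonical" href="http://www.ccisolutions.com/" />
</head>

<!-- After: the same JS in an external, cacheable file (hypothetical name) -->
<head>
  <title>Acoustics, Sound, Lighting</title>
  <link rel="canonical" href="http://www.ccisolutions.com/" />
  <script src="/js/site.js"></script>
</head>
```

The external file also gets cached by the browser, which is where the load-time win comes from.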
You do have a dozen pages that share your home-page TITLE and META description. Some seem to be odd near-duplicates, while others probably just have duplicate meta data. Either way, I'd clean that up. Run this query in Google to see them:
site:ccisolutions.com intitle:"Acoustics, Sound, Lighting"
...or check Webmaster Tools (or your SEOmoz campaigns). Again, it probably isn't the culprit, but it's not helping.
I'd really dig to see if anything else is going on. The timing could just be coincidence. I find it really hard to believe that the META refresh change alone harmed you, unless this is just a temporary bounce while Google sorts it out. I definitely would NOT put it back - you risk compounding the problem. People rush to reverse things, assuming that will take them back to where they were, and it rarely does. More than 70% of the time, it just makes a bigger mess.
-
Thanks Dr. Pete. Here's the scoop, and I'm happy to provide the actual URLs so you can have a real view of the source code, etc.
The meta refresh was on this URL:
it redirected to this URL:
http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain
We removed the meta refresh and added <link rel="canonical" href="http://www.ccisolutions.com/" /> to the head of both URLs.
Our IT Manager couldn't get a 301 redirect to work from http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain to http://www.ccisolutions.com, but in another Q&A thread Streamline Metrics mentioned that this really shouldn't matter as long as the canonical tag is properly set up, which I think it is.
What do you think? (and thanks very much!)
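One common gotcha worth checking, sketched here under the assumption the site runs Apache with mod_rewrite: RewriteRule patterns never match the query string, so a rule targeting the ?iafAction=showMain URL silently fails unless the query string is matched in a RewriteCond. This is an untested sketch, not a drop-in fix:

```apache
RewriteEngine On
# RewriteRule alone never sees the query string - match it in a condition
RewriteCond %{QUERY_STRING} (^|&)iafAction=showMain($|&)
# "^/?" covers both server-config and .htaccess contexts;
# the trailing "?" on the target drops the old query string
RewriteRule ^/?StoreFront/IAFDispatcher$ http://www.ccisolutions.com/? [R=301,L]
```

If a rule like this works on a dev copy, a real 301 would consolidate the URLs more reliably than the canonical tag alone.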
-
I tend to agree that it could just be a short-term re-evaluation period, but I do understand that patience is hard to come by in these situations. I have one concern - I assume the META refresh was acting as some kind of redirect to a different URL? When you removed it, did you canonical the other URL somehow? Just removing the refresh wouldn't consolidate the link "juice" of the various URLs, so it could be that you went from one form of fragmentation to another, different form.
That's just speculation, since I don't fully understand the old/new setups. If you can provide some details with fictional URLs, we might be able to dig in deeper.
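For anyone following along, a META refresh "redirect" is just a tag in the head of the page, for example (URL hypothetical):

```html
<meta http-equiv="refresh" content="0; url=http://www.example.com/other-page">
```

Unlike a 301, simply removing that tag doesn't tell search engines the two URLs are the same page, which is why some form of canonicalization still matters after the fix.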
-
Yes Paul. I agree. I have seen wild fluctuations on other sites that went through big changes. I believe this is probably an example of a time when we have to hang in there and ride through "The Dip."
"Time, Patience and Intelligent Work" is my mantra....but I also have to convince my CEO that the $1,000 we just spent fixing the meta refresh was actually a good thing. Rankings sinking like this aren't helping me make my case.
If and when I hear anything from Google, I'll let you and Bryan know.
I'm sure we aren't the only ones who've fixed something technical that repaired a fragmented link profile. It sure would make me feel better to hear someone say, "Yes, a similar thing happened to me and now we're rocking it!" LOL - well, you can't blame a girl for dreaming!
-
I'll just add, Dana, that a change this major to the site will often cause massive ranking fluctuations as the crawlers work through the site and consolidate what's going on.
Small comfort, but a week really isn't long enough for things to have settled out to the "new normal". It's a good idea to keep looking for issues, but I'd also hold my breath for another week or two (or three) to see what happens as the dust settles. I know it goes against the grain to wait & see, but in this case I really think it's warranted.
Good luck, and keep breathing
Paul
-
Thanks Bryan. Yes, I took your advice and filed a reconsideration request just now. I spelled out exactly what happened with the whole meta refresh fix. This site has so many technical SEO problems that I am just hoping it's not a completely different issue being caused by something else. I'll let you know if I hear anything.
I'd sure love to hear from any other SEOs out there who've ever been in similar situations!
Thanks again.
-
Like I said, it can be many factors. Perhaps such a drastic change looks like a spam attack...
Total External Links: Before - 2,757 After - 4,311
Total Internal Links: Before - 125 After - 3,221
Total Links: Before - 2,882 After - 7,532
More than doubled the link count. If you send Google a reconsideration request, they will look at your issue and probably help you solve it. -
Thanks Bryan. Yes, I checked the link profile last night. Everything looks totally normal. Interestingly, nearly all of the links added to the new total were internal, not external, so I don't think the quality of the links is the issue - maybe more so the quantity.
I don't think a reconsideration request would be appropriate in this instance because we have not been de-indexed. We are just being hit hard by the algo I think.
If that is the case, I would hope that over the next few weeks, as Google sees our internal links not changing so dramatically, things will settle down.
Any additional thoughts?
-
Perhaps consolidating the links ended up pointing too many low-quality or non-relevant links (or a bigger ratio of them) at your site... Or maybe the anchor text profile is now over-optimized; the loss can be due to many reasons. I would recommend checking the new link profile and making sure everything looks natural. If all is well and you are still not ranking, you can send Google the reconsideration request explaining what happened.