Why did I drop in ranking after setting up a permanent redirect, sitemap, and Google Places?
-
I have a site that was ranking in the top two for my search terms. We had a funky URL (it contained hyphens) and I was advised to change it for SEO, so I set up a permanent redirect through my web host (before that it was a temporary one, I think). At the same time I installed a sitemap plugin for WordPress and also registered for a Google Places account. I can't remember the exact order I did this in -- does it matter? Anyway, within a couple of days of doing the above, my ranking dropped to the bottom of the second page. I would like to fix this, but I'm not sure how. I need help, please!
-
The site is three months old and I did the 301 about two weeks ago.
We can rule out the EMD problem, since the old domain was just the name of the business. Plus, everything was fine until two weeks ago when I messed it up, and that would be too much of a coincidence.
Is there any issue with redirecting an HTML site to a PHP site?
-
I am not totally sure it makes a big difference. However, the Dr. House side of my brain still wants to focus on the 301s.
The only exception I can think of is if your previous domain tripped the EMD algorithm -- the filter that targets exact-match keyword domain names. I wonder whether Google devalued it, along with whatever backlinks were helping your website. I am so paranoid about Google's filters... I swear it's like they have rat traps everywhere, and when they go out to check one that's been sprung, they audit the site for field testing. You could just be unlucky. The keywords you are competing for would also be a big factor if Google randomly sampled and audited your site. Any thoughts?
Also, do you have a Webmaster Tools account? Google will report any URLs that are not found there. If crawl errors are showing up, that would support my point that the 301s are not written properly. If so, click all of them, mark them as fixed, and let Google recrawl.
*New websites without much authority tend to do the "Google dance." So if your site is less than 12 months old -- or even 2 years old -- without strong domain authority, it could take a while to rebound.
-
Hi Chad,
I already have an .htaccess file in place, and it contained everything except the "Options +FollowSymLinks" line, so I added that line in. Could that be it? Is there something I could try next?
Thanks so much!
-
Looking at this, I would definitely question the actual .htaccess rewrite. It is clearer to me now that your site lost backlinks or page flow. Domain rank is also an important metric of "authority" -- this is the stuff related to your website's actual content. Backlinks ("deep" or inner links) to various pages help bolster the home page and the domain authority score. So I believe the actual 301 redirect technique is in question.
I know you said you did a redirect in the HTML files directly. I have never learned or heard of that technique (an example of the markup/code would be great); however, if your website is on an Apache server, you would need to create an .htaccess file and use this code:
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} !^newdomain\.com [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
# For any other unique content you want to redirect manually:
Redirect 301 /yourdirectory http://www.newdomain.com/newdirectory
*This is the way to do it. You can always find a Google or YouTube tutorial that shows how to set up an .htaccess file on your server, or create it on your computer and upload it with FileZilla (or whatever FTP software you use). Note that FTP software hides .htaccess files by default, so you could create the file and then go nuts trying to locate it. FileZilla, I know, has a setting to show hidden files so you can transfer it to the root of your server.
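If you want to double-check that the redirect is actually answering with a 301 (and not a 302), here's a rough Python sketch. It fakes the redirect with a throwaway local server just so the example is runnable end to end; in practice you'd point the same http.client check at your real old domain (newdomain.com below is a placeholder):

```python
# Self-contained sketch: a local server answers with a 301 (mimicking what the
# .htaccess rules above should produce), and http.client verifies the response.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_DOMAIN = "http://www.newdomain.com"  # placeholder for your new domain

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Permanent redirect, preserving the requested path ($1 in the rewrite rule)
        self.send_response(301)
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging

def check_redirect(host, port, path="/"):
    """Fetch a URL without following redirects; return (status, Location)."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("GET", path)
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_redirect("127.0.0.1", server.server_port, "/some-page")
print(status, location)  # a permanent redirect shows 301 plus the new URL
server.shutdown()
```

Against your live site, you'd call check_redirect with your old domain and port 80 and expect the same 301 status with a Location pointing at the new domain.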
Let me know if this works!
-
Hi, thanks for your response. Yes, I did a 301 from the old domain to the new domain and the redesigned website. I had initially redirected the site using code in each of the HTML files, and that was fine -- my rank was climbing with the new site for a while -- before I did the 301 through the host. I went to Ahrefs and pasted the results below. I'm an SEO newb, so I'm not sure how to analyze them.
Old Site:
URL Rank: 0
Ahrefs Domain Rank: 0.17
Backlinks: 58
Referring domains: 1

New Site:
URL Rank: 7.9
Ahrefs Domain Rank: 0
Backlinks: 3
Referring domains: 1
-
Hi, quick question: did you 301 redirect to a new domain? I tried to read this, but I am just a tad unclear.
In theory, a 301 of everything (or of the exact anchor structure) should pass all your page juice; however, sometimes doing this will trip a filter that spot-checks for a link-building footprint that violates the Penguin algorithm.
I know this sounds like a stretch, but you might want to audit the link building you did initially. You can use Ahrefs (they have a free version) or, if you want to take it to the next level, Authority Labs. They have great tools and reporting to help you look at your backlinks.
If you have a ton of exact anchor matches and the same type of links on the same sites (site-wide links), this can serve as a signal that could have gotten some of the links devalued.
The last possibility is that your previous URL contained exact anchors for keywords you ranked for. I am assuming you were able to obtain a branded URL, and this could affect it too.
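For what it's worth, here's a rough Python sketch of the kind of anchor-text audit I mean. The CSV column names and the toy rows are assumptions -- adjust them to match whatever your Ahrefs export actually uses:

```python
# Spot an over-optimized anchor-text footprint from a backlink export.
import csv
from collections import Counter
from io import StringIO

def anchor_distribution(csv_text, anchor_col="Anchor"):
    """Return each anchor's share of total backlinks, most common first."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    counts = Counter(row[anchor_col].strip().lower() for row in rows)
    total = sum(counts.values())
    return [(anchor, count / total) for anchor, count in counts.most_common()]

# Toy data standing in for a real export (column names are assumed):
sample = """Anchor,Referring Page
popcorn machines,http://example.com/a
popcorn machines,http://example.com/b
popcorn machines,http://example.com/c
our homepage,http://example.com/d
"""

for anchor, share in anchor_distribution(sample):
    print(f"{anchor}: {share:.0%}")
```

An exact-match phrase sitting at 75% of your anchors would be exactly the kind of footprint worth diluting with branded and natural anchors.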
Reach back out to me and I can chat a little to help you home in on the real issue.
Related Questions
-
Sudden site drop on Google, not banned or penalised?
Hi all, we've been working on our site weldingmart.com for a while. Four weeks ago we saw a sudden drop in Google rankings, even for our own brand name. With no clear cause found, we decided to walk through all the technical fundamentals of SEO, so we did the following:
Technical SEO | jkossel
-> Set up Google Webmaster Tools: no issues found; a few 404s and a few 410s, sure, that's all OK.
-> Set up robots.txt to only index home pages, lister pages, content and product detail pages (disabled all filters and search queries). We also banned Russian and spammy bots for performance's sake.
-> Added sitemaps; around 14k pages seemed to be already indexed.
-> When searching for "site:weldingmart.com" I can find 14k pages.
-> We had a low 33/100 page speed score and improved it to 76/100.
So we did a lot of cleanup and improved a lot of items, but two weeks in we still have no ranking improvements. Before we went down we had around 100 clicks a day from Google; now 5 on average. By the way, I think a main issue is of course the low link count, but still, googling your own name should return us in the top 3, right? Is there something we are missing, or do we just need more time? I want to verify that we are not missing anything!
-
What are the best options for a website with JavaScript drop-down navigation menus to get those menus indexed by Google?
This concerns f5.com, a large website with navigation menus that drop down when hovered over. The sub-nav items (for example, "DDoS Protection") are not cached by Google and therefore do not distribute internal links properly to help those sub-pages rank well. The best option, naturally, is to change the nav menus from JS to CSS, but barring that, is there another option? Would Schema SiteNavigationElement work as an alternative?
Technical SEO | CarlLarson
-
Technical SEO | | CarlLarson0 -
Why is Google Webmaster Tools ignoring my URL parameter settings?
I have set up several URL parameters in Webmaster Tools that do things like select a specific product's colour or size. I have set each parameter in Google to "narrows" the page and selected "crawl no URLs", but in the duplicate content section each of these is still shown as two pages with the same content. Is this just normal (i.e., showing me that they are the same anyway), or is Google deliberately ignoring my settings (which I assume it does when they are sure they know better or think I have made a mistake)?
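One way around Google ignoring the settings is to handle it on your own side: collapse the parameter variants to one canonical URL and point a rel="canonical" tag at it. A rough Python sketch, with the parameter names assumed rather than taken from the poster's site:

```python
# Strip the "narrowing" parameters so variant URLs collapse to one canonical
# URL -- the URL a rel="canonical" tag should point at.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

NARROWING_PARAMS = {"colour", "size"}  # assumed parameter names

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in NARROWING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonical_url("http://example.com/shirt?colour=red&size=m&page=2"))
# -> http://example.com/shirt?page=2
```

Parameters that genuinely change the content (like pagination) are kept; the colour/size variants all map to the same canonical address regardless of what Google does with the Webmaster Tools settings.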
Technical SEO | mark_baird
-
Sitemaps
Hi, I have a doubt about using sitemaps. Mine is a news site and we have thousands of articles in every section. For example, we have a section called Technology, with articles going back to 1999! So the question is: how can I make the Google robot index them? Months ago, when you entered the Technology section, we had a paginator without limits, but we noticed that this query consumed a lot of CPU every time a user clicked. So we decided to limit it to 10 pages of 15 records each. Now it works great, BUT I can see in Google Webmaster Tools that our index count decreased dramatically. The reason is simple: the bot has no way to reach older technology news articles, because we limit the query to 150 records total. Well, the question is: how can I fix this? Options: 1) leave the query without limits; 2) create a new "all tech news" button with a different, unlimited query, paginated with (for example) 200 records per page; 3) create a sitemap that contains all the tech articles. Any ideas? Really, thanks.
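Option 3 can be sketched roughly like this in Python. The URLs are placeholders, and note that real sitemap files are capped at 50,000 URLs each, after which you split them into multiple files listed in a sitemap index:

```python
# Generate a sitemap for the article archive so the bot can reach old articles
# that the 150-record pagination no longer exposes.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML listing the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Placeholder article URLs standing in for the real archive query:
articles = [f"http://example.com/technology/article-{i}" for i in range(1, 4)]
print(build_sitemap(articles))
```

Since the sitemap is built from a database query rather than a user-facing page, it avoids the CPU problem the old unlimited paginator caused.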
Technical SEO | informatica810
-
Sudden Drop in Keyword Rankings
We launched http://www.manufacturedfun.com/ earlier this year and had been ranking 1 and 2 on Google SERPs for the keywords we optimized for, but last week we experienced a sudden drop in rankings that pretty much took us off the radar. For instance, 'popcorn machines' went as far back as page 5, and 'popcorn poppers' dropped even further, to page 9. We are currently working on fixing the numerous Duplicate Page Title and Duplicate Page Content errors identified by SEOmoz, but since we have had those for about 6 months and ranked well anyway, I wonder if there's something else we are missing. Any insight you can offer will be sincerely appreciated. Thank you!
Technical SEO | GRIP-SEO
-
Ranking last, Google says no flags
I have a site, thatsit.net.au, that I have never really promoted, but it used to rank OK. The other day I decided I should spend some time on it, as it's my own site and potential customers expect to see my own site rank well. I did a bit of checking, and in Bing I come up in the first 4 for "web development Perth" (that is, in the .au TLD for Australia), but in Google I am around 700+. If I take this line out of my home page and put it in quotes -- "a local Perth web development company" -- I see I am the only site in the world to have this long-tail query, in both Bing and Google. If I search for the same without quotes in Bing, I come up first. If I do so in Google, I come up absolutely last. I thought I must be flagged, but Google has replied that there is no manual action taken on my site. I have no answer for this. It is hard to believe that if I am the only site in the world to have a long-tail term, I would come last for it. Any ideas? I hope Matt Cutts reads this and can come up with an explanation besides "write good and useful content."
Technical SEO | AlanMosley
-
Severe rank drop due to overwritten robots.txt
Hi, last week we made a change to Drupal core for an update to our website. We accidentally overwrote our good robots.txt, which blocked hundreds of pages, with the default Drupal robots.txt. Several hours after that happened (we didn't catch the mistake), our rankings dropped from mostly first or second place in Google organic to the bottom and middle of the first page. Basically, I believe we flooded the index with very low-quality pages at once, threw a red flag, and got de-ranked. We have since fixed the robots.txt and have been re-crawled, but have not seen a return in rank. Would this be a safe assumption of what happened? I haven't seen any other sites in the retail vertical getting hit with regard to any Panda 2.3-type update. Will we see a return in our results anytime soon? Thanks, Justin
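One cheap safeguard against this happening again is a script that feeds the live robots.txt to Python's urllib.robotparser and asserts that the paths you expect to be blocked still are. A minimal sketch with made-up rules (not the poster's actual file):

```python
# Regression check for robots.txt: parse it and verify the intended
# Disallow rules still apply after a deploy.
from urllib.robotparser import RobotFileParser

# Made-up rules standing in for the real file (fetch yours over HTTP in practice):
robots_txt = """User-agent: *
Disallow: /search
Disallow: /filter/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

must_be_blocked = ["/search?q=widgets", "/filter/color=red"]
must_be_allowed = ["/", "/products/widget-1"]

for path in must_be_blocked:
    print(path, "blocked:", not parser.can_fetch("*", path))
for path in must_be_allowed:
    print(path, "allowed:", parser.can_fetch("*", path))
```

Run as part of the deploy, a failed check here would have caught the Drupal default file hours before Google recrawled it.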
Technical SEO | BrettKrasnove
-
Redirect Flash Site for Google Only - Is this against TOS?
A photographer client has a Flash website, purchased from a (well-respected) template company. The main site is at the root domain, and the HTML version is at www.example.com/?load=html. If I visit the site in a browser without Flash installed, I am redirected automatically to the HTML version. I'm concerned, as the site has some great links and the HTML version is well optimised, but it doesn't appear anywhere in Google for our chosen keywords (it ranks perfectly for brand-related searches). Google is indexing the Flash version of the site, but I would rather it didn't (there's no real content, just JavaScript to load the SWF, and all of the pages load under one URL). How can I block the Flash version from Google but still make the incoming links count towards the HTML version of the site? If I redirect Google to the HTML version, is this cloaking, and is it frowned upon? Thanks for any advice you can offer.
Technical SEO | cmaddison