Google Indexing - what did I miss??
-
Hello, all SEOers~
I relaunched my website about 3 weeks ago, and in order to preserve as much SEO value as possible, I set up 301 redirects, an XML sitemap, and so on to minimize any losses.
The problem is that about a week after the relaunch, my team somehow removed all of the 301 redirects by mistake. So now my old URLs have all dropped out of Google's index, my new site is not getting indexed by Google, and my traffic and rankings are gone... OMG
I checked Google Webmaster Tools, but it didn't show any special message other than Googlebot finding an increase in 404 errors, which is to be expected.
I also used "Fetch as Googlebot" in Webmaster Tools to improve the chances of getting indexed, but it doesn't seem to be helping much.
I am re-adding the 301 redirects today, but I am not sure whether it will make any difference at this point.
Any advice or opinions?? Thanks in advance~!
-
Thanks for your kind advice.
I will follow up on your suggestions~ thanks
-
Hi there,
Complete your 301 redirects, but do it on a one-to-one basis: one old URL to one new URL. DO NOT redirect all your URLs to your home page. (After you set them up, verify that each one is indeed a 301 redirect and not another type of redirect, such as a 302.)
a) The most beneficial approach is to 301 redirect as much as possible in a way that mirrors your site structure: the old categories to the new categories, and so on. Don't worry, there is no limit on how many 301 redirects you can use; just don't chain them through intermediary redirects, like: old URL -> 301 -> intermediary URL -> 301 -> final active URL. Go directly from the old URL to the new, final, active URL in one step if possible.
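A structural, one-to-one redirect map like the one described above might look like this in an Apache .htaccess file (a sketch only; the paths and domain are placeholders, not the poster's real URLs):

```apache
# One-to-one 301 redirects: each old URL points at its final new URL in one hop
Redirect 301 /old-category/ http://www.example.com/new-category/
Redirect 301 /old-category/old-product.html http://www.example.com/new-category/new-product.html

# If the new structure mirrors the old one, mod_rewrite can map a whole
# section at once, still in a single step per URL:
RewriteEngine On
RewriteRule ^old-category/(.*)$ http://www.example.com/new-category/$1 [R=301,L]
```

You can confirm that each rule returns a 301 rather than a 302 by checking the response status line, e.g. with `curl -I http://www.example.com/old-category/`.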
b) Check whether there are old sitemaps in your Webmaster Tools account. If there are, delete the old ones and submit new ones that contain only the new URLs.
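A fresh sitemap for step b) only needs the `urlset` and `loc` elements of the sitemap protocol; this minimal sketch uses placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the new, final URLs - never the old, redirected ones -->
  <url>
    <loc>http://www.example.com/new-category/</loc>
  </url>
  <url>
    <loc>http://www.example.com/new-category/new-product.html</loc>
  </url>
</urlset>
```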
c) Do the same for your robots.txt file. (If you don't have a robots.txt file, create one and place it in the root of your domain, e.g. www.example.com/robots.txt.)
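A minimal robots.txt for step c) - assuming you want the whole new site crawlable - also lets you point crawlers at the new sitemap:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```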
d) If possible, use all of your "Fetch as Googlebot" requests and then submit those URLs for crawling, but spend them as much as possible on the main node pages of your website (e.g. main categories). Don't waste this function on final product pages; Googlebot will go link by link from the categories and re-discover all your URLs on its own.
e) Be patient; the PageRank and old traffic flow won't come back overnight. It can take up to 3 months for Googlebot to re-discover and re-index all the pages of your website (I know that's a long time, but it usually happens a lot sooner).
f) Keep a close eye on your Webmaster Tools account and make sure you solve any problems that appear in due time.
g) Scan your entire new website with a crawler to make sure you don't have broken links; it's important. If you find any broken links, fix them immediately.
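If you don't have a crawler handy for step g), the core of a broken-link scan can be sketched in standard-library Python (a simplified illustration: a real scan would crawl recursively and respect robots.txt, and example.com is a placeholder):

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every link found in the given HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.hrefs]

def link_status(url, timeout=10):
    """Return the HTTP status code for url, or None on a connection error."""
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code  # e.g. 404 for a broken link
    except urllib.error.URLError:
        return None

if __name__ == "__main__":
    page = '<a href="/about">About</a> <a href="contact.html">Contact</a>'
    for link in extract_links(page, "http://www.example.com/"):
        print(link)  # absolute URLs, ready to pass to link_status()
```

Feeding each fetched page through `extract_links` and checking every result with `link_status` flags anything that returns 404.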
I hope it helps.
Related Questions
-
Google is indexing bad URLs
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was installed; while no longer utilized, it remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page, and I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed:
1. Added a directive in my .htaccess to 404 all of these URLs
2. Blocked /wp-content/uploads/revslider/ in my robots.txt
3. Manually de-indexed each URL using the GSC tool
4. Deleted the plugin
However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
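Step 1 from the list above might look like this (a sketch, assuming Apache with mod_rewrite; the path comes from the URL pattern in the question):

```apache
# .htaccess: return 404 directly for the leftover Revolution Slider URLs
RewriteEngine On
# A non-3xx status in the R flag drops the substitution and stops rewriting,
# so matching requests get the status code instead of a redirect
RewriteRule ^wp-content/uploads/revslider/ - [R=404,L]
```

One thing worth checking: while the path stays blocked in robots.txt, Googlebot cannot fetch these URLs at all, so it never sees the 404 - which is a common reason blocked URLs linger in the index.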
Technical SEO | | Tom3_150 -
No Longer Indexed in Google (Says Redirected)
Just recently my page, http://www.waikoloavacationrentals.com/mauna-lani-terrace, stopped being indexed by Google. The sub-pages under it still are. I have not done anything sketchy with the page. When I fetch it as Google, it says that it is redirected. Any ideas what this is all about? Here is what the fetch returns: HTTP/1.1 301 Moved Permanently
Technical SEO | | RobDalton
Server: nginx
Date: Tue, 07 Mar 2017 00:43:26GMT
Content-Type: text/html
Content-Length: 178
Connection: keep-alive
Keep-Alive: timeout=20
Location: http://waikoloavacationrentals.com/mauna-lani-terrace
(followed by the standard nginx "301 Moved Permanently" HTML body)0 -
Sitemap indexation
3 days ago I submitted a new sitemap for a new platform. It contains 23,412 pages, but so far only 4 pages (!!) are indexed according to Webmaster Tools. Why so few? Our staging environment (more than 50K pages) got indexed in a few days by mistake.
Technical SEO | | Morten_Hjort0 -
HTTPS indexed... how?
Hello Moz, For a while I have been struggling with an SEO case: at the moment an https version of the homepage of one of our clients is indexed in Google. That's really strange, because the URL has been redirected to another URL for three weeks now, and we did everything to make clear to Google that it has to index the other URL.
Technical SEO | | Searchresult
So we have a few homepage URLs:
A https://www.website.nl
B https://www.websites.nl/category
C http://www.websites.nl/category
What we did:
- Redirected A with a 301 to B (a redirect from A or B straight to C is difficult because of the security issue with the SSL certificate)
- Put the right canonical URL (version C) on every version of the homepage (A, B)
- Put only the canonical URLs (only version C) in the sitemap.xml and uploaded it to Google Webmaster Tools
- Changed all important internal links to version C
- Also got some valuable external backlinks to version C
Is there something I missed, or forgot to say to Google: hey look, you've got the wrong URL indexed, you have to index version C? How is it possible that Google still prefers version A after doing all those changes three weeks ago? I'm really looking forward to your answer. Thanks a lot in advance! Greetz Djacko0 -
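For reference, the canonical tag described in the steps above, as placed on homepage versions A and B, would look like this (using version C's URL exactly as written in the question):

```html
<head>
  <!-- On homepage variants A and B, pointing at version C -->
  <link rel="canonical" href="http://www.websites.nl/category" />
</head>
```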
Lots of Pages Dropped Out of Google's Index?
Until yesterday, my website had about 1200 pages indexed in Google. I made lots of changes: removed low-quality content, rewrote passable content to make it better, wrote high-quality content, got lots of likes and shares on social networks, etc. Now this morning I see that of 1252 pages submitted, only 691 are indexed. Is this a temporary situation related to the recent updates? Is anyone else seeing this? How should I interpret it?
Technical SEO | | sbrault740 -
Google places address missing
I have a Google Places account that used to rank fairly well. Then I changed addresses and updated the Places account. It stopped ranking, and what's worse is that the address will not show up in the listing. I have gone back in and edited it, verified it, done everything, but the address does not show on the Places page or in Google results. It shows the city but not the actual address. Ideas?
Technical SEO | | webfeatseo0 -
Crawling and indexing content
If a page element (a div, e.g.) is initially hidden and shown only on hover or by a JavaScript call, will Google crawl and index its content?
Technical SEO | | Mont0 -
Dealing with indexable Ajax
Hello there, My site is basically an Ajax application. We assume lots of people link to deep pages on the site, but bots can't read past the hash marks, meaning all links appear to go to our home page. So, we have decided to make our Ajax indexable. And many questions remain. First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude, more to manage.
1. How do others deal with the differences here?
2. If we have indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made.
3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs?
So many more questions, but I'll stop there. Curious if anyone here has thoughts (or experience) on the matter. Erin
Technical SEO | | ErinTM2