Website is not indexed in Google, please help with suggestions
-
Our client's website was removed from the Google index. Can anybody recommend how to speed up the process of re-indexing?
- Webmaster tools done
- Social media done (Twitter, Facebook)
- sitemap.xml done
- backlinks in process
- PPC done
- Robots.txt is fine
Guys, any recommendations are welcome - the client is very unhappy.
Thank you
-
What do you mean you had no other choice? What forced you to add these tags to your client's site?! Because they were updating it?
...uhh... This thread is bizarre. Anyway, it sounds like all of it was a non-issue. Mark is absolutely correct about your real issue, though. Get some redirects in there ASAP.
-
Just an aside - you're going to have indexation issues - you have both www and non-www versions live on the site, with no canonicals pointing to one version. You also have index.php as a live page linked to from the logo. I'd definitely recommend implementing canonical tags across the site.
Mark
-
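To make Mark's suggestion concrete, here is a rough, hypothetical sketch - example.com is a placeholder, and it assumes an Apache server with mod_rewrite enabled - of 301 rules that consolidate the www/non-www split and the index.php duplicate:

```apache
# .htaccess sketch (hypothetical domain): force one hostname with 301s
RewriteEngine On

# Redirect non-www requests to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse direct requests for /index.php (e.g. from the logo link)
# into the root URL
RewriteCond %{THE_REQUEST} \s/index\.php[?\s] [NC]
RewriteRule ^index\.php$ / [R=301,L]
```

Each page's `<head>` would then also carry a self-referencing canonical, e.g. `<link rel="canonical" href="http://www.example.com/current-page/" />`, pointing at whichever hostname you pick.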
HaHa - I did not even check. Yeah, you are indexed... recently, though; no cache has been created for some of the pages yet, so probably within the last week?
The homepage was crawled on April 25th.
-
When I search _new homes developer st modwen_ in Google.com (no quote marks, and I'm in the UK), this page from your site is at No. 3 and your homepage is at No. 4.
When I search _st modwen homes_, you're at No. 1. I'm no expert, but that doesn't look like being de-indexed to me.
Or do you simply mean your rankings for the term _new homes developer_ have dropped?
Also, I don't understand this: _but we had no other option client was changing content on live site, so we had to noindex, nofollow._
-
Thank you, I have already done G+. Regarding noindex, nofollow: I completely agree, but we had no other choice. Thank you again.
-
Getting G+'s on fresh QUALITY content is one of the best ways to get quickly indexed by Google these days, in my opinion.
Just a suggestion, I would NEVER noindex an indexed site just because of content changes.
Make a clone, point the domain (using a vhost) temporarily to a subdirectory, make your changes, then re-point the domain in the vhost (or if you're on cPanel, just use a pointer). This way no one is the wiser to the changes, INCLUDING Google.
Or just make the changes - the ramifications of noindex are much longer lasting than a content change (unless the site was left in shambles for weeks).
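The clone-and-repoint trick above might look something like this hedged Apache sketch (the domain and paths are hypothetical): serve the untouched clone while the main document root is being edited, then switch back.

```apache
<VirtualHost *:80>
    ServerName www.example.com
    # Normal setting:
    #   DocumentRoot /var/www/example/live
    # During the edit window, serve the untouched clone instead, so
    # visitors (and Googlebot) never see a half-finished site:
    DocumentRoot /var/www/example/clone
</VirtualHost>
# Reload Apache after each switch, e.g.: apachectl graceful
```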
Related Questions
-
My product category pages are not being indexed on Google - can someone help?
My website has been indexed on Google and all of its pages can be found there except for the product category pages - which are where we want our traffic heading, so this is a big problem for us. Our website is www.skirtinguk.com, and an example of a page that isn't being indexed is https://www.skirtinguk.com/product-category/mdf-skirting-board/
Intermediate & Advanced SEO | chelseaskirtinguk
-
HTTP HTTPS Migration Gone Wrong - Please Help!
We have a large (25,000 products) ecommerce website, and we did an HTTP => HTTPS migration on 3/14/17, and our rankings went in the tank, but they are slowly coming back. We initially lost 80% of our organic traffic. We are currently down about 50%. Here are some of the issues - in retrospect, we may have been too aggressive in the move:
- We didn't post our old sitemaps on the new site until about 5 days into the move.
- We created a new HTTPS property in Search Console.
- Our redirects were 302, not 301.
- We also had some other redirect issues.
- We changed our URL taxonomy from http://www.oursite.com/category-name.html to https://www.oursite.com/category-name (removed the .html).
- We changed our filters plugin. Proper canonicals were used, but the filters can generate N! canonical pages. I added some parameters (and posted them to Search Console) and noindex for pages with multiple filter choices to cut down on our crawl budget yesterday.
Here are some observations:
- Google is crawling like crazy: since the move, 120,000+ pages per day. These are clearly the filtered pages, but they do have canonicals.
- Our old sitemaps got "Roboted Out" error messages. When we test the URLs in Google's robots.txt tester, they test fine. Very odd.
- At this point, in Search Console:
  a. the HTTPS property has 23,000 pages indexed
  b. the HTTP property has 7,800 pages indexed
  c. the crawl of our old category sitemap (852 categories) is still pending, and it was posted and submitted on Friday 3/17
- Our average daily organic traffic in Search Console before the move was +/- 5,800 clicks. The most recent Search Console data shows HTTP: 645 clicks; HTTPS: 2,000 clicks.
- Our rank tracker shows a massive drop over 2 days, bottoming out, and then some recovery over the next 3 days.
- The HTTP site is showing 500,000 backlinks; HTTPS is showing 23,000 backlinks.
I am planning on resubmitting the old sitemaps today in an attempt to remap our redirects to 301s. Is this typical? Any ideas?
Intermediate & Advanced SEO | GWMSEO
-
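An aside on the 302-vs-301 point in the question above: the permanent redirect is a small change at the server level. A hedged Apache sketch (the taxonomy rule is illustrative only, and mod_rewrite is assumed):

```apache
RewriteEngine On

# Force HTTPS with a permanent (301) redirect rather than a temporary 302
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Map old .html category URLs onto the new extensionless taxonomy
RewriteRule ^([a-z0-9-]+)\.html$ /$1 [R=301,L]
```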
Google does not favour PHP websites?
Hi there. An SEO company recently told me that Google does not favour PHP development? This seems rather sketchy - I have not read anywhere that Google doesn't favour PHP. Did I just miss that part of SEO, or are these guys blowing a little smoke?
Intermediate & Advanced SEO | ProsperoDigital
-
I have removed over 2,000+ pages but Google still says I have 3,000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided that he would make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering: how quickly, in your opinion, do you think all these changes will impact my SERP rankings? Looking forward to your responses!
Intermediate & Advanced SEO | apogeecorp
-
Huge Google index on E-commerce site
Hi Guys, Referring back to my original post, I would first like to thank you guys for all the advice. We implemented canonical URLs all over the site and noindexed some URLs with robots.txt, and the site already went from 100,000+ URLs indexed to 87,000 URLs indexed in GWT. My question: Is there a way to speed this up?
I do know about the way to remove URLs from the index (with a noindex or robots.txt condition), but this is a very intensive way to do so. I was hoping you guys maybe have a solution for this.. 🙂
Intermediate & Advanced SEO | ssiebn7
-
Our login pages are being indexed by Google - How do you remove them?
Each of our login pages shows up under a different subdomain of our website. Currently these are accessible by Google, which is a huge competitive advantage for our competitors looking for our client list. We've done a few things to try to rectify the problem:
- Noindex/noarchive on each login page
- Robots.txt on all subdomains to block search engines
- Gone into Webmaster Tools, added the subdomain of one of our bigger clients, and requested its removal from Google (this would be great to do for every subdomain, but we have a LOT of clients and it would require tons of backend work to make happen)
Other than the last option, is there something we can do that will remove the subdomains from search engines? We know the robots.txt is working, since the message on the search results says: "A description for this result is not available because of this site's robots.txt – learn more." But we'd like the whole link to disappear. Any suggestions?
Intermediate & Advanced SEO | desmond.liang
-
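One detail worth flagging in the question above: a robots.txt block stops Google from recrawling those login pages, so it never sees the noindex on them - which is why the bare links linger with the "no description" message. The usual alternative is to allow crawling and serve a noindex header instead. A hedged sketch for an Apache-served subdomain (mod_headers assumed):

```apache
# In each login subdomain's vhost or .htaccess: tell crawlers not to
# index (or follow) anything served here. The URLs must NOT also be
# blocked in robots.txt, or this header will never be fetched and the
# bare URLs will stay listed.
Header set X-Robots-Tag "noindex, nofollow"
```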
Does Google Index an Alert Div w/Delayed Hide
We have a div at the top of a client's page that displays an alert to the user. After 30 seconds it is rendered hidden. Does Google index this? Does Google take it into account when ranking the page?
Intermediate & Advanced SEO | WEOMedia
-
International IP redirection - help please!
Hi, We have a new client who has built a brand in the UK on a xyz.com domain. The "xyz.com" is now a brand and features on all marketing. Lots of SEO work has taken place, and the UK site has good rankings and traffic. They have now expanded to the US, and with offline marketing leading the way, xyz.com is the brand being pushed in the US. So with the launch of the offline marketing, US IPs are now redirected to a US version of the site (a subfolder) with relevant pricing and messaging. This is great for users, but with Googlebot being on a US IP it is also being redirected, and the UK pages have now dropped out of the index. The solution we need would ideally have both UK and US users searching for xyz.com, but would see them land on respective static pages with correct prices. Ideally no link authority would be moved via redirection of users. We have considered the following solutions:
- Move the UK site to a /uk subfolder and redirect UK IPs to it (and so not Googlebot). The downside is that this will massively impact the UK rankings, which are the core driver of the business. Also, would this be deemed illegal cloaking? Natural links will always be to the xyz.com page, so longer term the US homepage will gain authority and the UK homepage will be more reliant on artificial link building.
- Use an overlay that detects the IP address and asks users to select the relevant country (with cookies to redirect on a second visit). This has been rejected by the ecommerce team as it will increase bounce rate, and we don't want users to be able to see other countries due to product and price differences.
- Use a homepage with country selection (with cookies to redirect on a second visit). This has been rejected by the ecommerce team for the same reasons.
Is there an easy solution to this problem that we're overlooking? Is there another way of legal cloaking we could use here?
Many thanks in advance for any help here.
Intermediate & Advanced SEO | Red_Mud_Rookie
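For what it's worth on the question above, the standard way to expose both country versions to Google without IP-redirecting Googlebot is hreflang annotations. A hedged sketch (the xyz.com URLs and /us/ subfolder are placeholders taken from the question):

```html
<!-- On every page, in the <head>: each country version lists itself
     and its alternates, so Google can index both without any
     IP-based redirect of the crawler. -->
<link rel="alternate" hreflang="en-gb" href="https://xyz.com/" />
<link rel="alternate" hreflang="en-us" href="https://xyz.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://xyz.com/" />
```

Paired with serving each URL's own content to any visitor who requests it directly (no forced redirect), this lets the UK pages stay indexed while US searchers are shown the US versions in results.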