Problem with indexing
-
Hello, we've recently changed our CMS. Everything seems to work well, but for some reason Google and other crawlers can't see or index any pages other than the main page. There are no restrictions in robots.txt, nor any other visible issue. Please help if you can.
Website: http://www.design-glassware.com/
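A quick way to rule robots.txt in or out is Python's built-in parser. A minimal sketch; the Disallow rule here is invented just to show the check, not taken from the actual site:

```python
# Check whether a robots.txt would block a crawler from given URLs.
# The rules below are made-up examples, not the real design-glassware.com file.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",  # hypothetical blocked area
])

# True: product pages are allowed; False: the disallowed area is blocked.
print(rp.can_fetch("Googlebot", "http://www.design-glassware.com/products/"))
print(rp.can_fetch("Googlebot", "http://www.design-glassware.com/admin/login"))
```

In practice you would point `rp.set_url()` at the live robots.txt and call `rp.read()` instead of feeding it lines by hand.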
-
I just want to clarify what I was saying above. In my case, even though I put brand spanking new posts on my blog, they weren't getting indexed. I had about 6 duplicate posts and 4 new ones, and none of them were getting indexed. But when I dealt with the duplicate content issues for those 6 (i.e. got rid of the original copies and redirected the old site to the new one), all of my content suddenly got indexed. Just sayin'.
-
Thank you. I'm sure it's not a duplicate content problem. For example, even when I tried to use a new URL as the destination in AdWords, it wasn't accepted as working.
I think it's more of a technical issue than an SEO one, so this was probably the wrong place to ask.
Thank you again everyone for helping me out.
-
In my case above, I had new pages as well that weren't being indexed, but they did get indexed once I dealt with the duplicate content. Perhaps Google put some sort of block on my whole blog because of the duplicates? If your content is duplicated anywhere, you should check that.
Also, do you have Webmaster Tools set up? It will tell you a lot about what is happening with your domain. Have you submitted a sitemap? That will help as well.
It may also help to build a few external links to some of the new pages. This will help Google find them.
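For what it's worth, a sitemap for the new URLs can be generated in a few lines and then submitted in Webmaster Tools. A minimal sketch using only Python's standard library; the URLs are made-up examples, not the site's real URL list:

```python
# Build a minimal sitemap.xml (per the sitemaps.org protocol) from a URL list.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML as a string with one <url><loc>...</loc></url> per URL."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example URLs for illustration.
sitemap = build_sitemap([
    "http://www.design-glassware.com/",
    "http://www.design-glassware.com/products/",
])
print(sitemap)
```

A real sitemap would list every indexable URL and could also carry optional `<lastmod>` elements, but `<loc>` alone is enough for submission.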
-
You may have to resubmit your site through Webmaster Tools AFTER fixing the aforementioned issues.
-
Good catch, bootleg. I think if you get the duplicate content and the 301s working correctly, Google will love you a little more.
-
Yes, I tried Xenu (right after you asked me), and it does show broken links.
-
There are some new pages we've made, and still no indexing. Google visits our site quite often, and it's been almost a month since we changed the URLs.
-
Thank you for the heads-up. We know about the duplicate content problem; it's just that our webmaster is somewhat slow.
The URLs are new, and we will set up 301 redirects and rel="canonical" tags.
-
How long has it been since you switched to the new CMS? It can sometimes take weeks to get a site crawled and indexed.
I'll share something that happened to me recently that may be applicable. I started a new site and wrote a few blog posts. Then I changed my mind and decided those posts would be better suited to another site I had, so I added them there. But they wouldn't index! I realized that I hadn't deleted the content from the first site, so when Google came across the new blog, I'm assuming it saw all the content as duplicate and didn't index it. I immediately took the content off the original domain, 301'd the original domain to the new one, and within hours all of my posts were indexed.
-
There's nothing wrong with the links or with your robots.txt. But ...
You obviously have a huge duplicate content problem. Just look at the search results a site: query returns: there are a lot of old links (from the old CMS, I suppose). Click through them and you always end up with the main page's content, but the URL stays the same, i.e. there is no redirect to the correct page with the new URL, nor to the homepage.
You should go through your old URLs and redirect each one to its new URL with an HTTP 301 redirect.
As a quick fix, add a rel="canonical" tag to your new pages. That way Google will index the correct page (and only the correct page) on its next crawl. Setting up the redirects should have happened before switching the CMS ...
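To illustrate the 301 step: if the old and new URLs can be listed side by side, the redirect directives can be generated rather than written by hand. A minimal sketch assuming an Apache server; the paths are invented examples, and the real pairs would come from the site's own URL inventory:

```python
# Generate Apache-style "Redirect 301" directives from an old->new URL mapping.
def redirect_rules(url_map):
    """Return one 'Redirect 301 /old /new' line per old URL."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in url_map.items()
    )

# Hypothetical old-CMS paths mapped to their new equivalents.
rules = redirect_rules({
    "/old/wine-glasses.html": "/products/wine-glasses/",
    "/old/vases.html": "/products/vases/",
})
print(rules)
```

The quick-fix canonical tag mentioned above is simply a `<link rel="canonical" href="...">` element in the `<head>` of each new page, pointing at that page's own preferred URL.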
-
Have you tried crawling it with Xenu?
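For anyone curious, this is roughly what a crawler like Xenu does per page, in miniature: collect every href, then request each one and check for 404s or broken redirects. A small stdlib-only sketch with a made-up HTML snippet:

```python
# Extract all link targets from a page, the first step of a broken-link check.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag encountered while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page fragment for illustration.
page = '<a href="/products/">Products</a> <a href="/old-page.html">Old</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)
```

A full checker would then fetch each collected link and flag non-200 responses.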
-
Are you using the exact same URLs with the new CMS as with the old CMS? If not, have you set up 301s?