Crawl at a standstill
-
Hello Moz'ers,
More questions about my Shopify migration... it seems that I'm not getting indexed very quickly (it's been over a month since I completed the migration). I have done the following:
- used an SEO app to find and complete redirects (right away)
- used the same app to straighten out title tags, meta descriptions, and alt tags
- submitted the sitemap
- re-submitted my main product URLs via Fetch as Google
- checked the Console - no reported blocks or crawl errors
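The redirect step above can be sketched as a bulk mapping. Here's a minimal Python sketch assuming Shopify's two-column "Redirect from"/"Redirect to" CSV import format; the paths themselves are hypothetical:

```python
import csv
import io

# Hypothetical mapping of old (pre-migration) paths to new Shopify paths.
redirect_map = {
    "/old-store/necklaces": "/collections/necklaces",
    "/old-store/rings": "/collections/rings",
    "/blog/feed": "/blogs/news",
}

def build_redirect_csv(mapping):
    """Render a redirect mapping as a Shopify-style URL-redirect CSV."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Redirect from", "Redirect to"])
    for old_path, new_path in sorted(mapping.items()):
        writer.writerow([old_path, new_path])
    return buf.getvalue()

print(build_redirect_csv(redirect_map))
```

A file like this can be bulk-imported in the Shopify admin's URL redirects screen; check Shopify's current import template before relying on the exact header names.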
I will mention that I had to move my blog to a subdomain because Shopify's blog platform is awful. I had a lot of 404s on the blog, but fixed those. The blog was not a big source of traffic (I'm an ecommerce business). Also, I didn't have a lot of backlinks, and most of those came along anyway.
I did have a number of 8XX and 9XX errors, but I spoke to Shopify about them and they found no issues. In the meantime, those issues have pretty much disappeared in the Moz reporting.
Any duplicate-page issues now return a 200 code since I straightened out the title tags.
So what am I missing here?
Thanks in advance,
Sharon
www.zeldassong.com
-
Hi Dan,
Thank you so very much!! I think things have caught up...I chatted with Google a few days ago and they said everything is ok. I am starting to see some keywords surface; it probably will kick in shortly, as I did quite a bit of SEO work with the new site.
I very much appreciate your help!
Best,
Sharon
-
Hi Sharon
Is there specific content or pages you're not seeing indexed that should be?
I checked with a site: search and see that about 282 pages are indexed right now. I crawled the site and got about 578 active URLs; subtracting /wp-content/ URLs and subpages leaves about 380 URLs, which is only about 100 more than the number indexed in Google.
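The subtraction described above can be sketched programmatically. A minimal Python sketch, with hypothetical URL patterns standing in for the real crawl export:

```python
# Hypothetical crawl export: active URLs returned by a site crawler.
crawled_urls = [
    "https://www.example.com/",
    "https://www.example.com/collections/rings",
    "https://www.example.com/collections/rings?page=2",    # paginated subpage
    "https://www.example.com/wp-content/uploads/logo.png", # asset, not indexable
    "https://www.example.com/products/gold-band",
]

def indexable_urls(urls):
    """Drop /wp-content/ assets and paginated subpages before comparing
    the crawl count against the number of pages Google reports indexed."""
    keep = []
    for url in urls:
        if "/wp-content/" in url:
            continue
        if "page=" in url or "/page/" in url:
            continue
        keep.append(url)
    return keep

print(len(indexable_urls(crawled_urls)))  # 3
```

If the filtered crawl count and Google's indexed count are in the same ballpark, indexation has probably caught up.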
Perhaps things have caught up; let me know if there's a URL you expect to be indexed that isn't.
Thanks!
-Dan
-
Hi Nicolas,
Yes, I did. I chatted with Google twice on Saturday, and they assured me that the site is being crawled - so, I dunno. I will wait a week or so and recheck the console and MOZ stats.
Thank you for taking the time to respond.
Sharon
-
Hello,
Have you submitted a new sitemap to help Google find the new URLs?
Related Questions
-
Crawl Issues / Partial Fetch Via Google
We recently launched a new site that doesn't have any ads, but in Webmaster Tools, under "Fetch as Google", the rendering of the page shows: "Googlebot couldn't get all resources for this page." Here's the list:
- https://static.doubleclick.net/instream/ad_status.js (Script, Blocked by robots.txt, Low severity)
- https://googleads.g.doubleclick.net/pagead/id (AJAX, Blocked by robots.txt, Low severity)
Not sure where that would be coming from, as we don't have any ads running on our site. Also, it states that the fetch is a "partial" fetch. Any insight is appreciated.
Technical SEO | vikasnwu | 0
-
Duplicate Page Titles Issue in Campaign Crawl Error Report
Hello all! Looking at my campaign, I noticed that I have a large number of 'duplicate page titles' showing up, but all of them are just paginated versions of the same page, such as http://thelemonbowl.com/tag/chocolate/page/2 being flagged as a duplicate of http://thelemonbowl.com/tag/chocolate. Any suggestions on how to address this? Thanks!
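A quick way to confirm these are pagination duplicates rather than genuinely conflicting titles is to group URLs by their base path, stripping the trailing /page/N segment. A minimal Python sketch (the usual fix is then a canonical tag or noindex on pages 2+):

```python
import re
from collections import defaultdict

urls = [
    "http://thelemonbowl.com/tag/chocolate",
    "http://thelemonbowl.com/tag/chocolate/page/2",
    "http://thelemonbowl.com/tag/chocolate/page/3",
    "http://thelemonbowl.com/tag/vanilla",
]

def pagination_groups(url_list):
    """Group URLs that differ only by a trailing /page/N segment."""
    groups = defaultdict(list)
    for url in url_list:
        base = re.sub(r"/page/\d+/?$", "", url)
        groups[base].append(url)
    # Keep only bases with more than one variant, i.e. actual duplicates.
    return {base: members for base, members in groups.items() if len(members) > 1}

print(pagination_groups(urls))
```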
Technical SEO | Rich-DC | 0
-
Added 301 redirects, pages still earning duplicate content warning
We recently added a number of 301 redirects for duplicate content pages, but even with this addition they are still showing up as duplicate content. Am I missing something here? Or is this a duplicate content warning I should ignore?
Technical SEO | cglife | 0
-
Rel canonical errors after SEOmoz crawling
Hi all, I cannot find the errors on my web pages with the rel canonical tag. I have too many errors (over 500) after SEOmoz crawled my domain, and I don't know how to fix them. I'll share the URL of my root page: http://www.vour.gr. The rel canonical tag for this page is: <link rel="canonical" href="http://www.vour.gr"/>. Can anyone help me understand why I get an error for this page? Many thanks.
Technical SEO | edreamis | 0
-
Alternatives to SEOmoz's Crawl Diagnostics
I really like SEOmoz's Crawl Diagnostics reports; they go through the pages and find all sorts of valuable information. I wanted to know if there are any other services that compete with this specific service, to test the accuracy of their crawl diagnostics. Thanks
Technical SEO | BestOdds | 0
-
How to create a delayed 301 redirect that still passes juice?
My company is merging one of our sites into another site. At first I was just going to create a 301 redirect from domainA.com to domainB.com but we decided that would be too confusing for customers expecting to see domainA.com so we want to create a page that says something like "We've moved. please visit domainB.com or be redirected after 10 seconds". My question is, how do I create a redirect that has a delay and will this still pass the same amount of juice that a regular 301 redirect would? I've heard that meta refreshes are considered spammy by Google.
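A delayed redirect like the one described is usually built with a meta refresh. Here's a minimal sketch, generated in Python purely for illustration (domainB.com as in the question):

```python
def delayed_redirect_page(new_url, delay_seconds=10):
    """Build a minimal interstitial page that announces the move and then
    meta-refreshes to the new domain after a delay."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <meta http-equiv="refresh" content="{delay_seconds}; url={new_url}">
  <link rel="canonical" href="{new_url}">
  <title>We've moved</title>
</head>
<body>
  <p>We've moved! Please visit <a href="{new_url}">{new_url}</a>,
  or you will be redirected in {delay_seconds} seconds.</p>
</body>
</html>"""

page = delayed_redirect_page("https://domainB.com/", delay_seconds=10)
```

Google documents meta refresh as a type of redirect it can follow, but whether a delayed one passes as much equity as a 301 isn't guaranteed; a server-side 301 remains the safer choice where it's an option.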
Technical SEO | bewoldt | 0
-
Problem with Crawling
Hello, I have a website, http://digitaldiscovery.eu, here in SEOmoz. It's strange: since last week SEOmoz has been crawling only one page, and before it was crawling all the pages. What's happening? Help, SEOmoz! :))
Technical SEO | PedroM | 0
-
Should we use Google's crawl delay setting?
We’ve been noticing a huge uptick in Google’s spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Googlebot requests vs. organic page requests). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic traffic was between 6:1 and 10:1. Is requesting a crawl-delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
Technical SEO | lzhao | 0