Indexing issue?
-
Hey guys, when I do a site:thetechblock.com search in Google I don't see any recent posts (nothing for August). In Google Webmaster Tools the site appears to be getting crawled (I think), but I'm not sure.
I also see that the sitemaps are being indexed, but it still seems really odd that these posts aren't showing up in Google's results.
Everything looks fine in SEOmoz too. Is there something I'm not getting?
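One outside check that can help (a minimal sketch only: it assumes a flat sitemap at /sitemap.xml rather than a sitemap index, and uses the third-party requests library) is to pull the newest URLs from the sitemap and confirm each one is reachable and not carrying a noindex tag:

import re
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://thetechblock.com/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def recent_urls(sitemap_url, limit=10):
    # Return the most recently modified (lastmod, loc) pairs from the sitemap.
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    entries = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", default="", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", default="", namespaces=NS)
        entries.append((lastmod, loc))
    return sorted(entries, reverse=True)[:limit]

for lastmod, loc in recent_urls(SITEMAP_URL):
    resp = requests.get(loc, timeout=10)
    # Rough check for a robots meta tag containing noindex.
    noindex = bool(re.search(r"<meta[^>]+robots[^>]+noindex", resp.text, re.I))
    print(f"{loc}  lastmod={lastmod}  status={resp.status_code}  noindex={noindex}")

If the new posts come back 200 with no noindex, the pages themselves are fine and it is mostly a question of Google catching up.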
-
The site: command often seems to yield conflicting results compared with a "site:specific-url" query. You are likely going through a transition period on a recently indexed URL, so hold tight.
-
I've come across this issue as well. Remember, the www and non-www versions can show differences too, and it's not always logical why Google includes the bulk of pages under one rather than the other. Even on sites with www defined as the preferred version, I've seen the non-www version showing more pages in the index. This is one of the many reasons I was very glad to see more data on indexed pages coming through in Webmaster Tools.
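To confirm which hostname the server actually enforces (a quick sketch, using the asker's domain as the example and the third-party requests library), request the same path on both and compare where each ends up; if both answer 200 without redirecting to a single preferred host, Google can legitimately index either one:

import requests

PATH = "/"  # any representative page
for host in ("http://thetechblock.com", "http://www.thetechblock.com"):
    resp = requests.get(host + PATH, allow_redirects=True, timeout=10)
    hops = " -> ".join(str(r.status_code) for r in resp.history) or "no redirect"
    print(f"{host + PATH}: {hops} -> final {resp.url} ({resp.status_code})")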
-
Interesting! Thanks for your help guys.
-
I believe it is a Google issue; I tried the same thing with my site and got some weird results as well.
-
You are certainly being indexed. I just copied a block of text from a post only 23 minutes old, and Google returned your site as the top result (good job). I'm not sure the site: operator is the best way to check the crawl.
Related Questions
-
Redirect Issue in .htaccess
Hi, I'm stumped on this, so I'm hoping someone can help. I have a Wordpress site that I migrated to https about a year ago. Shortly after I added some code to my .htaccess file. My intention was to force https and www to all pages. I did see a moderate decline in rankings around the same time, so I feel the code may be wrong. Also, when I run the domain through Open Site Explorer all of the internal links are showing 301 redirects. The code I'm using is below. Thank you in advance for your help!
Intermediate & Advanced SEO | JohnWeb12

# Redirect HTTP to HTTPS
RewriteEngine On

# ensure www.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# ensure https
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

# USER IP BANNING
<Limit GET POST>
order allow,deny
deny from 213.238.175.29
deny from 66.249.69.54
allow from all
</Limit>

# Enable gzip compression
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

# Setting heading expires
<IfModule mod_expires.c>
ExpiresActive on
ExpiresDefault "access plus 1 month"
ExpiresByType application/javascript "access plus 1 year"
ExpiresByType image/x-ico "access plus 1 year"
ExpiresByType image/jpg "access plus 14 days"
ExpiresByType image/jpeg "access plus 14 days"
ExpiresByType image/gif "access plus 14 days"
ExpiresByType image/png "access plus 14 days"
ExpiresByType text/css "access plus 14 days"
</IfModule>
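One quick way to sanity-check rules like these (a rough sketch, not the asker's exact setup: the URLs are placeholders and it uses the third-party requests library) is to request a few internal URLs in their http and non-www forms and confirm each resolves in a single 301 hop to the https://www version, since chained redirects are a common side effect of stacking rules like this:

import requests

TEST_URLS = [
    "http://example.com/",                # placeholder: swap in real internal URLs
    "http://www.example.com/some-post/",
    "https://example.com/some-post/",
]

for url in TEST_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [f"{r.status_code} {r.url}" for r in resp.history]
    print(url)
    for hop in chain:
        print("   ->", hop)
    print("   final:", resp.status_code, resp.url)
    if len(chain) > 1:
        print("   warning: more than one redirect hop")

Also bear in mind that if internal links in the theme still point at http or non-www URLs, a tool like Open Site Explorer will report them as 301s even when the rules themselves are fine.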
Solving pagination issues for e-commerce
I would like to ask about a technical SEO issue that may cause duplicate content and crawling problems. For pagination, how should rel=canonical, rel="prev"/rel="next" and the noindex tag be implemented? Should all three appear in the same page source? For example, one category may have 10 pages of products (product catalogues). Should we noindex page 2 onwards, rel=canonical those pages back to the first page, and also add rel="prev" and rel="next" so Google understands they are part of a paginated series? If we index these multiple pages it could cause duplicate content issues, but I'm not sure whether all three tags need adding. It's also my understanding that internal search results pages should be noindexed, as they don't provide much value as an entry point from search engines.
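As a rough way to see what a paginated series currently outputs (a sketch only: the URL pattern is a placeholder and it assumes the third-party requests and beautifulsoup4 packages), a script like this lists the canonical, prev/next and robots values served on each page, which makes it easier to reason about which combination is actually in place:

import requests
from bs4 import BeautifulSoup

# Placeholder paginated category URLs, e.g. /category/?page=2
PAGES = [f"https://example.com/category/?page={n}" for n in range(1, 11)]

def head_tags(url):
    # Fetch a page and pull out the head tags relevant to pagination.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    def attr(tag):
        return (tag.get("href") or tag.get("content")) if tag else None
    return {
        "canonical": attr(soup.find("link", rel="canonical")),
        "prev": attr(soup.find("link", rel="prev")),
        "next": attr(soup.find("link", rel="next")),
        "robots": attr(soup.find("meta", attrs={"name": "robots"})),
    }

for url in PAGES:
    print(url, head_tags(url))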
Intermediate & Advanced SEO | Jseddon920
-
Replatforming possible issue with submitting URLs
We are replatforming an ecommerce site and will need to change 90% of the URLs. Many current URLs contain uppercase characters, and the new system forces all lowercase. We are concerned that submitting the URLs to Google all at once might look like spam, trigger some sort of penalty, or negatively affect organic search. In the last month this site received 428k unique visitors and 3.2 million page views, and it has about 10k URLs. They are a top-three competitor in their vertical. We are certainly planning to do all 301 redirects. What can we do additionally to reduce the risk of penalties here?
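Since most of the change is simple case-folding, the redirect map can be generated rather than written by hand. A minimal sketch (the URL list is a placeholder; in practice it would come from a crawl export or the current sitemap) that emits one 301 rule per path that actually changes:

from urllib.parse import urlsplit

# Placeholder: load the current URL list from a crawl export or the sitemap.
OLD_URLS = [
    "https://example.com/Category/Blue-Widget-42",
    "https://example.com/About-Us",
    "https://example.com/contact",
]

rules = []
for old in OLD_URLS:
    parts = urlsplit(old)
    new_path = parts.path.lower()
    if new_path != parts.path:
        # Each rule can be pasted into .htaccess or translated for the new platform.
        rules.append(f"Redirect 301 {parts.path} https://{parts.netloc}{new_path}")

print("\n".join(rules))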
Intermediate & Advanced SEO | RocketWeb0
-
Index or not index Categories
We are using the Yoast SEO plugin. The main menu contains only categories, which consist of posts plus one page. We have a category for villas, a category for villa hotels, and so on. Initially we set posts to be indexed and included in the sitemap and excluded the categories, but I guess that was not correct. Would it be better to index and include the categories in the sitemap and exclude the posts, in order to avoid duplication? It somehow doesn't make sense to me: if the posts are excluded and the categories included, won't the categories then look empty to Google? I'm going crazy over this. Does anybody have more experience with this?
Intermediate & Advanced SEO | Rebeca10
-
Dev Site Out of SERP But Still Indexed
One of our dev sites got indexed (the live site's robots.txt was moved to it; that has been corrected) 2-3 weeks ago. I immediately added it to our Webmaster Tools and used the Remove URL tool to get the whole thing out of the SERPs. A site:devurl search in Google now returns no results, but checking Index Status in WMT shows 2,889 pages of it still indexed. How can I get all instances of it completely removed from Google?
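URL removal only suppresses results temporarily; the index count normally falls once the pages consistently return a noindex signal (or an error status) and Google can recrawl them to see it. A small sketch (placeholder URLs, third-party requests library) to confirm the dev host is actually sending such a signal:

import requests

DEV_URLS = [
    "https://dev.example.com/",           # placeholder dev URLs
    "https://dev.example.com/some-page/",
]

for url in DEV_URLS:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "not set")
    # Rough check: is a noindex directive present anywhere in the returned HTML?
    has_meta_noindex = "noindex" in resp.text.lower()
    print(f"{url}: status={resp.status_code}, X-Robots-Tag={header}, "
          f"noindex in HTML={has_meta_noindex}")

Note that if the dev site is blocked in robots.txt, Google cannot recrawl the pages to see a noindex tag, so the reported count can linger.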
Intermediate & Advanced SEO | Kingof50
-
Correct Syntax for Meta No Index
Hi, is this syntax OK for the bots, especially Google, to pick up? I'm still waiting for Google to drop lots of duplicate pages caused by parameters out of the index. If there are parameters in the query string, the site inserts the code above into the head. Thanks!
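The snippet being asked about isn't reproduced here, but the standard form is <meta name="robots" content="noindex, follow"> placed inside <head>. As a purely hypothetical sketch of the conditional behaviour described (the asker's platform isn't stated; Flask is used as a stand-in), the logic would look something like:

from flask import Flask, request

app = Flask(__name__)

# Standard noindex tag; "follow" still lets link equity flow through the page.
NOINDEX_TAG = '<meta name="robots" content="noindex, follow">'

@app.route("/products")
def products():
    # Hypothetical: emit the noindex tag only when query-string parameters are present,
    # mirroring the behaviour described in the question.
    robots = NOINDEX_TAG if request.args else ""
    return f"<html><head>{robots}<title>Products</title></head><body>...</body></html>"

Whatever the platform, the tag has to land inside <head>, and the parameterised URLs must stay crawlable (not blocked in robots.txt) for Google to see it and drop them.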
Intermediate & Advanced SEO | bjs20101
-
Google Site Extended Listing Not Indexed
I am trying to get the new sitemap picked up by Google for the extended listing, since it is still pulling the old links and returning 404 errors. How can I get the site listing indexed quickly and have the extended listing updated to point to the right places? This is the site: http://epaperflip.com/Default.aspx. This is the search that shows the extended listing and some 404s: a broad-match search for "epaperflip".
Intermediate & Advanced SEO | Intergen0
-
Why are new pages not being indexed while old pages (now in robots.txt) remain in the index?
I currently have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week - I know Google has recrawled the site - and when I search for term X, it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to the robots file a few months ago, yet I think they are all still appearing in the index. Does anyone have any ideas about why this is happening, and how I can get my new pages indexed?
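Part of the answer is that robots.txt only blocks crawling, not indexing: disallowed pages can stay in the index because Google can no longer fetch them to see that anything has changed. A small stdlib-only sketch (placeholder domain and paths) to confirm which URLs the robots.txt actually blocks:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder domain
rp.read()

CHECK = [
    "https://example.com/old-page/",   # old URL you expect to be blocked
    "https://example.com/new-page/",   # new URL that should be crawlable
    "https://example.com/tags/foo/",   # tag page
]

for url in CHECK:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked by robots.txt'}")

If the goal is to get the old URLs out of the index entirely, a 301 to the new URL (or a noindex on a crawlable page) is the usual route; blocking them in robots.txt prevents Google from seeing either signal.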
Intermediate & Advanced SEO | corp08030