Do uncrawled but indexed pages affect SEO?
-
It's a well-known fact that too much thin content can hurt your SEO, but what about when you disallow Google from crawling some areas and it indexes some of them anyway (no title, no description, just the link)?
I am building a Shopify store and it's impossible to change the robots.txt on Shopify, and they disallow, for example, the cart:
Disallow: /cart
But all my pages link there, so Google has the uncrawled cart in its index, along with many other uncrawled URLs. Can this hurt my SEO, or is trying to remove them from the index just a waste of time?
- I can't change anything in the robots.txt
- I could try to nofollow those internal links
What do you think?
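As an aside, you can confirm exactly which URLs a given robots.txt blocks with Python's standard-library parser. This is a minimal sketch, where the rules string mirrors the Disallow line quoted above (the store domain and extra paths are hypothetical):

```python
from urllib import robotparser

# Rules mirroring the relevant part of the store's robots.txt
# (the /checkout line is a hypothetical addition for illustration)
rules = """User-agent: *
Disallow: /cart
Disallow: /checkout
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under "User-agent: *" here, so /cart is blocked
print(parser.can_fetch("Googlebot", "https://example-store.com/cart"))            # False
print(parser.can_fetch("Googlebot", "https://example-store.com/products/widget")) # True
```

Note that `can_fetch` only tells you whether the URL may be crawled; as the answer below explains, Google can still index a blocked URL it discovers through links.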
-
Hi there!
Yes, pages with a poor user experience do affect SEO, even more so if those pages receive traffic from Google.
So there are two scenarios here, depending on whether they receive traffic:
1. They do receive traffic: either move to another CMS (like WordPress with WooCommerce) or improve the experience on those pages.
2. They do not receive traffic: leave them as they are.
All of this assumes that you are not able to change robots.txt and do not want to move to another CMS.
Adding nofollow to those links will not make any difference, because Google does honor robots.txt, so pages blocked there will never be crawled. To get them removed from the index, it is crucial that they be allowed in robots.txt (or, what amounts to the same thing, not disallowed).
Hope it helps.
Best of luck.
GR
Related Questions
-
Is it bad for SEO to have a page that is not linked to anywhere on your site?
Hi, We had a content manager request to delete a page from our site. Looking at the traffic to the page, I noticed there were a lot of inbound links from credible sites. Rather than deleting the page, we simply removed it from the navigation, so that a user could still access the page by clicking on a link to it from an external site. Questions: Is it bad for SEO to have a page that is not directly accessible from your site? If no: do we keep this page in our Sitemap, or remove it? If yes: what is a better strategy to ensure the inbound links aren't considered "broken links" and also to minimize any negative impact to our SEO? Should we delete the page and 301 redirect users to the parent page for the page we had previously hidden?
Intermediate & Advanced SEO | jnew9290
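If the page were eventually deleted, the 301 redirect to the parent page mentioned in the question could be expressed in Apache config along these lines (both paths are hypothetical, for illustration only):

```apache
# Permanently redirect the hidden/deleted page to its parent,
# so inbound links from external sites keep resolving
Redirect 301 /resources/old-guide /resources/
```

A 301 passes most link equity to the target, which is why it is usually preferred over letting credible inbound links hit a 404.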
Do internal links from non-indexed pages matter?
Hi everybody! Here's my question. After a site migration, a client has seen a big drop in rankings. We're trying to narrow down the issue. It seems that they have lost around 15,000 links following the switch, but these came from pages that were blocked in the robots.txt file. I was wondering if there is any research that has been done on the impact of internal links from non-indexed pages. Would be great to hear your thoughts! Sam
Intermediate & Advanced SEO | Blink-SEO0
Page Speed Factors For SEO
Hey guys, I have developed a page and optimised it, but I've got a dilemma: there are 2 variants of the optimised page I could use. The page is responsive and uses Bootstrap from an external CDN. The 2 variants: External CDN - this adds an extra request and delivers the entire framework (not ideal for mobile). I've looked into the node/grunt.js route (+unCSS) to remove redundant CSS, which led me to my next variant. Inline CSS - after doing some grunt.js work, I shaved the redundant code out of the framework and added the rest inline. I will also point out that all assets are optimised, and all CSS/JS/HTML is minified. In terms of score, the 1st variant scores lower than the second, but I believe that most internet users already have Bootstrap cached because it is so common. The ultimate question comes down to ranking; I'm not entirely sure where I draw the line between development and SEO (I will also ask on Stack Overflow). Which one would rank better, all other factors being equal?
Intermediate & Advanced SEO | AkashMakwana0
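The two variants in the question above boil down to markup like this (the CDN URL and the inlined rules are hypothetical placeholders):

```html
<!-- Variant 1: full framework from an external CDN (one extra request,
     but possibly already in the visitor's browser cache) -->
<link rel="stylesheet" href="https://cdn.example.com/bootstrap.min.css">

<!-- Variant 2: unCSS-trimmed framework rules inlined (no extra request,
     but shipped with every page load) -->
<style>.container{margin:0 auto}/* ...remaining minified rules... */</style>
```

The trade-off is request count versus payload cacheability; either way, the rendered page is what ranking signals ultimately see.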
How long does Google take to show the results in the SERP once the pages are indexed?
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions. A little background first: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed the URLs in that sitemap. 1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERP once the page is indexed? 2. Does Google give more preference to websites with a larger number of pages than to those with fewer pages when displaying results in the SERP (I have just 50 pages)? Does the NUMBER of pages really matter? 3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.) An answer from SEO experts will be highly appreciated. Thanks!
Intermediate & Advanced SEO | PepMozBot0
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "&lt;product-type&gt; prices/&lt;type-of-thing&gt;/&lt;city&gt;-&lt;state&gt;". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we were discussing is to 301-redirect or canonical all the city-state pages back to just the "&lt;type-of-thing&gt;" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms, they nearly always include the city-state, so the search is some variation of "&lt;product-type&gt; &lt;type-of-thing&gt; &lt;city&gt; &lt;state&gt;". One thing we thought about doing is dynamically changing the meta data & headers to add the city-state info there. Are there other potential solutions to this?
Intermediate & Advanced SEO | editabletext0
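The canonical option discussed in the question above would be a tag in the head of each city-state variant pointing at the type-of-thing level page, something like this (URLs are hypothetical):

```html
<!-- In the <head> of a city-state variant page, declaring the
     type-of-thing level page as the canonical version -->
<link rel="canonical" href="https://example.com/widget-prices/gadgets/" />
```

Unlike a 301, a canonical keeps the city-state URLs live for visitors while consolidating the duplicate-content signals onto one URL, though it also concedes the city-state rankings to the canonical target.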
Webmaster Index Page significant drop
Has anyone noticed a significant drop in indexed pages within their Google Webmaster Tools sitemap area? We went from 1,300 to 83 between Friday, June 23 and today, June 25, 2012, and no errors or warnings are showing. Please let me know if anyone else is experiencing this, and any suggestions to fix it.
Intermediate & Advanced SEO | datadirect0
Corporate pages and SEO help
We own and operate more than two dozen educational related sites. The business team is attempting to standardize some parts of our site hierarchy so that our sitemap.php, about.php, privacy.php and contact.php are all at the root directory. Our sitemap.php is generated by our sitemap.xml files, which are generated from our URLlist.txt files. I need to provide some feedback on this initiative. I'm worried about adding more stand-alone pages to our root directory and as part of a separate optimization in the future I was planning to suggest we group the "privacy", "about" and "contact" pages in a separate folder. We generally try to put our most important pages/directories for SEO in the root as our homepages pass a lot of link juice and have high authority. We do not invest SEO time into optimizing these pages as they're not pages we're trying to rank for, and I've already been looking into even no-following all links to them from our footer, sitemap, etc. I know that adding these "corporate" pages to a site are usually a standard part of the design process but is there any SEO benefit to having them at the root? And along the same lines, is there any SEO harm to having unimportant pages at the root? What do you guys think out there in Moz land?
Intermediate & Advanced SEO | Eric_edvisors0
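The URLlist.txt → sitemap.xml step described in the question above can be sketched in a few lines of Python (function name, file contents, and URLs are hypothetical, for illustration only):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal sitemap.xml document from a list of URLs,
    as might be read line-by-line from a URLlist.txt file."""
    items = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{items}\n"
        "</urlset>\n"
    )

urls = ["https://example.edu/", "https://example.edu/about.php"]
print(build_sitemap(urls))
```

A sitemap.php page for human visitors could then be generated from the same list, keeping the three artifacts in sync.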
Does domain WhoIs Privacy affect SEO efforts?
Hi guys, I've got a hopefully quick question. I am currently designing a site that is made up of many different domain names as part of a network. I've heard that Google will penalize you if links are passed back and forth between these domains and the registrant information is the same. I have WhoIs privacy on all the domains to stop telemarketers and spam (and hopefully to stop Google from getting suspicious). I'm not doing anything bad or against Google's rules, but I can see how they might think that if I have a huge network and links are being passed between the domains. A friend of mine owns around 2,000 domains, and he wants to put legitimate information on each one and rank them higher; it's an interesting concept, but I won't go into too much detail. So my question is basically: does having WhoIs privacy on all these domains affect me in any way in the SEO process? Will Google count the links passing back and forth as legitimate? Or might it get suspicious and think I am spam? Are there ways to see what server a site is coming from? Should all these sites be on different servers? Any help is much appreciated!
Intermediate & Advanced SEO | itechware0