No index?
-
Hi,
I have about 600 posts and most of them are not really optimized (some of them are Flash photo galleries).
Should I noindex them?
There are just too many to optimize them all.
My website is http://www.soobumimphotography.com/
Thank you
-
Hi,
in my opinion you should noindex pages that combine a high bounce rate with a short visit time (less than 10 seconds). That is the true bounce rate: a visitor spends less than 10 seconds on one page and then leaves the website.
It is also a good idea to noindex, or even delete, posts with 'heavy' content and long page load times ... and Flash galleries tend to be heavy. Long page load times can also drive up the bounce rate.
So you have a choice: keep all 600 posts, or ... maybe 100 good posts ... good content vs. average or poor content.
The answer is simple.
Marek
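For reference, the standard way to noindex an individual post is a robots meta tag in the page's head. A minimal sketch, assuming your theme or an SEO plugin lets you inject it per post:

```html
<!-- Placed in the <head> of a page you want kept out of the index. -->
<!-- "noindex" asks search engines to drop the page from results; -->
<!-- "follow" still lets them follow the links on it. -->
<meta name="robots" content="noindex, follow">
```

An equivalent X-Robots-Tag HTTP response header does the same job for non-HTML files such as PDFs or Flash assets.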
-
Unless they are creating duplicate content issues or other problems, I personally would not noindex them.
Side note: Did you know your Image Search at the bottom of each page takes you off-site to this location "http://pa.photoshelter.com/c/thadallender/search"? I'd fix the action of that form or remove the widget until it's fixed if I were you. Right now it is not searching your site and is sending people off your site.
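If you do end up deciding post by post, it can help to first audit which pages already carry a noindex directive. A minimal sketch using only the Python standard library (the function names and sample markup are illustrative, not from this thread):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        content = attrs.get("content") or ""
        if attrs.get("name", "").lower() == "robots" and content:
            self.directives.append(content.lower())


def is_noindexed(html):
    """Return True if the page's robots meta tag contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Fetch each post's HTML and run it through `is_noindexed` to build a list of pages that are already excluded before changing anything else.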
Related Questions
-
Best practice to have gated white paper indexed by Google
Our main website's white paper page has an image and a brief description of the white paper. Once you click the white paper you are redirected to a form to access the gated white paper. Once you complete that form you are redirected to the white paper PDF, which is housed on a subdomain/HubSpot. Because of this, I do not believe our website is getting "credit" for the keywords/content on these pages. Any suggestions on how we can allow the search engines to crawl this content while still keeping it gated? As I understand it, a subdomain cannot help or hurt the main domain (aside from critical crawler issues). Thank you
On-Page Optimization | | NikCall0 -
Does Google avoid indexing pages that include registered trademark signs?
I suspect that Google often hesitates to index pages that have registered trademarks on them that are marked with a ®. For example, EGOL® used in the title tag or in the tag at the top of the page. Registered trademarks are everywhere, and most retail product pages contain at least one of them. However, most people use the registered trademark names as text in their writing without adding the registered trademark sign ®. Have you experienced a problem getting such pages indexed, or have you read any articles about how Google treats registered trademarks?
On-Page Optimization | | EGOL0 -
Google Console returning 0 pages as being indexed
Hi there, I submitted my site notebuster.net to Search Console over a month ago and it is showing 0 pages as being indexed under the Index Status report. I know this isn't right, as I can see in Google alone, by typing in (site:notebusters.net), that there are 113 pages indexed. Any idea why this might be? Thanks
On-Page Optimization | | CosiCrawley0 -
Help with the indexation of my page
Hi all, I have a problem with my website. When searching site:www.pinesapiensa.com, no pages are indexed, although Webmaster Tools tells me that the sitemap file was processed on 13 May and that the number of indexed pages is 21. What could be happening? I should mention that there are two domains, "www.piensapiensa.es" and "www.piensapiensa.com", addressing the same website, and there's a redirection from piensapiensa.com to piensapiensa.com but it doesn't work properly. Thanks
On-Page Optimization | | juanmiguelcr0 -
Help, a certain directory is not being indexed
Before I start, don't expect this to be too easy. This really has me puzzled, and I am surprised I am yet to find a solution for it. Get ready.
We have a WordPress website, launched over 6 months ago, and have never had an issue getting content such as pages, posts, and categories indexed. However, somewhat recently (about 2 months ago) I installed a directory plugin (Business Directory Plugin) which lists businesses via unique URLs that are accessible from a subfolder. It's these business listings that I absolutely cannot get indexed. The index page of the directory, which links to the business pages, is indexed; however, for some reason Google is not indexing the listing pages that are linked from it.
It's not an issue of the content being uncrawlable, or at least I don't think so, as when I run crawlers on my site, such as XML sitemap crawlers, they find all the pages, including the directory pages, so I am sure it's not an issue of the search engines not finding the content. I have created XML sitemaps and uploaded them to Webmaster Tools; Tools recognises that there are many pages in the XML sitemap, but Google continues to index only a small percentage (everything but my business listings). The directory has been there for about 8 weeks now, so I know there is an issue, as it should have been indexed by now.
See our main website at www.smashrepairbid.com.au and the business directory index page at www.smashrepairbid.com.au/our-shops/
To throw in a curve ball: while looking into this issue and setting up Tools, we noticed a lot of 404 error pages (nearly 4,000). We were very confused about where these were coming from, as they were only being generated by search engines. Humans could not access the 404s, so we are guessing search engines were firing some JavaScript code to generate them, or something else weird.
We could see the 404s in the logs, so we know they were legitimate, but again we feel it was only search engines; this was validated when we added some rules to robots.txt and saw the errors in the logs stop. We put the rules in the robots.txt file to try to stop Google from indexing the 404 pages, as we could not find any way to fix the site/code (no idea what is causing them). If you do a site search in Google you will see all the pages that are omitted from the results. Since adding the rules to robots.txt, our impressions shown through Tools have jumped right up (increased by 5 times), so we thought this was a good indication of improvement, but we are still not getting the results we want.
Does anyone have any clue what's going on, or why Google and other search engines are not indexing this content? Any help would be greatly appreciated, and if you need any other information to assist, just ask. I really appreciate anyone who can spare their time to help me; I sure do need it. Thanks.
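The robots.txt rules referred to above aren't quoted in the post; a hypothetical example of the kind of rule that blocks crawlers from a problem path (the path below is a placeholder) would be:

```text
# Blocks compliant crawlers from fetching anything under this path.
# Note: Disallow stops crawling, not indexing; URLs that are already
# indexed need a noindex directive or a removal request to drop out.
User-agent: *
Disallow: /example-broken-path/
```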
On-Page Optimization | | ziller0 -
Our sitemap is not indexed well
Hey there, hope you guys can help. We get the following error: "Nested indexing. Another sitemap index refers to the index of sitemaps." The thing is that we can't find the error they are talking about. Thanks!!!!
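For context on that error: the Sitemaps protocol does not allow a sitemap index file to list another sitemap index, only regular sitemap files. A minimal valid sitemap index looks like this (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <loc> must point at a regular sitemap, never another index. -->
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```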
On-Page Optimization | | Comunicare0 -
Duplicate page content & title for www.mydomain.com and www.mydomain.com/index.php?
Hi, First post so please be gentle! My Crawl Diagnostics summary is showing an error relating to duplicate page content and duplicate page titles for www.mydomain.com and www.mydomain.com/index.php, which are, in my view, the same thing/page. Could anyone shed any light please? Thanks Carl
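One common fix for exactly this pair of duplicates, assuming both URLs serve the same homepage, is a canonical link on the page pointing at the preferred URL (the domain below is the question's own placeholder):

```html
<!-- In the <head>, served at both http://www.mydomain.com/
     and http://www.mydomain.com/index.php -->
<link rel="canonical" href="http://www.mydomain.com/">
```

A server-side 301 redirect from /index.php to the root achieves the same consolidation more directly.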
On-Page Optimization | | Carl2870 -
Does frequency of content updates affect the likelihood that outbound links will be indexed?
I have several pages on our website with low PR that also link out to lots and lots of pages that are service/product specific. Since there are so many outbound links, I know that the small amount of PR will be spread thin as it is. My question is: if I were to supply fresh content to the top-level pages, and change it often, would that influence whether or not Google indexes the underlying pages? Also, if I supply fresh content to the underlying pages, once Google crawls them, would that guarantee that Google considers them 'important' enough to be indexed? I guess my real question is: can freshness of content and frequency of updates convince Google that the underlying pages are 'worthy of being indexed', and can producing fresh content on those pages 'keep Google's interest', so to speak, despite them having little if any PageRank?
On-Page Optimization | | ilyaelbert0