2000 pages indexed in Yahoo, 0 in Google, no PR. What is wrong?
-
Hello Everyone,
I have a friend with a blog site that has over 2000 pages indexed in Yahoo but none in Google, and no PageRank. The web site is http://www.livingorganicnews.com/ I know it is not the best site, but I am guessing something is wrong and I just don't see it.
Can you spot it? Does he have some settings wrong? What should he do?
Thank you.
-
The site looks like just another site in a blog network. The domain is 5 years old and the home page has a DA and PA of 34, yet it is still not indexed by Google. I searched for site:livingorganicnews.com in Google and it returns no results, which suggests the site has been penalized by Google. Use Google Webmaster Tools for further verification to find the reason.
Most probably it was penalized for being part of a paid blog network.
-
LOL, the fact that there's a tonne of clearly spun content won't help. I gather this is part of a content scraping or sharing network like LinkVine?
Have you tried reading the articles it publishes? It could do with some quality guidelines for what gets accepted, imho.
Even when it gets indexed, it's not going to rank anywhere... this is exactly the kind of site Panda was designed to stop: regurgitated, nonsensical, spun tosh that looks as if it was written by a lunatic and only really exists for the sake of its outgoing links, which point to other rubbish.
I'd tell your friend to give up on this site entirely and start looking at less automated ways of doing things. Google is only going to get tougher and tougher on these sites, so he's fighting a losing battle.
I don't mean to be rude, but I hope it never gets indexed. What value does it offer to anyone? Most people don't want stuff like that clogging up the web. I don't mean to sound harsh, but tell your friend the problem with the site is... it's crap.
-
Another of the many not-quite-right things on the site is older posts like http://www.livingorganicnews.com/games/2010/panasonic-announced-the-jungle-handheld-gaming-platform/1965/ which end with "incoming search terms" and several search terms that all hyperlink back to that exact same article. Search engines will not see that as providing any value to the user (users are already on that page; they don't need a link to it) and will see it as just another attempt to manipulate the engines.
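If you want a quick way to see how widespread that self-linking pattern is, here's a rough Python sketch (nothing official, just assuming the requests and beautifulsoup4 packages are installed) that lists every anchor on a page that points straight back to the page itself:

```python
# Rough sketch: list anchors on a page that link back to the page itself.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urldefrag

def find_self_links(page_url):
    """Return anchor texts of links pointing back to the same page."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    page_clean = urldefrag(page_url)[0].rstrip("/")
    hits = []
    for a in soup.find_all("a", href=True):
        target = urldefrag(urljoin(page_url, a["href"]))[0].rstrip("/")
        if target == page_clean:
            hits.append(a.get_text(strip=True))
    return hits

if __name__ == "__main__":
    url = ("http://www.livingorganicnews.com/games/2010/"
           "panasonic-announced-the-jungle-handheld-gaming-platform/1965/")
    for anchor_text in find_self_links(url):
        print("self-link:", anchor_text)
```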
-
It is interesting to have a new set of eyes here. I had noticed his different writing but figured it was because English is not his first language. I will ask if he is actually writing this.
-
Keri is absolutely right.
I had not looked at the site's content before. It couldn't be much worse. It is a 100% spam site that should never be indexed. Clearly the site is under a penalty.
Google's job is to satisfy a user's search query by giving them the content they seek. If you create a site like that, NO ONE will ever want to get that site as the result of a search query. Google correctly recognizes this fact and removes the site from their database.
-
When there are a couple of thousand other pages like this, yes.
http://www.livingorganicnews.com/games/2011/get-cool-with-selected-berber-carpet-tiles-now/3215/
The subject of the article is berber carpet tiles, yet the text contains links whose anchor text ("extended stay motel rates", "Provillus scam", "capatrex scam" in the examples below) is totally off base and makes no sense. For example:
"The berber carpet tiles might also be renowned for the durability and stain resistance at extended stay motel rates."
"To get rid of the difficult to vacuum Provillus scam dust particles..."
"An important benefit in using berber carpet tiles is a likelihood to eliminate the damaged location alone and replace it with a new carpet tile, a comparatively low-cost way of capatrex scam damage control, to make your ground look just like new."
-
Absolutely.
It is entirely possible he has been removed from Google's index as a result of a penalty. If he links to sites that have received a penalty (mobile casinos would be a very bad choice of sites to link to), then his site could receive a penalty as well.
My suggestion is not to jump to the conclusion that the site is under a penalty. Start by checking WMT; if nothing is discovered there, submit the sitemap. If you don't see any results after a few days, then move on to asking Google whether the site is under a penalty.
-
The text doesn't really seem like a human wrote it. The most recent article is titled "Religious Credit card debt Enable Provides You With the Meaningful and Economical You Need". Other posts are about acne treatment reviews, alcoholism, and other seemingly random things.
It really looks like it has been through an article spinner. The article about alcoholism ends with "So, Think before you Beverage." Uh... really? Or what about "As emission safety glasses are put on in the office, they need to provide ease and comfort, safe healthy and crystal clear eyesight to make sure they are usually not golf clubs to the wearer." One article I found that wasn't spun is instead indexed 94 other times on the web.
I would say the content is why Google has not indexed it. They can't see any value to the user in returning it as a search result. Is this truly the content that your friend has put up, or has the site been hacked?
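If your friend insists the articles are his own, one crude sanity check is to diff one of his posts against a copy you find elsewhere. A minimal standard-library sketch (the file names here are just placeholders for text you've saved locally):

```python
# Rough sketch: compare two locally saved article bodies for near-duplication.
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a rough 0..1 similarity score based on word sequences."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

with open("our_article.txt", encoding="utf-8") as f:
    ours = f.read()
with open("copy_found_elsewhere.txt", encoding="utf-8") as f:
    theirs = f.read()

score = similarity(ours, theirs)
print(f"similarity: {score:.2f}")
if score > 0.8:
    print("Looks like a straight copy or a lightly spun rewrite.")
```

It won't tell you how many copies exist on the web (that still takes a search for an exact phrase in quotes), but it will tell you whether two pieces of text are essentially the same article.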
-
Hello Bryce,
Losing credibility sounds plausible, but could that really be the reason for the site not being indexed at all?
-
Thank you Ryan,
I will ask him about GWT. Perhaps it is just a sitemap issue, but I wonder why Yahoo would spot it and Google would totally miss it. I often see a difference in the number of pages they index, but this is the first time I have seen thousands versus zero.
-
I'm thinking that by linking out to Mobile Casinos and Polish Rock Bands, he's probably losing credibility.
-
I didn't notice any obvious problem with your site. Have you logged into Google Webmaster Tools and looked at the site? That would be the logical next step.
The robots.txt file looks fine, there is no "noindex" tag on the home page, a GA code is present on the page, and so on. I would suggest reviewing the site in Google's WMT and looking for any issues.
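If it helps, here is a rough Python sketch of those same checks (assuming the requests and beautifulsoup4 packages; it only looks at robots.txt, the meta robots tag and the X-Robots-Tag header, so treat it as a starting point rather than a verdict):

```python
# Rough sketch: basic indexability checks for a single URL.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def audit_url(url, user_agent="Googlebot"):
    parts = urlparse(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"

    # 1. Is the URL blocked by robots.txt?
    rp = RobotFileParser(robots_url)
    rp.read()
    print("robots.txt allows crawling:", rp.can_fetch(user_agent, url))

    # 2. Does the response carry a noindex directive?
    resp = requests.get(url, timeout=10)
    print("HTTP status:", resp.status_code)
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))

    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    print("meta robots tag:", meta.get("content", "empty") if meta else "none")

if __name__ == "__main__":
    audit_url("http://www.livingorganicnews.com/")
```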
If none are present, the next step would be to submit a sitemap. If your friend does not have a sitemap already set up, you can use http://www.xml-sitemaps.com/. I think the free version only maps 500 pages, but that is enough to get you started.
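And if the 500-page cap on the free tool becomes a problem, a basic sitemap is easy enough to roll yourself. A minimal sketch, assuming you can export the post URLs to a plain text file (one URL per line; the file name here is made up):

```python
# Rough sketch: write a basic sitemap.xml from a plain list of URLs.
from xml.sax.saxutils import escape

def write_sitemap(urls, path="sitemap.xml"):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

# Hypothetical input file exported from the blog's CMS or database.
with open("urls.txt", encoding="utf-8") as f:
    write_sitemap([line.strip() for line in f if line.strip()])
```

A single sitemap file can hold up to 50,000 URLs, so a couple of thousand blog posts fit comfortably in one file; it can then be submitted through WMT.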
Related Questions
-
Which pages should I index or have in my XML sitemap?
Hi there, my website is ConcertHotels.com - a site which helps users find hotels close to concert venues. I have a hotel listing page for every concert venue on my site - about 12,000 of them I think (and the same for nearby restaurants), e.g. https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484 Each of these pages lists the hotels near that concert venue. Users clicking on an individual hotel are brought through to a hotel (product) page, e.g. https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818
I made a decision years ago to noindex all of the /hotel/ pages since they don't have a huge amount of unique content and aren't the pages I'd like my users to land on. The primary pages on my site are the /venue-hotels/ listing pages. I have similar pages for nearby restaurants, so there are approximately 12,000 venue-restaurants pages, again, one listing page for each concert venue.
However, while all of these pages are potentially money-earners, in reality the vast majority of subsequent hotel bookings have come from a fraction of the 12,000 venues. I would say 2000 venues are key money-earning pages, a further 6000 have generated income at a low level, and 4000 are yet to generate income.
I have a few related questions:
1. Although there is potential for any of these pages to generate revenue, should I be brutal and simply delete a venue if it hasn't generated revenue within a time period, and just accept that, while it "could" be useful, it hasn't proven to be and isn't worth the link equity? Or should I noindex these "poorly performing" pages?
2. Should all 12,000 pages be listed in my XML sitemap? Or simply the ones that are generating revenue, or perhaps just the ones that have generated significant revenue in the past and have proved to be most important to my business?
Thanks Mike
Technical SEO | | mjk260 -
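One rough way to act on the split described in that question (purely illustrative; the CSV columns, file names and threshold are made up) would be to build the sitemap only from the venue pages that have actually earned revenue:

```python
# Rough sketch: include only revenue-generating venue pages in the sitemap.
import csv
from xml.sax.saxutils import escape

REVENUE_THRESHOLD = 0.01  # anything that has ever converted

def build_sitemap(csv_path="venue_pages.csv", out_path="sitemap-venues.xml"):
    keep = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expected columns: url, revenue
            if float(row["revenue"]) >= REVENUE_THRESHOLD:
                keep.append(row["url"])
    with open(out_path, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in keep:
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        out.write("</urlset>\n")
    print(f"kept {len(keep)} of the venue pages")

build_sitemap()
```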
Google has deindexed a page it thinks is set to 'noindex', but is in fact still set to 'index'
A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?
Technical SEO | | d.bird0 -
My website's pages are not being indexed correctly
Hi, One of our websites, which is actually a price comparison engine, is facing an indexing problem at Google. When we check "site:mywebsite.com", there are lots of pages indexed which are not from mywebsite.com but from merchants' websites. The index result page also shows the merchant's page title. In some cases the title is from the merchant's site, but when the given link is accessed it points to mywebsite.com/index. Also, the cache displays the merchant's product page as the last indexed version rather than showing ours. Mywebsite.com has quite a few merchants that send us their product feed. Those products are listed on the comparison page with prices. The merchant links on the comparison page are all no-follow links, but some (not all) of the merchants' product pages are indexed against mywebsite.com as mentioned above, instead of the product comparison page of mywebsite.com. How can we fix the issue? Thanks!
Technical SEO | | digitalMSB0 -
Google not indexing /showing my site in search results...
Hi there, I know there are answers all over the web to this type of question (and in Webmaster Tools), however, I think I have a specific problem that I can't really find an answer to online. The site is: www.lizlinkleter.com
Firstly, the site has been live for over 2 weeks. I have done everything from adding analytics, to submitting a sitemap, to adding it to Webmaster Tools, to fetching each individual page as Googlebot and then submitting it to the index via Webmaster Tools. I've checked my robots file and code elsewhere on the site, and the site is not blocking search engines (as far as I can see). There are no security issues in Webmaster Tools or MOZ.
Google says it has indexed 31 pages in the 'Index Status' section, but on the site dashboard it says only 2 URLs are indexed. When I do a site:www.lizlinketer.com search the only results I get are pages that are excluded in the robots file: /xmlrpc.php & /admin-ajax.php.
Now, here's where I think the issue stems from: I developed the site myself for my wife and I am new to doing this, so I developed it on the live URL (I now know this was silly). I did block the content from search engines and have the site passworded, but I think Google must have crawled the site before I did this. The issue with this was that I had pulled in the WordPress theme's dummy content to make the site easier to build, so lots of nasty dupe content.
The site took me a couple of months to construct (working on it on and off) and I eventually pushed it live and submitted it to Analytics and Webmaster Tools (obviously it was all original content at this stage). But this is where I made another mistake: I submitted an old sitemap that had quite a few old dummy-content URLs in it. I corrected this almost immediately, but it probably did not look good to Google.
My guess is that Google is punishing me for having the dummy content on the site when it first went live - fair enough, I was stupid - but how can I get it to index the real site?! My question is, with no tech issues to clear up (I can't resubmit the site through Webmaster Tools), how can I get Google to take notice of the site and have it show up in search results? Your help would be massively appreciated! Regards, Fraser
Technical SEO | | valdarama0 -
Problems with too many indexed pages
A client of ours has not been able to rank very well over the last few years. They are a big brand in our country, have more than 100 offline stores and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400,000-450,000. During our latest push we used the robots meta tag with "noindex, nofollow" on all pages we wanted to get out of the index, along with a canonical to the correct URL - nothing was done to robots.txt to block the crawlers from entering the pages we wanted out. Our aim is to get it down to roughly 5000+ pages. They just passed 5000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all the pages out of the index? The page is vita.no if you want to have a look!
Technical SEO | | Inevo0 -
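For the scenario in that question, a quick way to confirm the tags are actually being served as intended is to spot-check a sample of the URLs you want dropped. A rough sketch (the input file name is hypothetical; it assumes requests and beautifulsoup4):

```python
# Rough sketch: verify that pages meant for removal serve noindex + canonical.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def check_page(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", rel="canonical")
    return {
        "url": url,
        "status": resp.status_code,
        "meta_robots": robots.get("content") if robots else None,
        "canonical": canonical.get("href") if canonical else None,
    }

with open("urls_to_deindex.txt", encoding="utf-8") as f:
    for line in f:
        result = check_page(line.strip())
        robots_value = (result["meta_robots"] or "").lower()
        flag = "OK" if "noindex" in robots_value else "MISSING noindex"
        print(flag, result)
```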
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Please advise.
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
Technical SEO | | BVREID0 -
De-indexing millions of pages - would this work?
Hi all, We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers.
Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this: 301 redirect all old SERP URLs to a new SERP URL. If the new URL should not be indexed, add a meta robots noindex tag on the new URL. When it is evident that Google has indexed most "high quality" new URLs, use robots.txt to disallow crawling of the old SERP URLs, then directory-style remove all old SERP URLs in the GWT URL Removal Tool.
This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some useless 50,000 "add to cart" URLs. Google says themselves that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose all too long.
And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301. By then we would be out of business. Best regards, TalkInThePark
Technical SEO | | TalkInThePark0 -
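As a side note on point 1 of that plan, it is easy to spot-check a handful of URLs to confirm the old URL really returns a 301 and the new target really carries noindex before relying on Google to sort it out. A rough sketch using the placeholder example URLs from the question (assuming requests and beautifulsoup4):

```python
# Rough sketch: check that an old URL 301s and its target serves noindex.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def check_redirect_and_noindex(old_url):
    first = requests.get(old_url, allow_redirects=False, timeout=10)
    print("old URL status:", first.status_code)  # expecting 301
    location = first.headers.get("Location")
    if not location:
        print("no redirect target found")
        return
    target = urljoin(old_url, location)
    print("redirects to:", target)

    final = requests.get(target, timeout=10)
    meta = BeautifulSoup(final.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    print("new URL meta robots:", meta.get("content") if meta else "none")

check_redirect_and_noindex(
    "http://www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2"
)
```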
How best to set up Google+ business pages for clients
I wish to set up a Google+ business page for my clients, but it requires a personal profile, and my clients don't want a personal profile - they only want the business page. Currently I have set them up with pages on my personal profile, but can I allow the client to manage them? I am not sure this is the best way. What's the best way for web developers to set up Google+ accounts for clients?
Technical SEO | | Bristolweb1