Questions created by loopyal
-
Is Googlebot the slowest bot?
This morning, I wrote a breaking news story about a "Wolf of Wall Street". It was published at 12:05:49. Googlebot, which used to be on my site within a minute or less, didn't bother to visit for 53 minutes. And now, 32 minutes later, even though the page has been crawled, the story still doesn't show up in Google search. Except that it is in the top 10 stories today, at #2, so the headline appears on every page of the site; every page crawled today from about 10 minutes after publication contains that text, so those pages show up. EINnews, which also crawls our pages, is listed for the headline text. Finally, the page turns up in search results 4 hours later, and says that it is 4 hours old. Does anyone else see this slow-motion mode? If you do, what is wrong with the site that causes this recalcitrant behavior? The headline of the story is "A 'Wolf of Wall Street' Raided By FBI In Florida" and the link is http://shar.es/1bW5Sw
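For anyone wanting to check the same thing on their own site: the crawl timestamps come straight out of the server access log. A minimal sketch, assuming a combined-format log - the log path and story path are made-up placeholders:

```python
# Minimal sketch: list Googlebot requests for one story in an access log.
# LOG_PATH and STORY_PATH are hypothetical placeholders - substitute your own.
LOG_PATH = "/var/log/nginx/access.log"
STORY_PATH = "/story/wolf-of-wall-street"

with open(LOG_PATH) as log:
    for line in log:
        # The combined log format carries the user agent in the last quoted
        # field, so a substring check is enough for a quick look.
        if "Googlebot" in line and STORY_PATH in line:
            print(line.strip())  # the timestamp shows when Googlebot visited
```

The gap between the publish time and the first Googlebot line is the crawl delay; everything after that is pure indexing lag.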
Algorithm Updates | | loopyal0 -
Does anyone have a clue about my search problem?
After three years of destruction, my site still has a problem - or maybe more than one. OK, I understand I had - and probably still have - a Panda problem. The question is: does anyone know how to fix it, without destroying everything? If I had money, I'd gladly give it up to fix this, but all I have is me, a small dedicated promotions team, 120,000+ visitors per month and the ability to write, edit and proofread. This is not an easy problem to fix. After completing more than 100 projects, I still haven't got it right; in fact, what I've done over the past 2 months has only made things worse - and I never thought I could do that. Everything has been measured, so as not to destroy our remaining ability to generate income, because without that, it's the end of the line. If you can help me fix this, I will do anything for you in return - as long as it is legal, ethical and won't destroy my reputation or hurt others. Unless you are a master jedi guru, and I hope you are, this will NOT be easy, but it will prove that you really are a master, jedi, guru and time lord, and I will tell the world and generate leads for you. I've been doing website and SEO work since 1996 and I've always been able to solve problems and fix anything I needed to work on. This has me beaten. So my question is: is there anyone here willing to take a shot at helping me fix this, without the usual responses of "change domains", "delete everything and start over" or "you're screwed"? Of course, it is possible that there is a different problem, nothing to do with algorithms - a hard-coded bias or some penalizing setting that I don't know about, a single needle in a haystack. This problem results in a few visible things. 1. Some pages are buried in supplemental results. 2. Search bots pick up new stories within minutes, but they show up in search results many hours later. Here is the site: http://shar.es/EGaAC On request, I can provide a list of all the things we've done or tried. (Actually, I have to finish writing it.) Some notes: There is no manual spam penalty. All outgoing links are nofollow, and have been for 2 years. We never paid for incoming links. We did sell text advertising links 3-4 years ago, using text-link-ads.com, but removed them all 2 1/2 years ago. We did receive payment for some stories, 3-4 years ago, but all have been removed. One more thing. I don't write much - I'm a better editor than a writer - but I wrote a story that had 1 million readers. The massive percentage of 0.0016% came from you-know-who. Yes, 16 visitors. And this was an exclusive, unique story. And there was a similar story, with half a million readers. Same result. Seems like there might be a problem!
Intermediate & Advanced SEO | | loopyal0 -
Clicking on SERPs
I've spoken to 2 people in the past few weeks who have clients who search for their keywords every day, and then click on their own links, but not others, in the belief that this is good for their website ranking. Google does track clicks with redirects, so I can understand that they would know someone searched for a term and clicked on a result. I could understand if it was once per month, but doing it every day? And they keep coming back to search for other keywords and click on those results too. Looks like they would just paint a very large target on themselves. What's your opinion?
Search Behavior | | loopyal0 -
Anyone good at ranking on Yahoo?
I don't get Yahoo or Bing these days! I wrote a story, "Did David Wineland and Serge Haroche Steal Idea For The Nobel Physics Prize?" It is a unique story, sourced from a prince. It is referenced on many other sites. It has had about 115,000 pageviews so far, and most of those people read 2 or 3 pages. Do Yahoo or Bing list my story? No. They list all the referencing sites; they even list unrelated stories on my site that show that headline in the "top ten stories of the day" widget, but not the story itself. It has about 1250 shares. What am I doing wrong? http://buff.ly/WXso4P Is it page structure?
Content Development | | loopyal0 -
How fast is my front page?
Yesterday, I changed my entire front page structure from tables to divs. I think this has improved page load time, but I am in Australia, so it is hard to tell. Firefox with Firebug tells me the load time here is between 4 and 6 seconds. One of my editors is in Houston, and she says 2 seconds. I'm hoping you can help me; it will take less than a minute. Can you load the front page and tell me how long it takes - and where you are (country/state)? Also, if you click through to a story, how long does that take? http://newsblaze.com I am working on the story page template too, but it will take longer to get right, because it is shared with 3 other areas, so I have to be more careful. It would also be nice to get a before and after snapshot from various places. The reason I care about shaving off a second or two is that I've been told Google may now care about loading speed, and they are rejecting my new AdSense account because of "poor user experience" on my site, and I have no idea what they mean by that, so I'm clutching at straws.
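If it helps, here is a rough timer you can run from wherever you are - it measures network transfer only, not browser rendering, so treat the numbers as a lower bound:

```python
# Rough page-fetch timer: measures DNS + connect + transfer, not rendering.
import time
import urllib.request

URL = "http://newsblaze.com/"

start = time.time()
with urllib.request.urlopen(URL, timeout=30) as resp:
    body = resp.read()
elapsed = time.time() - start

print(f"Fetched {len(body):,} bytes in {elapsed:.2f} seconds")
```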
International SEO | | loopyal0 -
New adsense account request rejected - need help
I'm moving my company to Australia and shutting down the US company. Google said I had to request a new AdSense account, so I did. They opened the account, I added the same ads in the same places, and they have rejected my application. What do I do now? The other account has been open since 2004. They never said a word about this before. After two years of working on improvements, now I'm just about destroyed. I need some help, because I thought I knew what I was doing, but obviously not! As usual, their helpful response is no help at all. http://bit.ly/NPACk - there are no G ads on the front page http://bit.ly/V8ubB5 - this is a typical story http://bit.ly/UpTC2r - this is a typical press release As mentioned in our welcome email, we conduct a second review of your AdSense application once AdSense code is placed on your site(s). As a result of this review, we have disapproved your account for the following violation(s): Issues: - Site does not comply with Google policies --------------------- Further detail: Site does not comply with Google policies: We're unable to approve your AdSense application at this time for one of the reasons listed below or another reason listed in our program policies ([https://support.google.com/adsense/bin/topic.py?topic=1271507](https://support.google.com/adsense/bin/topic.py?topic=1271507)). We recommend that you review the information provided below and make the necessary changes to your site. 1\. You need to improve your site’s user experience To ensure a good experience for users and advertisers, publishers participating in the AdSense program are required to adhere to the Webmaster Quality guidelines ([http://www.google.com/support/webmasters/bin/answer.py?answer=35769](http://www.google.com/support/webmasters/bin/answer.py?answer=35769)). These guidelines provide many tips to help you to provide a positive experience for your users. You’ll also find more useful information in this AdSense blog post which highlights five user experience principles: [http://adsense.blogspot.com/2012/10/publisher-insights-part-1-5-principles.html](http://adsense.blogspot.com/2012/10/publisher-insights-part-1-5-principles.html). Applying these principles will help you to provide a great experience for users on your site. 2\. Your site is a chat site which is not compliant with our policy Publishers are encouraged to experiment with a variety of ad placements and ad formats. However, as stated in our program policies ([http://support.google.com/adsense/bin/answer.py?hl=en&answer=48182](http://support.google.com/adsense/bin/answer.py?hl=en&answer=48182)), AdSense publishers may not place ad code, search boxes or search results in chat programs. This includes, but is not limited to, instant messaging (IMs), chat sites and other pages that contains dynamic content. 3\. You need to remove all content that encourages violation of Google product policies Publishers may not provide the means to circumvent the policies of any Google products, such as by allowing users to download YouTube videos, or encourage the violation of Google AdSense policies. Moreover, publishers may not make use of Google brand features such as logos, screenshots, or other distinctive features without our express permission. For more information, please visit our Help Center ([http://support.google.com/adsense/bin/answer.py?hl=en&ctx=as2&answer=1348688&rd=1](http://support.google.com/adsense/bin/answer.py?hl=en&ctx=as2&answer=1348688&rd=1)). 4\. 
Your site is dedicated to the sale and distribution of term papers We’re happy to see our publishers’ sites full of useful and informative content, however, as stated in our program policies ( [https://www.google.com/adsense/support/as/bin/answer.py?hl=en&answer=105953](https://www.google.com/adsense/support/as/bin/answer.py?hl=en&answer=105953) ), the sale or distribution of term papers, or any other content that is illegal, promotes illegal activity, or infringes on the legal rights of others is not allowed. Please review the AdSense program policies ([http://support.google.com/adsense/bin/answer.py?hl=en&answer=48182](http://support.google.com/adsense/bin/answer.py?hl=en&answer=48182)) to ensure that your site meets all of the requirements for approval. As soon as you’ve made the necessary changes, we’ll be happy to take another look at your application.
On-Page Optimization | | loopyal0 -
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (have already deleted 1.5 million). Google buries many of our pages, so I'm ready to try subdomaining. http://bit.ly/dczF5y There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other one runs every night after midnight and finds a few, which are then handled manually. This helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not. The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect. Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up, on the front page, often in position #2 or #3. According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem that we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us, and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different, so this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next. Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published. Also, at one time, a system code race condition - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.
I haven't sent in a reconsideration request for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last few ideas. 1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week). 2. Do what the other sites do, which I believe creates little value: show only a headline and snippet and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.) 3. Give up on the PRs and delete them all and lose another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throw all our alliances under the bus. (I really don't want to do this.) There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting them off into subdomains will have a number of effects: 1. Take most of the syndicated content onto subdomains, so it's not on the main domain. 2. Shake up the Domain Authority. 3. Create a million 301 redirects. 4. Make it obvious to the crawlers what is our news and what is PRs. 5. Make it easier for Google News to understand. Here is what I plan to do: 1. Redirect all PRs to their own subdomain - pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc. (a rough sketch of this is below). 2. Fix all references so they use the new subdomain. Here are my questions - and I hope you may see something I haven't considered. 1. Do you have any experience of doing this? 2. What was the result? 3. Any tips? 4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them only in one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not on the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. These would have to stay on the main domain because they are a mixture of different PR providers. 5. Is this a bad idea? I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
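In case it makes the plan clearer, here is a minimal .htaccess sketch of step 1 - the /pr/prnewswire/ and /pr/businesswire/ source paths are hypothetical placeholders, not our real URL structure:

```apache
# Minimal sketch: 301 press releases to per-provider subdomains.
# The /pr/... source paths are hypothetical examples only.
RewriteEngine On
RewriteRule ^pr/prnewswire/(.*)$   http://pn.domain.com/$1 [R=301,L]
RewriteRule ^pr/businesswire/(.*)$ http://bw.domain.com/$1 [R=301,L]
```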
On-Page Optimization | | loopyal0 -
Do you think Google is destroying search?
I've seen garbage in Google results for some time now, but it seems to be getting worse. I was just searching for a line of text that was in one of our stories from 2009. I just wanted to check that story and I didn't have a direct link. So I did the search and I found one copy of the story, but it wasn't on our site. I knew that it was on the other site as well as ours, because the writer writes for both publications. What I expected to see was the two results, one above the other, depending on which one had more links or better on-page optimization for the query. What I got didn't really surprise me, but I was annoyed. In the #1 position was the other site. That was OK by me, but ours wasn't there at all. I'm almost used to that now (not happy about it and trying to change it, but not doing well at all, even after 18 months of trying). What really made me angry was the garbage results that followed. One site, a WordPress blog, has tag pages and category pages being indexed. I didn't count them all, but my guess is about 200 results from this blog, one after the other, most of them tag pages, with the same content on every one of them. Then the tag pages stopped and it started with dated archive pages, dozens of them. There were other sites, some with just one entry, some with dozens of tag pages. After that, porn sites, hundreds of them. I got right to the very end - 100 pages of 10 results per page. That blog seems to have done everything wrong, yet it has interesting stats. It is a PR6, yet Alexa ranks it 25,680,321. It has the same text in every headline. Most of the headlines are very short. It has all of the category, tag and archive pages indexed. There is a link to the designer's website on every page. There is a blogroll on every page, with links out to 50 sites. None of the pages appear to have a description. There are dozens of empty H2 tags, and the H1 tag is 80% of the way through the document. Yet Google lists all of this stuff in the results. I don't remember the last time I saw 100 pages of results; it hasn't happened in a very long time. Is this something new that Google is doing? What about the multiple tag and category pages in results - is this just a special thing Google is doing to upset me, or are you seeing it too? I did eventually find my page, but not in that list. I found it by using site:mysite.com in the search box.
Algorithm Updates | | loopyal0 -
If you have G+ buttons on your site, does Google still suggest you add them?
We've had G+ buttons on the site for many months now (I can't remember exactly when they were added), yet in Google Webmaster Tools, they still give me this message: "Get more recommendations in Google Search and grow your audience on Google+. Add the Google+ badge to your site." Is this happening to everyone, or is it just me? Do they think the buttons aren't there? Also, they say this: "Your site doesn't have enough +1's yet to show characteristics." According to the stats, 551 unique people have +1'd our pages. How many does it take to get stats? Anyone willing to share their stats?
Reporting & Analytics | | loopyal0 -
Google spitting out old data as new alerts
Am I just unlucky, or are others seeing this too? I have several Google Alerts. For the past 6 months, Google keeps sending crap along with the good stuff. It's a bit like their search results. There are three types of alerts they send that I'm not impressed with. 1. Alerts from splogs that take real news stories and rewrite them into unintelligible garbage that makes no sense at all. Sometimes they serve up new alerts from the same splogs I saw several months ago, which I felt sure they would have zapped by now. 2. Old stories that have been around for months. I just received one that was from January, from TechDirt, a big site that must get a huge amount of attention from Google. 3. Irrelevant stories, because they love to show how smart they are by splitting my alert keyword text into multiple words, but it gives useless results. This is the kind of stuff that crappy search engines like AltaVista used to do. Is Google reverting to the childhood of search with all these changes?
Algorithm Updates | | loopyal0 -
Can bad text URLs hurt pages?
If you have some pages that contain plain-text URLs (not anchored links) that used to be good URLs but are now bad - either because the website shut down or because it has been acquired by someone else and is now parked (or worse) - are those URLs enough to cause quality problems? For example: "This information was brought to you by Waymaker http://www.waymaker.net" These aren't the only ones. And yes, I know I should fix them, but there are probably 10,000 pages like this. I will fix them, but it's not something I can do in a few minutes. (This one is easy to fix programmatically, but others are a lot more complex.) So my question is: do you have actual experience that these are bad enough to cause ranking problems (making the pages low quality)?
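The easy cases can at least be found programmatically before deciding how to fix them. A minimal sketch, assuming pages live as HTML files on disk and you maintain your own list of dead domains (both assumptions):

```python
# Minimal sketch: find pages containing plain-text URLs on known-dead domains.
# CONTENT_DIR and DEAD_DOMAINS are hypothetical placeholders - substitute your own.
import pathlib
import re

CONTENT_DIR = pathlib.Path("/var/www/html/stories")
DEAD_DOMAINS = {"www.waymaker.net"}

url_pattern = re.compile(r"https?://([\w.-]+)")

for page in CONTENT_DIR.rglob("*.html"):
    text = page.read_text(errors="ignore")
    for match in url_pattern.finditer(text):
        if match.group(1).lower() in DEAD_DOMAINS:
            print(f"{page}: {match.group(0)}")
```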
On-Page Optimization | | loopyal0 -
Do new Mozzers realise it takes effort to respond to their questions?
I see a lot of people asking questions and getting some pretty good responses, but the people who are responding often do not get a thumbs up for their answer. If you are new and you are reading this, then maybe you do not understand that a thumbs up actually helps the person who gave their time to answer your question. There are several ways it helps, including giving feedback that the answer, or the attempt to answer, was actually useful; that you learned something; or that you appreciate the time it took to check your question and give you feedback that you don't get in other forums, where you might be ignored for weeks or months. This is a collaborative forum, and we are all here to learn something and to pass some of our knowledge to others who need it. Not every answer we give needs a thumbs up, but if you got something out of the answer, then surely it is worth a second of your time to say "Thanks, that helped me" or "Yes, I agree with this"
Moz Pro | | loopyal13 -
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to the static HTML pages anywhere on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of to put a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS. Have you seen this problem and beaten it? (Of course, the next thing is Roger might look at Google results and start crawling them too, LOL.) P.S. The reason I am not asking this question in the Google forums is that others have asked it many times, and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet, because you guys are always willing to try.
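One idea I'm considering, if anyone can confirm it works: since the CDN pulls pages from an origin I do control, I could send an X-Robots-Tag response header - which Google treats like a robots meta tag - only when a page is requested via the CDN hostname. A rough nginx sketch; cdn.example.com is a placeholder, and I'm assuming the CDN both requests the origin under that hostname and passes origin headers through:

```nginx
# Rough sketch: noindex anything served under the CDN hostname, so only
# the real pages on the main domain stay in the index.
# cdn.example.com is a hypothetical placeholder for the CDN host.
server {
    listen 80;
    server_name cdn.example.com;

    location / {
        add_header X-Robots-Tag "noindex, nofollow";
        root /var/www/html;
    }
}
```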
Algorithm Updates | | loopyal0 -
http://dev.visualwebsiteoptimizer.com
Hello. http://dev.visualwebsiteoptimizer.com seems to be hanging all the pages. Most pages will not load for me. Even your contact page will not load. The forum seems OK. Alan
Moz Pro | | loopyal0 -
How come my Moz Points stopped incrementing?
I was so happy when my Moz Points reached 100 today, but I've done a whole lot more today and it still says 100. Have I broken the system, or is it resting today? (It is Sunday where I am.)
Moz Pro | | loopyal0 -
Does Google have the worst site usability?
Google tells us to make our sites better for our readers, which we are doing, but do you think Google has horrible site usability? For example, in Webmaster Tools, I'm always being confused by their changes and the way they just drop things. In the HTML suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check. In the URL removals area, they used to show you the URLs they had removed. Now that is gone, and the only way you can check is to try adding one. We don't have any URL parameters, so any parameters are the result of some other site tacking stuff onto the end of our URLs, and there is no way to tell them that we don't have any parameters, so ignore them all. Also, they add new parameters they find to the end of the list, so the only way to check is to click through to the end of the list.
Algorithm Updates | | loopyal0 -
Bing Traffic drop
Anyone notice a drop in Bing traffic on January 26th? Slashed by 50% and holding for the past three weeks. 100% white-hat site.
Algorithm Updates | | loopyal0 -
Why does the SEOMoz crawler ignore robots.txt?
The SEOMoz crawler ignores robots.txt. It also "indexes" pages marked as noindex. That means it is filling up the reports with things that don't matter. Is there any way to stop it doing that?
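For anyone else fighting this: the Moz crawler identifies itself as rogerbot, so in theory a robots.txt block like the one below should keep it out (the /private/ path is just an example) - the question is whether it actually honors it:

```
User-agent: rogerbot
Disallow: /private/
```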
Moz Pro | | loopyal0 -
Crawl Diagnostics Report Lacks Information
When I look at the crawl diagnostics, SEOMoz tells me there are 404 errors. This is understandable, because some pages were removed. What this report doesn't tell me is how those pages were discovered. This is a very important piece of information, because it would tell me there are links pointing to those pages, either internal or external. I believe the internal links have been removed. If the report told me how it found the link, I would be able to take immediate action. Without that information, I have to do a lot of investigation. And when you have a million pages, that isn't easy. Some possibilities: 1. The crawler remembered the page from the previous crawl. 2. There was a link from an index page - i.e. it is still in the database. 3. There was an individual link from another story - so now there are broken links. 4. Ditto, but it is on a static index page. 5. The link was from an external source - I need to make a redirect. Am I missing something, or is this a feature the SEOMoz crawler doesn't have yet? What can I do (other than check all my pages) to discover this?
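In the meantime, the server log can answer part of this: for hits from outside, the Referer on the 404 requests usually names the page that still carries the link. A rough sketch, assuming a combined-format log at a made-up path:

```python
# Rough sketch: pull the referrer out of 404 hits in a combined-format access log.
# LOG_PATH is a hypothetical placeholder - substitute your own.
import re

LOG_PATH = "/var/log/nginx/access.log"
# combined format: ... "GET /path HTTP/1.1" 404 bytes "referer" "user-agent"
pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" 404 \S+ "([^"]*)"')

with open(LOG_PATH) as log:
    for line in log:
        match = pattern.search(line)
        if match:
            url, referer = match.groups()
            print(f"404 on {url}  <- linked from {referer or '(no referrer)'}")
```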
Moz Pro | | loopyal0 -
Should I delete a page that gets search traffic but that I don't care about?
I have a page on my site that consistently gets traffic, every month. Googlers seem to love it. But I don't like it at all. Webmaster Tools shows that Google allows us a certain number of search impressions each day - it flatlines, so they are limiting the impressions we get. We also get the same number of clickthroughs each day. So my question is for anyone who has this same experience, who may have experimented by deleting a page you don't care about. Did you just lose that number of clicks each day, or did other pages on your site get displayed and clicked through instead?
Reporting & Analytics | | loopyal0 -
Is Googlebot ignoring directives? Or is it me?
I saw an answer to a question in this forum a few days ago that said it was a bad idea to use robots.txt to tell Googlebot to go away. That SEO said it was much better to use the META tag to say noindex,nofollow. So I removed the robots directive and added the META tag <meta robots='noindex,nofollow'> Today, I see Google showing my send-to-a-friend page where I expected the real page to be. Does it mean Google is stupid? Does it mean Google ignores the robots META tag? Does it mean short pages have more value than long pages? Does it mean if I convert my whole site to snippets, I'll get more traffic? Does it mean garbage trumps content? I have more questions, but this is more than enough.
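One thing I should probably double-check before blaming Googlebot: the documented form of the tag puts robots in a name attribute, with the directives in a content attribute - a tag written the way I wrote it above may simply be ignored as malformed:

```html
<!-- Standard robots meta tag syntax: directives go in the content attribute -->
<meta name="robots" content="noindex, nofollow">
```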
Technical SEO | | loopyal0 -
Page hidden in SERPs
As well as unique content, we carry press releases (contracts with PR companies). Some pages don't show up in SERPs - but sometimes a few or even hundreds of other sites do. At the end of the search, Google shows this: "In order to show you the most relevant results, we have omitted some entries very similar to the 109 already displayed. If you like, you can repeat the search with the omitted results included." When the search is repeated using that link, the page often shows up in the top 10 or top 20. I believe this is a site problem, not a page problem. We are often one of the first to publish the release.
Content Development | | loopyal0 -
Duplicate page titles: the same URL listed twice
The system says I have two duplicate page titles. The page titles are exactly the same because the two URLs are exactly the same. These same two identical URLs show up in the Duplicate Page Content report too - because they are the same. We also have a blog, and there are two tag pages showing identical content. I have blocked the blog in robots.txt now, because the blog is only for writers. I suppose I could have just blocked the tag pages.
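If I ever reopen the blog to crawlers and block just the tag pages, a minimal robots.txt sketch would look like this - assuming WordPress-style /blog/tag/ URLs, which is a guess at the structure, not a confirmed path:

```
User-agent: *
Disallow: /blog/tag/
```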
Moz Pro | | loopyal0