Yeah, your robots.txt seems fine, but the answer suggested the error code could be misleading, so you may be looking in the wrong area for the root of the problem. Wish I could be of more help.
Posts made by WilliamKammer
-
RE: How can I fix this home page crawl error ?
-
RE: How can I fix this home page crawl error ?
This was brought up a little while ago, hopefully Chiaryn's answer here can help: http://moz.com/community/q/without-robots-txt-no-crawling
-
RE: Pleasing the Google Gods & Not DeIndexing my site.
-
Consider a warning a pleasant courtesy from Google. They by no means do this on a consistent basis.
-
You can usually get back in, but the effort it takes to recover from being deindexed is demanding and probably not worth it for you naughty "dark dark grey" hat folk.
-
I agree with Irving: I've never encountered a friendly warning from Google before they destroyed all the things.
-
-
RE: Duplicate content. Wordpress and Website
Yes, however, implementing canonical tags in the posts on one of the properties will resolve this issue.
Here's a post to help you out with implementation: http://moz.com/learn/seo/canonicalization
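As a quick sketch of what implementation looks like, a canonical tag is a single line in the page's head pointing to the version you want indexed (the URL below is just a placeholder; use your post's preferred URL):

```html
<!-- In the <head> of the duplicate post, pointing to the version you want indexed -->
<link rel="canonical" href="http://www.example.com/blog/original-post/" />
```

Most WordPress SEO plugins will add this for you automatically, so check whether yours already does before hand-coding it.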
-
RE: Navigational Links in Dropdown Menus
Woo! A new tool for me to play with. Thanks for the tip, David.
-
RE: Pagerank and sitemap question :)
The number of sitemaps you have is not a big concern, as long as you have the ones you need and they include the pages you want indexed. You might be barking up the wrong tree here. There are a lot of reasons why a competitor might outrank you, but competitive analysis of a sitemap is not high on the list of things to check. Be concerned with your own sitemap and make sure all of the pages you want in there are in there. It's more than likely that he is beating you for another reason.
-
RE: How does dynamic call tracking affect local SEO?
Hi DJReason,
If you can manage to not have a pool of numbers for SEO traffic, my solution above will work. By using a script that only fires when certain parameters are present, you can use just one number for URLs without parameters, so search engine bots will only see the one number. Then you use that number during citation building, and you're golden... at least from that perspective.
-
RE: Site was moved, but still exists on the old server and is being outranked for it's own name
Then send a DMCA to the server company. The server company doesn't care that he's a lawyer, and I bet your client can send a very well-written one.
Besides that, your strategy moving forward seems pretty sound.
-
RE: Site was moved, but still exists on the old server and is being outranked for it's own name
First things first: before you start working on optimizing the new site, I would work on resolving the issue with the old partner, since this is an issue that can continue to haunt you for a long time if you don't.
The old partner needs to 301 redirect all pages in your client's subdirectory to the new site. Offer to send him the steps needed to do this. If he refuses, let him know that he does not own the data from your client's side of the old website, and you will therefore be forced to send DMCA takedown requests to the server company and to him for straight-up copyright infringement. Once he cooperates and puts in the proper 301s, you will have much less to worry about.
Unless, for some reason, the old partner does own the data, in which case, you shouldn't be using it on the new site.
-
RE: How do I test images in WP migration without Changing URLs?
If the images are coded into the website using "http://example.com/image.jpg" instead of "/image.jpg", then the images will always pull from example.com instead of your test site.
If they were coded with "/image.jpg" then the code would pull from example.com when the page was there, and example.yoursite.com when you moved it over for testing.
-
RE: How do I test images in WP migration without Changing URLs?
It sounds like the URLs being used in the code aren't dynamic. There are two different ways to embed an image (and its URL) on a page with HTML:
You can use a relative URL, where the domain will be whichever domain the page is on. Notice how there is no domain in front of the URL path; this means the domain will change depending on which website the code is on.
The other way, which sounds like what's being used on your site, is an absolute URL. This makes the URL definite, since all of it is defined. If you see that your images were put in with URLs like these, you'll need to make them relative to accomplish what you want to.
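To illustrate the difference (the domain and image path here are just placeholders):

```html
<!-- Absolute URL: always loads from example.com, even on the test site -->
<img src="http://example.com/images/photo.jpg" alt="Photo">

<!-- Relative URL: loads from whichever domain is serving the page -->
<img src="/images/photo.jpg" alt="Photo">
```

In WordPress, a search-and-replace on the image URLs in the database (or a plugin that does this) is usually the practical way to switch a whole site from the first form to the second.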
-
RE: Buying a domain and redirecting it to your website (improves seo?)
In most cases it will help your rankings, since that link juice does transfer in a 301 redirect. Now, if that competitor's website has some kind of link spam penalty or something similar, that will also transfer over and will hurt your site. As long as the 301'd site isn't penalized in Google, you should see a bump.
-
RE: Disavow Links & Paid Link Removal (discussion)
Yeah, disavowing should have the same effect as if the links were removed, so you're better off submitting the disavow.
-
RE: Disavow Links & Paid Link Removal (discussion)
Just disavow. Don't let people like this extort you. If you want to get him to try and remove the links for free, tell him you're not going to pay him, and instead you're going to submit a disavow, flagging his entire network to Google as unwanted links. You made a good faith effort by contacting the webmaster, but being extorted goes beyond good faith.
-
RE: Google Webmaster Tools Search Traffic->Manual Actions shows "No Manual WebSpam Actions Found" but is it right?
While the manual action may have been removed, it's possible you're still suffering from a different, non-manual penalty (one the algorithm dishes out without manual (human) involvement). So, it's good news that you no longer have the manual action, but it doesn't mean you have no penalty at all.
-
RE: "Via this intermediate Link" how do I stop the madness?
Awesome tip about transferring with the same content here. Completely noindexing the old site sounds like a good option. I'm going to have to keep this trick under my hat for a rainy day, thanks!
-
RE: "Via this intermediate Link" how do I stop the madness?
If you're trying to transfer the good SEO juice, but leave the manual action on the old site, that might not be possible. Google has been cracking down on that, and if you try to move over the good stuff, the bad will follow.
It looks like you have two options:
-
You can 301 the old site, disavow all the bad links, and hope for the best. This way you aren't completely starting from scratch, but how much of your rankings you'll recover won't be clear until a few weeks after you complete this strategy. Different sites have different results doing this. It depends on the penalty, how well you clean up the links, etc.
-
Start from scratch. New content, new domain, new everything, so there's no relation to the penalized site. This might be what you want to do if the manual action is severe. It'll take a bit more analysis than seeing one keyword drop to know the extent of the penalty you're experiencing.
-
-
RE: 301 for a Very Long URL
You can always go in and edit the .htaccess file yourself to put in the redirect. Here is some help with that, you'll likely find what you need under "301 Redirects in Apache": http://moz.com/learn/seo/redirection
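As a rough sketch of what that might look like on Apache (the paths and query string below are placeholders; mod_rewrite is the usual route when the long URL includes a query string, since a plain Redirect can't match one):

```apache
# Simple 301 for a long path without a query string
Redirect 301 /some/very/long/old-page/ http://www.example.com/new-page/

# If the long URL has a query string, match it with mod_rewrite instead
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=12345&session=abc$
RewriteRule ^old-page\.php$ http://www.example.com/new-page/? [R=301,L]
```

The trailing `?` on the RewriteRule target drops the old query string from the destination URL. Test on a staging copy first if you can, since a typo in .htaccess can take the site down.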
-
RE: "Via this intermediate Link" how do I stop the madness?
History has shown us that canonical tags are very powerful and do pass PageRank. Canonical tags and 301 redirects pass roughly the same authority, so even if your physical links are nofollow, those canonical tags are still being interpreted by bots as a link-like entity.
So, by canonicalizing your old domain to your new one, you effectively moved all those links to your new domain, just like a 301 would have.
-
RE: How does dynamic call tracking affect local SEO?
It can. Ideally, the same phone number will always be displayed, but in your case (and a few of my clients), dynamic call tracking really needs to be on the site.
There are solutions to this. I don't know which software you're using, and it could differ depending on the software. The solution we found that worked was to put the call tracking script in, and have it change the phone number on the website only when traffic is coming in with certain parameters.
This works well with PPC call tracking, because the traffic coming in has predictable parameters, and those can be used as a trigger to fire the script. Search engines aren't going to use these parameters, so they won't affect local SEO. Depending on how much direct and referral traffic you have, you might be able to use this solution with your PPC pool of numbers, and then focus on separating the referral and direct traffic from the SEO traffic to get the data you need. We usually only split the PPC traffic this way; trying to do it with organic could get you into trouble.
-
RE: How Additional Characters and Numbers in URL affect SEO
I've never experienced a noticeable, direct effect in this regard, and I have experience ranking pages well with crazy parameters. As long as things are canonicalized properly and the system isn't creating a bunch of duplicate pages with different parameters, you should be fine.
-
RE: How Additional Characters and Numbers in URL affect SEO
When in doubt, the answer is almost always the same as the answer to, "What is best for the user?"
Users can't make sense of all those parameters, and bots aren't likely to either. A site like Amazon.com or Canon.com can get away with it, because they have so many other factors going for them. Also, some systems create these parameters automatically, and can't easily be optimized.
So, to answer your question: It's best not to have those parameters. Users like it without them, and it makes it easier for people to link to you, since URLs are more memorable. On the other side of that, it's not the end of the world if you can't do this in an easy manner and your time might be better spent elsewhere.
-
RE: Google is showing product rating of 1-star in search results when average rating is 3.7 - 4 stars
It looks like there's a glitch in the markup on the REI website (or whichever third-party review software REI uses). Only one of the reviews on the website is being read and put into the snippet on Google. It looks like it might be the first review given: a 1-star review by "dan hiker" from 4 years ago. It's as if the markup isn't being applied to each review after that one.
This looks like an issue on REI's side, and not an error with Google.
-
RE: Should a website have the same standard headings for product pages? And how does that affect SEO?
You don't want to confuse your users, and switching up the structure on every page is a good way to do that. Part of SEO is making sure not to harm the user experience when making changes.
Changing up the header and structure on every page will affect SEO. How it will affect SEO depends on the specific changes you make. This is one of those situations where it's not worth sacrificing a user's expectations for a slight potential SEO benefit.
-
RE: Too Many Links on One Page - What to Do?!
Google updated their guidelines a while ago and no longer suggests 100 or fewer links per page. Now the guideline simply states, "Keep the links on a given page to a reasonable number," which is subjective. https://support.google.com/webmasters/answer/35769?hl=en
With a site like yours, full of different kinds of forms and such, it's logical to consider having 100+ links per page. There are other options for you as well, if you believe these links are hurting, but according to Google they likely are not.
If you wanted to try something different, you could think about building out detailed category pages for each section of things you offer on the site, and make those the pages that rank for your terms. This way, the number of links on your main page is dramatically reduced, and the user experience might improve, since things aren't quite as condensed.
-
RE: Importance of news headlines for their search rankings
I'm a fan of the third one, personally. It looks like you know the terms you want to use, so if they can fit in the title naturally, the only thing that matters is which one sounds better and will get the best click-through rate. In this case, "free outdoor Shakespeare" seems like it would get a bigger draw from people who saw the title, so putting that up front would be best for users.
You may not be shooting yourself in the foot with the first one from an SEO perspective, but from a CTR perspective you might be.
-
RE: Taking Back Google+ Page Managed By Advertiser
This is the process I use when I need to get control of the map listing:
-
Go to the listing page and click on "manage this page." If the button isn't there, click on "edit details" below the details section of the business.
-
This should bring you to a "Request for manager access" page, where you can fill out the information telling Google you are the owner, not the advertisers.
-
Now, if you don't want to wait a million years for anything to happen with the page, you can click on "contact us" at the top right corner of the page and have Google call you.
-
Tell a human at Google the problem and it's possible they will expedite the process.
-
Follow up with the human every few days. The Google human should give you a way to contact them directly through email.
-
-
RE: Order and multiple match when 301 redirect ?
Apache reads the .htaccess from top to bottom, so it's good to keep that in mind.
If I had to guess, I'd say you have more 301s than just those in your .htaccess. A rule above that is likely conflicting with that specific redirect.
If you have a rule redirecting everything from that subdirectory without the French URL parameter, and it sits before the rule in question, then the rule in question will never catch anything.
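As a purely hypothetical example of the ordering problem (these paths are placeholders, not your actual rules): if the .htaccess contained something like the following, the more specific rule at the bottom would never fire, because the broad rule above it catches the request first and stops processing.

```apache
RewriteEngine On

# This broad rule matches everything under /shop/ first...
RewriteRule ^shop/(.*)$ http://www.example.com/store/$1 [R=301,L]

# ...so this more specific rule for the French URLs never runs.
# It would need to be moved ABOVE the broad rule to take effect.
RewriteRule ^shop/fr/(.*)$ http://www.example.com/boutique/$1 [R=301,L]
```

The fix is usually just to put the most specific rules first, since the `[L]` flag stops rule processing on the first match.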
-
RE: Question about robots.txt
Consider deleting all of this:
Disallow: /&limit
Disallow: /?limit
Disallow: /&sort
Disallow: /?sort
Disallow: /?route=checkout/
Disallow: /?route=account/
Disallow: /?route=product/search
Disallow: /?route=affiliate/
Disallow: /?marca
Disallow: /&manufacturer
Disallow: /?manufacturer
Disallow: /?filter
Disallow: /&filter
Disallow: /?order
Disallow: /&order
Disallow: /?price
Disallow: /&price
Disallow: /?filter_tag
Disallow: /&filter_tag
Disallow: /?mode
Disallow: /&mode
Disallow: /?cat
Disallow: /&cat
Disallow: /?product_id
Disallow: /&product_id
Disallow: /?route=affiliate/
Disallow: /*?keyword
Those rules are telling Google not to crawl domain.com/EVERYTHING (then the URL parameter). This could be where the issue stems from. If you're worried about URLs with these parameters ranking, consider implementing canonical tags instead to point to the proper pages.
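To see why these behave differently (the example URLs below are placeholders): robots.txt patterns match from the start of the URL path, so the non-wildcard rules only block URLs where the query string starts right at the root, while the wildcard rule blocks the parameter everywhere.

```text
# "Disallow: /?limit" only blocks URLs that literally start with /?limit,
# e.g. domain.com/?limit=25 -- NOT domain.com/category?limit=25

# "Disallow: /*?keyword" uses a * wildcard, so it blocks ANY path
# followed by ?keyword, e.g. domain.com/shoes/red?keyword=sale
User-agent: *
Disallow: /*?keyword
```

If you do keep any of these rules, it's worth running sample URLs through the robots.txt tester in Webmaster Tools to confirm they block only what you intend.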
-
RE: Seo Technical Audit - Trustworthy, competent and competitive firms
There is a saying that I think holds particularly true in our industry: "If you think hiring a professional is expensive, wait until you hire an amateur."
This is one of those "you get what you pay for" moments. Spend what you can afford, but keep in mind that there's a reason why certain SEOs charge more than others. It's because they can, due to proven results and demand for their services.
All you can do is know your budget, shop around, ask questions to be perfectly clear on the services they are providing, and then make a decision on who you want to go with. Sorry I can't provide you with specific companies, but each situation is different and requires a different set of skills. Make sure to inquire when shopping around as to whether the companies have specific experience with your kind of site and what you need looked into.
-
RE: Links From Public Info
Not unless they are copying material that could warrant a DMCA takedown notice. Otherwise, just disavow and note in the report that the webmaster was mean.
-
RE: SEO companies, Backlinks and Trade Secrets
Hi Nick,
A couple things here, but first and foremost: a good, honest, white hat, non-sleazy SEO will tell you every single thing they are doing. About 15% to 20% of my job is explaining my actions to my clients, which can be frustrating sometimes, but is also necessary. My mindset is that it's important for my clients to not only understand what I am doing with their website, but also for them to get behind it 100%.
Let's be clear about one thing: there is no such thing as a "trade secret" in SEO. There is nothing that, once you know it, you can just exploit it and charge business owners to do that one thing. Every account is unique and requires a different approach. If a company won't tell you what they're going to do, don't use them. The same way I wouldn't buy a car unless the dealer told me which one I was getting.
Also, giving out free stays for blog posts/links is walking a fine line on Google's guidelines and some SEOs will say it's straight up buying links. Be aware that this is a somewhat risky strategy.
It sounds like you have a good amount of content and social presence, so the first thing you'd want to do is figure out how to best utilize those two things together in a content distribution strategy. If you can get your good content to the right audience, links will "naturally" come in without needing to target them with free stays.
Make sure not to ignore potential on-page issues or possible penalties incurred as well. Moz is a great tool to help with the on-page stuff, and this might give you an indication of a deeper penalty issue: http://feinternational.com/website-penalty-indicator/
-
RE: Problem Shooting with Google authorship
You need to have the name of the person who wrote the content on the web page. Google is trying to find the name of the author but can't; it's only seeing the company name. Putting the name of the author on the page and making it a link to his/her G+ profile can help define the author. There is also the rel=author tag, and a URL parameter you can use, but usually one will do.
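A minimal sketch of the two common ways to mark that up (the profile ID and author name here are placeholders):

```html
<!-- Visible byline linking to the author's Google+ profile -->
<p>By <a href="https://plus.google.com/112345678901234567890" rel="author">Jane Author</a></p>

<!-- Or, a link element in the page's <head> -->
<link rel="author" href="https://plus.google.com/112345678901234567890" />
```

Whichever you use, the author also needs to list the site under "Contributor to" on their G+ profile for Google to connect the two ends.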
-
RE: Website starts ranking on Google then always drops - Targeted for Australia but most traffic from U.S - Bounce Rate at 94.49% - HELP!
Same here. For me, any clients that are running Bing Ads get a few thousand bot hits like this. The hits aren't categorized as PPC, but this is the most prominent correlation I could find. Same thing with 100% bounce and all from one city.
-
RE: Where to add new content
Google will index the two sitemaps fine, just remember to submit them both if the root sitemap doesn't cover WordPress.
A blog could be a great addition, and you could share posts in the forums as a way to distribute the content.
Your plan is sound, as long as you install everything correctly and people like your blog posts.
-
RE: Identifying Bad Domains
There are tools out there you can pay for to do this work for you; things like linkdetox.com can be helpful to identify the bad links, and something like Rmoov is great to then begin the removal process.
If doing it on your own, you'll need to check out each questionable domain manually. You can rule out big domains that you know are not spammy, and focus on the ones you have no idea about. When looking at a domain, ask yourself the following:
-
Is this site indexed in Google?
-
Could I easily create a followed link from this domain to mine? OR pay for one?
-
Does this site seem to be scraping my information from other sources?
-
Is this site in English (or whichever language your site is in)?
-
Is it a crappy directory site? Article submission site? Other easily exploitable kind of site?
These are the sorts of things to look out for. Google doesn't want to count links for you if you made them for SEO purposes, or paid someone else to. If you look at a site and it appears that their business model is building links, or they are spammy in any way, you know it's a red flag.
Also, I'm learning more and more that when you're on the fence about a domain, disavow it.
-
-
RE: Linking Root Domains after Site-Wide Redirect
I believe Site B will gain one linking domain that is linked through a 301. The Site A 301 has more power due to the 5 links linking to it.
This question also depends a lot on why you're asking. If you are asking because Site A has a penalty and you want to keep that link juice by redirecting, then the penalty will travel with the 301, in my experience. So, from a penalty perspective, it is like those 5 links are linking to Site B.
-
RE: SEO Consulting for HUGE Website. How Big Is TOO Big Of A Change?
If the traffic is relatively stable on the site, then testing on newly published pages and monitoring their trajectory compared to previous articles might work.
Also, I think 2 weeks is a bit short. I'd shoot more for 3 weeks to a month if you want to see where they settle. If you're using new pages, just use the same time frame for those as when you compare them to previous article data.
-
RE: SEO Consulting for HUGE Website. How Big Is TOO Big Of A Change?
Sounds like the site is big enough that you have the luxury of taking a nice little chunk of pages, doing your tests, seeing what happens, and then deciding whether to make the change site-wide. Take a good sample of pages across your site to test on, make sure you know their baseline ranks and traffic, make the changes, monitor, test some more, etc. This way, there's no guesswork.
-
RE: Does Google read bullet point lists are text? WordPress SEO by Yoast says different...
They do. The Yoast plugin has a lot of great features, but sometimes struggles with finding keywords in content if it isn't regular text. No need to worry on this front.
-
RE: 2 blogs and authorship
This shouldn't be an issue. One of the benefits of authorship is it can be used across multiple sites to attribute posts. If you are attributing different posts on different sites, that's fine, and sharing those posts on your G+ is even better. It sounds like you're using authorship as it was intended.
-
RE: Claiming Google+ URLs?
I do sometimes see hijacked G+ pages, or other shady G+ activities, but in my experience flagging it or calling Google resolves the issue quickly. You're always going to have people trying to game the system, just make sure to know how to defend yourself against it.
As for +TacoBellCom, first you would have to have the business named "Taco Bell" and get that verified (good luck there). Then, you'd have to hope that Google lets you add characters to the end of the URL instead of deciding it for you. Then, you would have to hope you were never reported, flagged, or otherwise looked at by a human on Google's end. So, odds are it wouldn't work for too long.
That's not even getting into the legality of trademark infringement.
-
RE: Customer Testimonial Question
That would be helpful. Something along the lines of, "Any logos used on our client list are logos of companies we've directly worked for. If you represent one of these companies and would like your logo removed, please contact us at: blah"
It shows you're making a little more of an effort, but it still isn't ideal. Permission to use a logo is often just an email away, once you find the right person to email or call.
-
RE: Should I Remove Thousands of Bad Links over a Short Time or Long Time?
If you know for sure they are all bad links, then do it all at once and get it done with. Maybe you'll see an increase in traffic.
Once you disavow something, you're not likely to get it back, so keep that in mind. From the sounds of it, these links are obviously bad, so you shouldn't have to worry.
-
RE: Customer Testimonial Question
If it's a major corporation, there should be a PR department or "Press" section of their site that should give you a good place to start your search. Most brands are fine with you using their logo like that, but double-checking is important. Some brands coughRealtorcough are insane with how you use their brand, so checking is recommended.
At the same time, if you can't get ahold of them and you still use their logo and they don't like it, you'll just receive a DMCA-esque letter asking you to remove it. I don't recommend doing it this way, but in the past, the worst I've seen is a take-down notice.
-
RE: Claiming Google+ URLs?
When I have multiple listings under the same brand like that, I think it's good to add something descriptive at the end.
For example, if you have a brand with 10 locations in 10 cities, then you can make them +BrandLosAngeles, +BrandNewYork, etc.
The URLs are there to make for a better user experience, so adding a number to the end of the brand doesn't really help much. Instead, think of a word or two that is memorable, and that makes sense to those who might see it.
-
RE: Merging domains into sudomains
If consolidating the domains into subdomains is inevitable, then 301s are going to be the way to go. Just be careful to redirect everything to where it should go in the subdomain, and do it in one chunk if you can. You don't want to 301 some stuff, then re-301 or do a second round later.
Be as clear as you can to Google: once you are ready to move the site, do it all at once, so Google can clearly see that everything from the old domain is now at the new domain. So yeah, wholesale switch; just make sure the client understands what he's getting himself into.
There is no exact number with regards to traffic loss, chance of recovery, risk of things not going smoothly, etc. Make sure they are aware of all the risks and how to best lower the possibility of something bad happening.
-
RE: URL SEO: Better directory structure vs. exact keyword phrase
The difference in the ranking strength of those two URLs is negligible. Your decision should probably be made by deciding whether or not you're going to build out those directories and have the content to do so.
Having /properties/new-york/rental suggests you have multiple properties in New York, some of which are rentals. If a visitor went to /properties/new-york, would they see a page with multiple New York properties, some for sale and some for rent? If the visitor just went to /properties, would there be multiple regions for people to browse through?
If you have the content and ability to do that, then build it out with the subdirectories. If you don't, then focus on that single page with the keyword as the slug.
-
RE: Duplicate and thin content - advanced..
If you are worried about duplicate content in your search pages, that should be pretty easily solved with canonical tags. These will tell search engines which pages should be indexed, even if a page's content is seen somewhere else on the site. Here's a link to more information on that: http://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Even though Cutts said it shouldn't be an issue, he speaks in general terms (to put it lightly). Maybe Google tries to pick up the canonical version, but there's no harm in helping point Google in the right direction, just in case it doesn't crawl your site properly.
There are a few automated tools out there that crawl tons of pages and flag potential issues on them. Screaming Frog may be of use. There are also higher-level enterprise solutions to the problem, like Conductor Searchlight.
-
RE: Merging domains into sudomains
This is really a case-by-case basis, and a lot of different factors should be taken into consideration:
- I think the first thing to consider is: why? Why do you want to consolidate the sites? If they are all somewhat different, all get traffic, and are doing their jobs, why change it? Is it broken? If not, why fix it?
- Does the brand of the strongest domain make sense as an umbrella to the other websites? Don't try to force domains together that don't make sense for the visitor.
- If it is strong enough, are the sites similar enough that you could fold them into a single domain and brand (subfolders instead of subdomains)? I would recommend subfolders over subdomains if it still makes sense for the website.
Trying to consolidate multiple established domains into one domain, or a domain and its subdomains, is possible but risky. Ask yourself if it's worth the risk to do all this, or if this is just one person's whim to organize domains without knowing what could happen.
In my experience, 301ing one domain to another (if done properly) should take about 2-4 weeks to recover the majority of organic traffic. Rarely have I seen a domain recover 100% of its organic traffic after a 301, but you will recover the majority of your ranks. So, if you are 301ing multiple domains, that means you are losing a sliver of traffic on each of them, and that doesn't count the downtime during the 301 transition.
And of course, the above is assuming all 301s are done properly, which could be a pretty big undertaking from the sound of it.