Hi Ketan,
Never too late...we're still here!
Hi d25kart,
You will find the reconsideration request form in Google Webmaster Tools - you will need to be logged in to the account for the site in question. Make sure you read all of the instructions carefully before submitting.
You can also use the "Fetch as Googlebot" feature in Webmaster Tools to check whether there is any problem with crawling specific URLs. The feature is under Health in your Webmaster Tools menu. If a URL is successfully fetched, you can then click "Submit to index" to add it (and linked pages) more quickly.
There are limitations on the number of submissions you can make in this way per month, but it is definitely worth utilizing this method to check for issues that might be preventing indexing and to add pages to the index quickly.
Hope that helps,
Sha
Hi Kevin,
Just a few things to think about...
The best thing you can do to improve the situation is to move your client's focus from "Rankings" to "Results"... Is the new site getting traffic? Clearly the UI you have built should be much better for conversion. Does the new text stack up on the conversion front? Whatever the rankings, if the client starts seeing trackable conversions from the new site, the tension will ease quickly.
Have you taken advantage of the "Fetch as Googlebot" feature in Google Webmaster Tools and the Bing version in Bing's Webmaster Central to get the pages of the new site crawled and indexed quickly?
A 301 is called a Permanent redirect. It is not intended to be reversed at will; rather, it informs search engines that the old URL should be de-indexed and replaced with the new one.
I would be looking fairly closely at the backlink profile of the old site for two reasons:
1) To see whether there are local citation links in the mix, so you can make a more informed decision about the need to use a 301
2) To see if the old site is in line for a Penguin slap in the near future
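If you do decide the 301 is the way to go and the old site runs on Apache, a minimal .htaccess sketch (with hypothetical domain names, so adjust to suit) would look like this:
RewriteEngine On
# Match requests for the old domain, with or without the www
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
# Send everything to the same path on the new domain as a Permanent redirect
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]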
Hope that helps,
Sha
Hi Robert,
There is a "Getting Started" video intro to the software which will give you a place to start on this page where you will also find information about the weekly Welcome Webinar with Moz staff.
I would recommend that you also sign up for the next Welcome Webinar so you can ask any questions you may have about the different tools in the SEOmoz toolkit and how you can use them.
There is a Welcome Webinar every Friday morning and you will find a button on that page to Reserve your Spot at the next one.
When using the Pro App, if you need a little clarification of how a particular tool works (for example if you are in the On-page Tool area) you will find a tiny blue "Help" link at the top right of the page that will take you to the help page for that tool.
If there is anything that hasn't been explained by the video or help pages, you can email direct to the Help team - help at seomoz.org. They are actually pretty awesome and always happy to help you out.
There are also a lot of great SEOmoz resources that will help you get your head around it all... The Beginner's Guide to SEO is one of the best. It is included in the list of resources in this thread from a while ago.
Hope that helps,
Sha
Hi again Matt,
Actually, you don't need to be using the 301 at all. A simple rewrite will do what you want.
If you do it like this:
RewriteRule ^vacancy/([0-9]+)$ vacancy.php?id=$1 [L]
your URLs would be a lot more friendly.
The resultant URLs would look like this:
/vacancy/12345
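For completeness, the rule needs the rewrite engine switched on, so a minimal working .htaccess sketch (assuming your host allows .htaccess overrides) would be:
# Enable mod_rewrite processing for this directory
RewriteEngine On
# Map the friendly URL to the real script; [L] stops further rule processing
RewriteRule ^vacancy/([0-9]+)$ vacancy.php?id=$1 [L]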
Hope that helps,
Sha
Hi Matt,
The 500 Error is caused by the "space" in [R=301, L]
Should be [R=301,L]
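To illustrate with a hypothetical rule (the flags argument must contain no spaces at all):
# Broken - the space after the comma makes Apache read "L]" as an extra argument and the config fails:
# RewriteRule ^old-page$ /new-page [R=301, L]
# Fixed:
RewriteRule ^old-page$ /new-page [R=301,L]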
Hope that helps,
Sha
Hi Ennovation,
This is a known issue that does happen occasionally with campaigns. I have had the same problem with a client site that turned out to be an "edge case".
Whenever you have something happening in your Pro campaign that does not look right, you should contact the Help Team direct. You can use the Support tab on the left of the screen while in your campaign, or send an email to help at seomoz.org.
Make sure that you let the Help Team know the campaign that is affected (campaign number in the URL, or domain name) and the problem you are having. They will be able to take a look at it for you and let you know when the issue is resolved.
In my case, there was a little engineering magic and the next crawl yielded the correct number of pages.
Hope that helps,
Sha
Hi GYMSN,
The most obvious explanation would be the content itself.
If you are following the general idea that pages are for "evergreen" content and posts are for "news", then it follows that the content in your posts tends to be much more topical than that in your pages.
If the content happens to also be timely (appears before others), fresh (up-to-date) and well written/good quality, then it should rank well and may even have a little potential for virality if very topical.
This would easily explain a significant difference in rankings.
Hope that helps,
Sha
Hi Richard,
As an SEO who works with small to medium-sized businesses, I have to say that the more genuine understanding a client has of what we are doing and why, the less time I have to waste explaining things to them.
My most cherished client is one who has taken the time to develop a real understanding of the process and the pitfalls. Investing in making yourself more helpful to those who are helping you is always a good idea...and since it is likely they may also be represented at Mozcon, it could be an awesome opportunity to spend some time and gain a deeper understanding of the work they are doing and the challenges they face in doing it.
My special client is the ONLY one who doesn't say things like "do we really need to bother with that" on a regular basis. I love him for it and I know it makes me more valuable to him in the long run.
Hope we'll see you at Mozcon!
Sha
Hey Alan!
...and we miss your awesome answers, but it puts a smile on my face when you are able to drop in.
Sha
Hi Pol360,
Excellent answer from Anthony and big thumbs up!
If you are looking for a tool that will let you know when you have new links, you might want to take a look at Linkstant, a cool tool put together by Tom Critchlow & Rob Ousbey from SEO consulting company Distilled.
You won't get all of the Moz metrics that become available in Open Site Explorer once the Index is updated, but in the meantime you will at least know instantly when new links are created.
Hope that helps,
Sha
Hi Naomi,
Since the problem is happening when you are using the On-page Tool, I would suggest you send an email direct to the SEOmoz Help Team providing the URL, keyword phrase and campaign number (if there is one). Any time that you are seeing results from the tools which seem strange, it is a good idea to ask the Help Team.
The address to send your request to is help [at] seomoz.org
Hope that helps,
Sha
Ah, but that is the beauty of getting one question per month!
When you are just starting out there can be a whole lot of very simple technical issues that could be hurting or slowing down your progress without you even knowing.
I actually used my first private question to ask about a nagging feeling I had and something weird I was seeing in Open Site Explorer results for a new client's e-commerce site.
An Associate answered my question and pointed out a fundamental flaw in the Template the client had used which was stopping it from being crawled beyond the first level!
That question paid for itself many times over.
Just pick one of those things that has been nagging at you and ask...you could be totally surprised.
Sha
Hey Rand,
Bad news for someone...I'm cooking up a doozy at the moment!
Hi again Alan,
I would pretty much have to agree with all that's above. It really must come down to ROI. For someone who is just running a single site, the decision you're faced with is how much up-front investment you are prepared to make and whether you are prepared to sustain that investment in the long term.
While all of that is obvious, I thought I would share the one thing included in the SEOmoz Pro subscription that I think is potentially the most valuable and perhaps the most under-utilized by Pro members...
1 Free Private Question per Month!!
Yes, I made it bold and underlined it because this is the one feature that could provide you with the professional input that changes your business completely. The pity of it is that many members completely forget about their private question, maybe because they don't realize it means you can ask a question and have it answered by a member of the SEOmoz staff or an Associate.
Yes, that means you could actually have the help of someone as awesome as Dr Pete, John Doherty, Jen Sable-Lopez, Keri Morgret, Casey Henry, Michael King, Gianluca Fiorelli, Dan Shure, Cyrus Shepard and many more...
Even Rand Fishkin?
Yes. Even Rand answers private questions. The fact is, your question will be assigned to a person on the SEOmoz team who has the skills and experience to best help you.
Obviously, your private question needs to fit within the subject categories available here in Q&A and you should be careful not to waste it on something you could as easily get answered here in regular Q&A. With that in mind, I would consider the potential value of such a benefit to your business and think carefully before giving it up.
Hope that helps,
Sha
Glad to help anytime Simon.
I know it's probably hard to believe right now, but I'm sure it won't be long before you're explaining stuff to others here in Q&A!
Once all the pieces start to fall into place, everything becomes much easier to understand, and if you spend some time around the SEOmoz community you'll be flying in no time!
Here's a post with links to all of the most important resources here at SEOmoz, which might help you find your way around.
Look forward to catching up with you around the community.
Sha
Hi again waspman,
No worries...it takes a little getting your head around when you start out.
As Chris confirmed, the root domain is waspkilluk.co.uk.
The most common subdomain is the www, which is standard, but you can also create other subdomains if you wish.
Perhaps the most visible real world example of a site that uses subdomains is Google - which uses subdomains to manage each of its specific products within the site. If you take a look at the URLs when you are using Google products you will see examples like places.google.com, maps.google.com, blogger.google.com, webmasters.google.com, www.google.com etc.
A subdomain is referred to as a "third level" domain, created within another domain. As a rule, the levels of a domain are separated by the dots (.) in a URL and are counted from right to left.
So, in the Google Maps example:
the Top Level Domain (TLD) is .com
the second level domain is the chosen domain name google
the third level domain (or subdomain) is maps
In your case it is a little more complicated. You have an extra level in your domains because your Top Level Domain is the country designator .uk so your second level is actually .co, but for the purposes of understanding subdomains it won't hurt for you to think of .co.uk as your top level domain.
Generally, most small sites don't require extra subdomains, so often the only subdomain in play is the www.
Matt Cutts has a simple tutorial on the parts of a URL that makes it much easier to understand what is going on at a glance.
Hope that helps,
Sha
Hi waspman,
No, you are absolutely not doing anything wrong.
The simple answer is that you are acquiring links primarily to the sub-domains and not the root domains.
To explain:
"mozRank is SEOmoz's global link popularity score. It compares the relative link value (ranking power) between URLs on the Internet. It is similar in premise to Google's original PageRank metric but is updated more frequently and offers greater precision."
You can read more about all of the SEOmoz Metrics at the Open Site Explorer explanatory page.
So, if you go to Open Site Explorer and run a query for your sub-domain you will see the number of external links to that URL. If you then remove the www and run the query for the root domain you will see that there are 0 links to that URL.
Since mozRank measures link popularity, it should be higher for the sub-domain because that is where the links are pointing, and of course you want to ensure that your links are always consolidated to just one domain. So, things are exactly as they should be.
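Incidentally, if you ever found links splitting between the two versions, and assuming the site runs on Apache with .htaccess enabled, a minimal sketch for consolidating everything to the www sub-domain would be:
RewriteEngine On
# Catch requests for the bare root domain...
RewriteCond %{HTTP_HOST} ^waspkilluk\.co\.uk$ [NC]
# ...and 301 them to the www version so that link value stays consolidated
RewriteRule ^(.*)$ http://www.waspkilluk.co.uk/$1 [R=301,L]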
Hope that helps,
Sha
Hi JU1985,
Good call from Donnie!
I would be keeping both pages live and adding a unique explanation to each page that lets visitors know that the product they searched for has been superseded by Widget C.
When deciding on the right solution for any issue like this, the first thing to consider is the effect your solution will have on the user. Ask yourself "If I search for Widget A and land on a page that offers Widget C, what will I think?".
The answer for me is that I will most likely assume the result is incorrect and return to the search engine looking for a better result. That is not the best user experience possible and therefore unlikely to provide the best conversion rate possible.
So for me, any solution that simply delivers the client to a different product without an explanation (301 or rel=canonical) is least preferred.
The key to good business is good customer service - essentially being as helpful to your potential customer as possible. If a customer arrived at your offline store and said "I'm looking for Widget A", would you push them quickly across the store and say "here's Widget C"? Or would you explain that "Widget A has now been superseded by Widget C" and provide Widget C for them to look at?
The more you can emulate the offline store experience in your online store, the better the chance that the customer will feel comfortable buying from you.
Incidentally, I would make sure that the Widget C description added to the pages includes a Buy button and sufficient information that the customer can proceed to purchase without having to go to the Widget C page.
Hope that helps,
Sha
Happy to help anytime.
Look forward to catching up with you around SEOmoz as you settle into the community.
Sha
Hi Andarilho,
The 250-page crawl is just a quick version of the crawl to allow you to get started on your campaigns, because a full site crawl generally takes up to 7 days to complete. So, you can start working on the first 250 pages in the meantime.
Hope that helps,
Sha
No worries Alan.
I figured the one in your Webmaster Tools ought to be the right one.
The problem with that one is that it returns blank pages for every URL indexed in Google (if you use site:www.whitby.uk.net you can see a long list of indexed pages, but not a single one is working). Obviously there is a problem with the site which you need to fix as quickly as possible. The database may be the best place to start looking for issues.
Hope that helps,
Sha
Hi Talha,
Perfect answer from MyHolidayMarketing and a big thumbs up!
The benefit of this method of presentation is that it also shows your developer the practical implementation of a menu structure that will expose search engine crawlers to all page levels on the site.
Basically, the sub-levels of your dot pointed list represent the sub-levels of your menu. That way, when any page is crawled, the basic menu structure will guide the crawler through all levels of the site.
Keeping the site structure within 3 levels is also a good idea.
Hope that helps,
Sha
Hi Again Alan,
You have now given us 3 different domains:
www.whitby-uk.net
www.whitby-.uk.net/
www.whitby.uk.net/
A little hard to give you a definitive answer if we don't know which domain you are referring to.
Hope that helps,
Sha
Hi Alan,
The URL from your Google Webmaster Tools is a different domain.
The domain that is for sale has a hyphen in it that is not included in the link from your WMT message... or do you own them both?
The WMT message is a courtesy message that is being sent to all WordPress users with out-of-date software versions. The risk of your site being hacked is significantly increased by running an out-of-date CMS version. This is something that Google is concerned about because hacked sites may put users at risk of exposure to malicious code.
Hope that helps,
Sha
Hi David,
Having the 301s in place is a good thing rather than a problem. They have been created by WordPress so that you do not have broken links on your site (because you created links and then invalidated them by changing the permalink).
There are 3 major reasons for using 301 (Permanent) Redirects:
1) To pass the link value (ranking power) that the old URL has earned on to the new URL
2) To make sure that visitors who follow old links or bookmarks still land on a working page
3) To tell search engines that the old URL should be de-indexed and replaced with the new one
The 301s have no influence on users as they cannot see them - they are written to the .htaccess file. The idea of having them there is to catch any incoming traffic that comes via existing links to those pages that no longer exist.
There is a good explanation of 301 Best Practice in the Learn SEO section here at SEOmoz.
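If you ever need to add one of these redirects by hand on an Apache server, it is a one-line entry in the .htaccess file (hypothetical paths, so adjust to suit):
# Send visitors and link value from the retired permalink to the new URL
Redirect 301 /old-permalink/ http://www.example.com/new-permalink/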
When working through the things identified in the crawl test, the Errors (red) and Warnings (yellow) are the things to pay attention to first. If you check the single line explanation above the blue tabs you will see this message "Notices are interesting facts about your pages we found while crawling." So they are not really problems that need fixing.
A couple of other great resources if you are just starting out are The Beginner's Guide to SEO and the rest of the Learn SEO section.
Hope that helps,
Sha
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
Hi sansonj
If you are asking about how long it will take for search engines to recognize the changes you have made, then you can speed up the process for Google by using the "Fetch as Googlebot" feature in your Google Webmaster Tools account. This allows you to fetch individual URLs (and pages linked from them if you choose). Once a page has been successfully fetched you have the option of adding it to the index.
There are limits on the number of pages per month that you can fetch, but this is a good option for new or changed pages on your site.
Bing Webmaster Tools has a feature which allows you to add pages to the Bing index in the same way.
Hope that helps,
Sha
Hi Jamie,
Search engines are not the only place you may find links to your pages. Actually, if you have done a good job of promoting your pages across a variety of channels, there should be links out there in a range of different places.
You will find a more detailed answer I wrote on this question about removing 301 redirects.
Hope that helps,
Sha
Hi Josh,
If you have a list of actual URLs which always return a 500 Error, the problem is a bug in the code.
You should send the list of URLs to your developer and ask him/her to fix them. Often, once a bug is identified and fixed, it will correct all or most of the errors.
If you check the URLs and find that the 500 Errors are intermittent (sometimes OK, sometimes returning an error), you should take a look at this question I answered about random 500 Errors.
Hope that helps,
Sha
Hi ahw,
A 500 Server Error is generally caused by one of two things:
1) A bug in the code that is triggered when a page is generated
2) A Server hardware fault (or misconfiguration)
From your description, it sounds like the 500 errors may be appearing intermittently - i.e. a page is OK, then errors, then is OK again... This would tend to suggest that the problem is a hardware fault, but it is still possible that it may be caused by code which is not executed every time a page is loaded.
The most reliable way to determine which problem you have is to check whether you are experiencing the 500 errors with a static file - i.e. an image, video, PDF, XML or CSS file. If you are seeing the error with a static file, then the problem is definitely a Server hardware fault (or Server misconfiguration, but that should return a permanent 500 Error).
If "the sitemap also goes down on a regular basis" means that your sitemap is also intermittently returning a 500 Error, and it is a static XML file, that may be your clue.
If yours is a LAMP-based system (Linux/Apache) and you have root access to your Server, my boss, who is a Systems Administrator, would be able to take a look at it for you, but his gut feeling is that you are looking at a hardware problem. This means your best course of action would be to contact your Hosting provider as soon as possible and ask them to take a look at it.
Hope that helps,
Sha
Hi again Daniel,
Notices are nothing to worry about - if you take a look at the message from Roger and the Mozzers above the blue tabs in the Notices section, you will see a little explanation:
"Notices are interesting facts about your pages we found while crawling."
They are just there to make you aware of things in the background that might not be obvious.
You will find a short explanation on each of the other sections too.
Incidentally, lots of 302s is not such a good thing - which is why they are in the "Warnings" section instead of the "Notices".
Another little thing to be aware of is the tiny blue "Help" link toward the top right of each page in the Pro Tool. Whichever page you are on, this link will take you to the help documentation for that particular feature. Some good stuff in there and a great way to get to know the tools when you are starting out.
If you have questions about your campaigns, you can also email the Help Team direct - help at seomoz.org (it's worth it just to see the awesome thank-you video when you answer the feedback survey!), or register for the weekly Welcome Webinar with Moz staff.
Hope that helps,
Sha
Hi Andarilho,
Welcome to Q&A!
Good advice from both Andrews above. There are so many ways to learn how to improve your site here at SEOmoz.
A while ago I created a post with a list of resources for a bunch of new people who were starting out here, so I thought it might be helpful to share it with you. You can read it here.
Look forward to hearing more from you around the community.
Hope it helps,
Sha
Hi Daniel,
I have to say I really don't agree with the idea of deleting 301s after a period of time. The only place where there is a time frame involved in the 301 scenario is at the search engines.
Links can come from any number of other sources, including links from other sites, browser bookmarks, links passed in emails, PDF documents, links embedded in YouTube videos...to name just a few.
If the 301s were originally placed to fix the problem of broken links, why would I want to "unfix" them?
Even for large sites with a lot of 301s there are ways to manage processing load, and if you have found a solution that is acceptable for a period of time, the same solution should be acceptable over the longer term.
Hope that helps,
Sha
Hi Marie,
Good advice from James here.
One other question I would ask - have you checked the site in all browsers? There are viruses written for specific browsers such as IE. I have seen a situation where a site was infected, but nobody in the company knew because they were all using Chrome or Firefox. As soon as a customer hit the site with IE all the flashing lights and sirens went off!
Also, have you asked those reporting a problem for details of the virus scanning software they are using? If you can nail down the virus scanner involved and run an online scan from that company's website you should be able to replicate the report and see what exactly is happening.
Hope that helps,
Sha
Hey Robert!
Yes, everything is good here, thanks. Just been hectic around here lately and yes, I've been feeling bad about missing my time in Q&A.
Hi Vinod,
I believe SEOmoz uses the Google API to power tools that utilize Google data.
If you would like more specific information about how the SEOmoz Ranking tool works, you can email the Help Team direct (help at seomoz.org) and I'm sure they will be able to help you out.
Hope that helps,
Sha
Hi dseo241,
Given that your question specifically relates to Roger's ability to access your sites (both dev & production), I think you would be best to email the SEOmoz Help Team direct - help at seomoz.org.
I use the standalone Crawl Test tool to run a base crawl of our dev sites, which are noindexed but not behind an .htpasswd login, so you would still need to ask the Help Team about accessibility. Incidentally, the SEOmoz User-agent is rogerbot.
Hope that helps,
Sha
Hi Guido,
It seems none of the domains you mentioned are now working. You should return everything to the way it was before so that your client's sites are operable and then we can start over to resolve your problem.
Incidentally, what just happened sounds slightly familiar to me. Are you using an Apache server?
Sha
Hi Ryan,
A lot of people have gotten much more worried about dates since they heard about the "Freshness Update" late last year. Unfortunately a lot of people assume that it is a factor for all keyword terms & niches, but that is not the case. It is quite easy to find out whether it is a factor for your keywords. I gave a detailed explanation of this in this Q&A thread in November.
As is mentioned in the Quora thread you quoted, there are much more reliable ways for search engines to determine freshness (timestamps & previous crawl data).
I would agree with Brent and EGOL that the significance of the year to your user base makes it reasonable (more likely, expected) to include the year. However, I would take it a step further and suggest that you consider leveraging the intelligence of the bots a little.
We know that bots are now smart enough to help assess relevance. In fact, it has become the centerpiece of their day-to-day work. For me, that should mean that using words like "wine" or "vintage" would signal to the search engine that this URL and its content might reasonably include date references in the form of 4-digit and/or 2-digit year information.
That decided, I would build my site infrastructure accordingly, placing individual pages within directories using a reasonable and natural naming structure that includes the appropriate words. Depending how you prefer to approach it, a couple of possible examples might be grapesinyourtoesexample.com/07-vintage/2007-cellar-pod-viognier-adelaide-hills/ or grapesinyourtoesexample.com/red-wines/2007-cellar-pod-viognier-adelaide-hills/.
Hope that helps,
Sha
Hi Patrick,
Your first link is going to your custom 404 as well, but I get what you are describing, so yes, if it were me I would make sure that the pages are noindexed (using the noindex meta tag, which is the most reliable method) by default when they are created. When you decide that a page is to remain on the site permanently you can then remove the noindex tag and make it visible.
Your custom 404 has everything you need to make sure there is a good chance the visitor will click through to another page in your site, so you are doing all you can to make sure that those people following a broken link don't bounce right back to the search engine.
I would say your logic is correct in that the only way to further improve the odds is to reduce the potential for broken links before they happen.
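One side note: if adding the meta tag to every new page is fiddly, and assuming the site runs on Apache with mod_headers available, the same noindex directive can be sent as an HTTP header instead. A minimal sketch, using a hypothetical filename pattern for your temporary pages:
<FilesMatch "^temp-.*\.html$">
  # Send the noindex directive as a header rather than a meta tag
  Header set X-Robots-Tag "noindex"
</FilesMatch>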
Hope that helps,
Sha
Hi Derek,
I think kjay, Harald & Will have all pretty much covered removal (or correction) of links that are being called from within the pages on your site and using 301 redirects to capture traffic from external links.
I would also make sure that you have a well designed custom 404 page set up for your site so that any link that isn't covered by the work you have done on the above will still provide an opportunity for the visitor to get to the content they are looking for.
You should make sure that your custom 404 page contains some well written text that invites the visitor to explore the site to find other content that will help them and a menu that will allow them to click through to other parts of your site.
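Assuming the site runs on Apache, pointing the server at your custom page is a single .htaccess directive (hypothetical path, so adjust to suit):
# Serve the custom 404 page for any request the server cannot find
ErrorDocument 404 /custom-404.html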
Hope that helps,
Sha
Hi Nikos,
A simple 301 redirect should not cause your rankings to evaporate. The 301 redirect just signals to search engines that they should de-index the old domain and replace the link in their index with the new one. It generally doesn't happen in less than a day either.
It seems perhaps there is a problem with the way that the redirect has been implemented, or some other completely unrelated issue that has caused a coincidental loss of rankings.
Can you post a link to your site so that we can take a look at what is happening? If you would rather not post the link publicly, you can go to your profile and send me a private message. There are really too many possibilities to guess without taking a look at the site.
Sha
Hi Jayesh,
A full crawl usually takes 1 week to update, but if you want to get a quick start on your campaign you can use the Crawl Test Tool to get started. You can also use this to run a crawl of sites that are not included in your campaigns.
PRO members can schedule crawls with the Crawl Test Tool for 2 subdomains every 24 hours. You'll get up to 3,000 pages crawled per subdomain. When these crawls are finished, your reports are sent to your PRO email address.
If the crawls in your Pro campaigns haven't updated after 7 days then you should contact the Help Team by emailing help [at] seomoz.org and ask them to check on the campaign for you.
Hope that helps,
Sha
Hi Proforums,
Since your problems have coincided with the change to Drupal, the most important thing is to ensure that the person you engage has a very good grasp of the specific SEO issues and challenges presented by the CMS.
One hint for you - if you search SEOmoz (site search, not Q&A) for Drupal, you will find the profile of at least one member who specializes in SEO for Drupal sites.
Hope that helps,
Sha
Hi Serge,
The reason your redirect is creating a 302 is that it is an unspecified redirect.
Any redirect that does not specify a status code will by default be treated as a 302 (Temporary).
To create a 301, you need to specify the code explicitly, like this:
[R=301,L]
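So, with a hypothetical rule, the difference looks like this:
# Unspecified - Apache defaults the [R] flag to a 302 (Temporary) redirect:
RewriteRule ^old-url$ /new-url [R,L]
# Explicit 301 (Permanent):
RewriteRule ^old-url$ /new-url [R=301,L]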
Hope that helps,
Sha
Hi John,
Excellent response here from Ryan as always, so not much more for me to say except that there are some other excellent resources here at SEOmoz which can help you to focus on the broad range of things that might be influencing your site visibility in the Search Engines.
Essentially, as Ryan indicated, it is about taking a "future proofing" approach to your site. If you haven't already found the Pro Webinars section of the site, there is an excellent Webinar from Dr Pete on Future Proofing Your SEO: 2012 Edition. One extra thing to note here is that every SEOmoz Pro Webinar uploaded includes the presenter's PowerPoint slide deck. Since SEOmoz presenters make a point of including links to useful resources in their decks, this is a huge extra help, so don't forget to grab it.
Also, one of my favorite tools for seeing exactly what is contributing to the top 10 Rankings in a niche is the Keyword Difficulty SERP Analysis Tool where you can run Advanced Reports to see how each of the Top 10 Ranked sites for a particular keyword term are doing against a whole range of key metrics. The bonus is that even if your site isn't in the Top 10, you can enter the URL to have it included in the comparison 8D.
Rand also provided a great post that explains how to use the tool: The Best Kept Secret in the SEOmoz Toolset.
Hope that helps,
Sha
Hi Adam,
"Other" is defined by exclusion.
It includes any traffic that comes from sources which do not match the definitions used for Referring Sites, Search Engines or Direct Traffic.
This thread on Stack Exchange provides the specific definitions for each of those.
Hope that helps,
Sha