Getting the publisher markup to work is much more important than HTML validation. One is going to have a bearing on many factors, whereas the other will make no difference at all.
-Andy
It could be that there is an element of local search results creeping in.
If I search for 'Airport Transfers' from here, then I get companies who are in Chester (UK).
-Andy
You need to test that the authorship / publisher markup is working on your site. It needs to be tied into your Google+ account.
You can access the tool here: http://www.google.com/webmasters/tools/richsnippets
When you enter your URL, it will tell you if it's working.
-Andy
I wouldn't have done it quite like that. This is how it shows on one of my sites.
<link rel="publisher" href="https://plus.google.com/105905172195613766757/"/>
Does your authorship work when you test it in the structured data testing tool?
-Andy
So the word "lunch" is being added to the URL, yet the page doesn't exist?
I'll be honest, I don't work on Drupal because it does do weird things from time to time so can't advise on this specific issue.
I would check the settings though, and see if there is an SEO plugin (or site setting) that is causing this to write a second page. Also check to make sure this isn't a page that has been deleted and is still able to be crawled. Has that page with the word "lunch" in the URL ever been created?
If it is easy enough to do, delete the page and re-create it to see if that clears the problem.
-Andy
I haven't heard of this being a negative keyword, and the only way to really check is to try it briefly, see if it passes, and then see if you notice any problems. If there are issues, you will know pretty quickly.
Personally, however, I wouldn't use this to form part of the title - I would add this to the description:
Huge Protein Powder Range
Choose to buy from our range of clinically proven protein powders
That sort of thing.
-Andy
The $100 question
Spend depends wholly on what is required for the site. Do they need social work? Do they need links?
There's no simple answer to this I'm afraid, Felicity. It all depends on where this client wishes to focus and, if you are completing the work, on where you feel the budget would best be spent.
-Andy
A feed is generated from the content of your website. As for why this link is showing when it isn't there, that is a little odd. That said, I did have a similar problem about 2 years ago and can't for the life of me remember what caused it!
What were you using to generate the backlink report?
-Andy
Whoever you choose, it is important to ask how much of the work they do will be outreach. Ask how they do it and hopefully they will say they sometimes pick up the phone to talk to the person responsible.
They should also be advising on why someone will want to link to you. Do you have genuine benefits you can offer to someone? If not yet, get them written / added.
As Egol said, steer well clear of anyone who you even suspect outsources their work. This will spell the end before you even start. Also, insist on weekly updates if they are not in the office.
There are some good link builders out there - finding them can be a little more awkward.
-Andy
It's a little awkward to tell as there is no URL to look at, but if Google is finding its way around your site, you can assume it probably has no issues.
Remember that Google is good at finding its way around JavaScript menus, so these days, pretty much anything goes.
Just keep links natural, easy to follow and descriptive. You don't want to be trying anything spammy at this level.
-Andy
Yes, it is an RSS feed and all blogs have them (I'm sure they can be disabled) - normally at the same address, /feed or /rss.
I am guessing that you have links in your articles back to your primary site? As the blog is on a subdomain, it will be seen as a separate site, and links back to the primary site will be treated as external links... if I have understood correctly.
-Andy
Hi Nicholas,
You are fine noindexing the subdomain. However, you will need to create a new robots.txt file and place it in the subdomain root for it to work.
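As a minimal sketch (the blog.example.com subdomain here is a placeholder, not your actual setup), the robots.txt placed in the subdomain's own document root could look like this:

```text
# robots.txt served at e.g. https://blog.example.com/robots.txt
# Blocks all crawlers from the entire subdomain
User-agent: *
Disallow: /
```

One caveat: robots.txt only blocks crawling. For pages that are already indexed, a noindex meta tag or X-Robots-Tag header on the subdomain's pages is the more direct way to get them removed.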
-Andy
Have a look around many other sites, and they all do the same. Do a search for something in Google and then change the case of anything after the domain. Microsoft is the same. I am sure there are examples that will 404 these requests, and that will be a server-side setting.
I really wouldn't worry about that though. Moz is the same as well: upper or lower case will give you the same page. In the last 15 years of SEO, I have never come across this as an issue.
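If you did want to enforce one case server-side, a hedged sketch for Apache looks like the following (assuming mod_rewrite is available; note the RewriteMap line must live in the server or virtual-host config, not in .htaccess):

```apache
# In the server / vhost config: define a lower-casing map
RewriteMap lc int:tolower

# In the vhost (or .htaccess, if the map is defined above):
RewriteEngine On
# If the requested path contains any upper-case letters...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the all-lower-case version of the same path
RewriteRule ^(.*)$ /${lc:$1} [R=301,L]
```

This is purely optional housekeeping; as above, I have never seen mixed-case URLs cause a ranking issue on their own.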
-Andy
Hi Anirban,
There is no way to guess how long this can take I'm afraid. I am sure not even Google could tell you this.
The only thing you can do really is keep an eye on your site traffic and position in the SERPs. I am sure that if Google ranked you in the past, you will get there again.
-Andy
Hi Gavin,
It sounds like you could do this and then make use of HREFLANG. You can read more about it here, but this is a quote from the page:
Your content has small regional variations with **similar content in a single language**. For example, you might have English-language content targeted to the US, GB, and Ireland.
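As a hedged sketch (the example.com URLs and regional paths are placeholders, not from your site), the hreflang annotations for regional English variants might look like this:

```html
<!-- In the <head> of each regional variant; every variant lists all of them -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.com/ie/" />
<!-- Fallback for visitors who match none of the above -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```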
I hope this helps.
-Andy
It's possible Gerhard, but unlikely. My money is on the initial ranking boost that many new sites see.
-Andy
Hi Gerhard,
From what you have said, this was just an initial jump that many sites see when first being indexed by Google. It is then very common for them to drop out of the SERPs again.
Sometimes referred to as the Google Dance, this has frustrated site owners for some time now.
Just carry on with the positive SEO work and I'm sure you will see it rise again.
-Andy
They are all fine to leave in there Chris. If you wish to upload the file to the root of the server instead of having the verify in there, that is perfectly acceptable.
Edit: you should probably change the Meta Author though. What you have there might be seen as a bit of bad practice by Google.
-Andy
Hi,
If you would deem all internal pages to be of equal importance, then yes, there is no harm in this. The homepage is always set at 1.0 though.
That said, everyone has a view of their internal pages and which they class as pass-through pages to get to something more important, so do keep this in mind when deciding.
-Andy
Hi Chris,
I always advise every client to remove the Meta Keywords. Although Google doesn't make any use of them, if they are over-used, they can easily be seen as a way to keyword-spam. If memory serves, the Moz page score gives this a negative if found, or it used to.
At best, they will do nothing for you - at worst, they can cause you issues.
-Andy
Hi Dana,
I can certainly understand your problem, and whilst I have no data to give you, you should certainly be looking at this not only as a lost opportunity from an SEO perspective, but also as the inability to report back just how well the site is converting traffic. Without this data, no site can see where changes can be made and where improvements will result in an increase in revenue.
I would also look at the fact that anything that is broken on a site might not be having an observable negative issue right now, but what happens with the next algorithm update? Will something be spotted at some point? Do you want to wait for Google to penalise the site before realising it should have been corrected?
Also, does it make for a poor user experience? If someone comes to the site and then bookmarks one of these pages, how are they going to get back again? Are they then likely to just navigate away because they didn't land where they intended?
I am sure there will be a loss in revenue from this - quantifying it will be difficult for an outsider though. There is no doubt that this should be resolved, and I would say ASAP as well.
-Andy
It's a suggestion, and just indicates to Googlebot how often they should come along. It won't do anything in terms of telling them more about the site hierarchy really, but will give them an idea about which pages are most important.
That said, you should try and follow a structure like the following:
Homepage: 1.0
Internal pages: 0.6
Blog pages: 0.3
Or something very similar.
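As a minimal sketch (example.com and the paths are placeholders), those priorities would sit in the XML sitemap like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <priority>0.6</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/some-post/</loc>
    <priority>0.3</priority>
  </url>
</urlset>
```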
I hope this helps.
-Andy
Hi Jef,
The best thing to do, is have a read here. The hreflang option would serve you nicely here.
You won't need a separate G+ account for each either. Having the same author is absolutely fine; just make sure you list each of the sites as one you contribute to on that Google+ profile.
Google understands, and handles, multiple language situations pretty well - just make sure you put in the necessary code for them.
-Andy
Hi Travis,
There is never a problem with buying a domain and then just pointing it to your primary site. Problems only arise if you were to buy a domain and then populate it with spam / duplicated content that then linked to your main site.
However, I am a little unsure about the example you gave above, as this looks like just a subdomain of your site, in which case you won't need to purchase anything and it just requires setting up. Again though, fill it with unique content and SEO it as a new site.
-Andy
Hi Kimberly,
Yes, an incorrect 301 or bad code can cause a 500 internal server error. Unfortunately, I am not too hot with writing rewrite code, so can't tell you if everything you have there is perfect. That said, you might find some help here and here.
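For reference, a minimal, hedged sketch of a safe .htaccess 301 (Apache mod_rewrite; the old/new paths here are placeholders, not your actual rules) would be:

```apache
RewriteEngine On
# 301 a single old path to its new home; [L] stops further rules running
RewriteRule ^old-page\.html$ /new-page/ [R=301,L]
```

A typo in a pattern, or a rule that redirects a URL back to itself in a loop, is a common way to end up with that 500 error.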
-Andy
This is what scares me about some 'SEO's'...
"An SEO has advised we redirect all pages to the homepage, but won't that mess up the link profile?"
Terrible advice!
I hope you have all sorted out now James.
-Andy
I'm sure that one of the Moz guys will be along to answer this soon, but in the meantime, do you need to be entering a complete URL with http://? Do you have other live campaigns that are working OK?
Sorry I can't be of more help, as I don't sub to the full packages.
-Andy
Hi James,
I would never advise that you no-follow internal navigation links to primary pages (main or sub-category). For things like the contact page, that is fine, but remember that Google wants to be able to navigate cleanly around your site. There are times when you would want to no-follow paginated pages, but this isn't in every circumstance.
Beyond this, it is a little awkward to advise too much more as I would need to be looking at site hierarchy and architecture in more detail.
-Andy
As per my last response Isaac, if you ask this in Moz product support, you will get an answer much quicker from one of the team.
-Andy
I don't sub to Moz Analytics Isaac, so can't answer this myself, but if you ask in Product Support instead, one of the Moz help team will answer more quickly than here.
-Andy
Hi,
If you have low quality / hacked links leading to your site, I would just go ahead and disavow that site. Don't worry about Google coming along to audit you for this, that isn't what they are trying to do. You need to distance yourself from the site and this is the way to do it.
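For reference, a disavow file is just a plain-text list uploaded through Google's disavow tool; a hedged sketch (the domains below are placeholders) looks like this:

```text
# Hacked / low-quality links pointing at our site
domain:spammy-links-example.com
# A single bad URL can also be listed on its own
http://another-example.com/hacked-page.html
```

Using the domain: form disavows every link from that site, which is usually what you want for a hacked source.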
-Andy
There are so many possibilities for this.
Is the content on your sub pages, better matched to the search terms? Does the homepage say more about what you do in a generic way, or is it very targeted?
Do you have duplicate page Titles, or ones that are very similar? This can lead to keyword duplication (cannibalisation).
Is the homepage indexed by Google, and is it definitely not excluded through the robots.txt file? Are there good links leading through to the sub pages while the homepage misses out?
Are there any canonical issues, or does your homepage get duplicated through index.html, default.html, etc.?
Sorry if these are vague, but there really are so many possibilities, it would be hard to guess without seeing it.
-Andy
Hi,
If you have a look at this URL, there are plenty to choose from, and all appear to have reviews.
-Andy
Probably the best way to track down all the pages would be to use the Screaming Frog SEO Spider. Put in the URL and leave it to go.
The free version crawls up to 500 pages with the paid version being unlimited.
-Andy
Sorry, but have to disagree a bit as having multiple H1's isn't the issue that most think it is, or once was. One of my own sites has 20 H1 tags (purely by chance as it is a single page design, but it's a long story), and that site ranks top 3 for a number of highly competitive phrases with almost half a billion results.
No, it isn't best practice and I wouldn't advocate doing this, but it isn't a major ranking factor.
-Andy
Erwan is correct and this is probably how it is being done. With Mechanical Turk, you can get almost anything done.
Very grey / black hat area, so walk away.
-Andy
Outside of what I said above, I really couldn't mention anyone here I'm afraid. That would be a little awkward to state one name over another.
-Andy
Hi Marie,
I am exactly the same as you in those circumstances, and certainly no offence was taken to anything that you said. I didn't even think about that.
-Andy
Google rely on links, even no-followed ones, so any link that is given will pass along some information. Google have also said that if you link out to a site as a reference point, then this can be beneficial to you. I have seen pages in the SERPs jump as much as 3-4 places, just after the addition of an external link to a reputable source.
-Andy
If that is what has happened, it would just be a fact-finding mission to track the problem down. I'll be honest, I haven't seen something quite like this before.
-Andy
That rules out Page layout.
It does look like a penalty of some sort though. It might be worth checking just to make sure someone hasn't disallowed the whole site from Google in some manner.
Then there is the question of why traffic has increased at the same time as the drop. Do they run a mailing list at all? Have you had a look through the list of referrers to see where this traffic is coming from? Is it all from direct visits, i.e. people typing in the URL?
-Andy
Google looks for sites with two types of content: Fresh or Evergreen. Fresh would be something you would expect on a news site or blog, while Evergreen is for something that doesn't change, or changes very infrequently.
If you provide content that is in an industry where specifications / news can change regularly, then it could be of benefit. If not, then don't do this just for the sake of it.
However, as Egol said, if you can improve on what is already there, then this could prove a positive move.
-Andy
Hi Patrick,
Are there any ads on the site? What was the exact date of the organic drop? I'm just wondering if perhaps it could have been the Page Layout algorithm update?
Was any work carried out prior to this?
-Andy
Right now, every SEO guy on here is wondering how to reply to this.
Behnaz, Moz maintains a list of companies that they recommend, so it might be worthwhile looking through there.
-Andy
"Prior to building the spammy links, did you guys rank well?"
As David said above...
"For a few years we were occasionally #1 on Google in the US for "salmon recipes", but normally we would be between #2 and #4."
And
"In the year from 25th April 2011 to 24th April 2012 the site attracted just over 500k visits."
-Andy
This could turn into an extremely long post if we go into everything in detail here.
A good site structure should be hierarchical and sensible. By this, I mean it should be clear when someone lands on a page what your intentions are for them and how they can achieve them. Keep your calls-to-action clear and concise, and don't burden the page with wasted content.
Your URLs should again be descriptive, so that if someone were to land on an internal page, the URL would tell them a bit about where they were. Are the key phrases you are targeting included in the URL itself? Also, you should aim to keep every page within 3-4 clicks of the homepage. Any more and it turns into a poor user experience.
Your meta titles and descriptions should also be on target for the pages themselves. Something like:
Pin Code: Lohit, Arunachal Pradesh, India | <Your Site Name>
or
Pin Codes for Lohit, Arunachal Pradesh, India
Whatever you decide to go for here, I would keep it to under 60-70 characters, with the most important information towards the start.
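As a hedged sketch (the site name and description wording are placeholders), that would sit in the page head like this:

```html
<head>
  <!-- Most important phrase first; aim for under 60-70 characters -->
  <title>Pin Code: Lohit, Arunachal Pradesh, India | Example Site</title>
  <!-- The description feeds the SERP snippet rather than rankings -->
  <meta name="description" content="Find pin codes for Lohit and other districts of Arunachal Pradesh, India." />
</head>
```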
I hope that helps a little.
-Andy
Hi David,
Whilst I can't go into specifics here, it is somewhat messy in there, including a couple that are even flagged as especially dangerous. I can give you more info if you wish to mail me at info@inetseo.co.uk
However, this isn't the worst I have seen, and have had complete recoveries from those, so all is not lost
-Andy
Hi,
You could do a lot worse than reading the Moz SEO Guide. This will give you some great insights into SEO best practices and how you can effectively use them to SEO your site.
Also look at how competitors are doing, those who are ranking well. This is always a good sign and a great way to see what is already working.
-Andy
Hi David,
No site is beyond help. I have worked with sites that were seeing hundreds of thousands of monthly visitors, only to be hit by Penguin / Panda, and have seen later recoveries. What it sounds like you need is to disavow a lot of the links that are associated with you now. It certainly isn't something you need to give up on, and it shouldn't cost the earth to do it.
I would be happy to run a quick scan for you to give you an idea of what sort of state your link profile is in, if you wish?
Have you had a manual penalty from Google at all, or does this appear to just be algorithmic?
-Andy