Ahh, that explains it - thanks Mike
Iain.
Apologies if this question has been asked a million times, but I can't find it.
I have 35 pages, yet only 5 of them have generated On Page Optimization Reports. I know I can create them manually, but wondered if I've done something incorrectly?
Iain.
Wow, thank you everyone so much. I had a feeling you guys would know the answer.
I really appreciate your help.
Iain.
One of my top keyphrases is "manchester magician" and, using the Google AdWords Keyword Tool, this produces exactly the same number of searches as "magician manchester", and again exactly the same for "magician in manchester".
So I assumed from this that Google views them the same. But looking at the SEOMoz On-Page tool, this doesn't appear to be the case as my grade is different for each keyphrase.
So, I guess my question is: if this is the case and they are considered different, how would I find out which one really is the most searched for, so I can optimize accordingly?
Many thanks,
Iain.
Thank you, that is very reassuring, phew!
When you say 250 or so, is that pages? My site only has about 35 pages. I'm happy to wait as long as it takes, just didn't want to miss something.
Iain.
I joined SEOMoz 6 days ago and set up my first campaign straight away.
On my Campaign Overview, it says: "Here's what you can expect over the coming days (we'll send you an email when each step is completed)".
But I've not received an email saying that any of the steps are completed, so I haven't really used any of the tools yet, as I wanted to wait until I had something to work with.
Since it's almost been 7 days (will be 7 tomorrow), do I assume that most if not all of the stages have now been completed and I can get started?
Thanks,
Iain.
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd though: when I was creating my WP pages there were Yoast options for each page - several of which I set to noindex - yet looking at the virtual robots.txt, this isn't reflected. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
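From what I've read, I believe Yoast's per-page noindex setting doesn't write to robots.txt at all - it outputs a robots meta tag in the head of each page instead, something like this (the exact markup is my assumption):

```html
<!-- My understanding of what Yoast adds to the <head> of a page set to noindex -->
<meta name="robots" content="noindex,follow"/>
```

If so, that would explain why the virtual robots.txt looks untouched.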
Thanks again for all your help,
Iain.
Cheers Tom,
Yeah it is rather strange. There doesn't appear to be another plugin that should be causing this. Yoast is certainly the one relating to SEO.
Iain.
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."
And yet, I do have one (albeit apparently a virtual one), as it can be viewed here:
http://www.iainmoran.com/robots.txt
If I try to view the site files on the server, via FTP or CPanel, there is no robots.txt file there!
I appear to be using the latest version of Yoast.
Thanks,
Iain.
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. Unlike the posts themselves, which I can set to "noindex", I cannot do this for the dynamic pages.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the one created/managed by Yoast?
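That said, from what I understand, WordPress only serves the virtual robots.txt when no physical file exists, so uploading my own should simply replace Yoast's rather than conflict with it. I'm assuming something minimal like this, mirroring the virtual one, would be safe:

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```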
Thanks again,
Iain.
Thanks, Natalie, I think that's it.
Just verified authorship, but I guess it takes time for it to become live?
Thanks again,
Iain.
I've noticed that a few of my top competitors have a small photo (thumbnail) next to their listing. I'm sure it's not a coincidence that they are ranked top for the search phrase too.
Does this really help, and how can it be done?
Many thanks,
Iain.
While I can't recommend which option is best, I can certainly suggest a good Plugin to use. I've used it myself and it works rather well:
wordpress.org/extend/plugins/redirection/
It's easy to use and also keeps track of 404 errors.
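If you'd rather avoid a plugin, I believe the same redirects can be set up directly in an .htaccess file (assuming an Apache server - the URLs below are made-up examples):

```apache
# Permanently (301) redirect an old static page to its new address
Redirect 301 /old-page.html http://www.example.com/new-page/
```

The plugin route is easier to manage, though, and you get the 404 tracking thrown in.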
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts, and I was wondering if I can copy the first few lines of each post's content into its meta description, or would that be considered duplicate content?
Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created?
While on the subject of duplicate content: if I had a sidebar with the same information on several of the pages (with the content coming from a WP widget), would this still be considered duplicate content, and would Google penalise me for it?
Would really appreciate some thoughts on this, please.
Thanks,
Iain.
I've finished doing the redirects and they all seem to work.
When setting them up, I used the full URL (including http://www.) for both the "source" and "target" URLs.
I originally had a page: http://www.iainmoran.com/mailing-list.html, but this is now:
http://www.iainmoran.com/newsletter/
Hope this was the right way to do it?
Thanks again - it's such a relief knowing there are kind people who are willing to help.
Iain.
Thanks so much, Matt & Keri,
I've just installed the plugin and looked in Google to check which of my old (html) pages were indexed, and amazingly it appears that there are none. It's been a few weeks since I launched the new site, but a few of the old pages certainly appeared in Google search results a few days ago.
I guess it's still worth doing though, as there will be other sites with the old links on them?
I notice that in my first SEOMoz crawl report it mentions that I have a 301 Redirect (which I didn't set up) going from: http://www.iainmoran.com/close-up-table-magic
To this: http://www.iainmoran.com/close-up-table-magic/
Is this something that I should deal with and if so, how?
The support here is amazing and I'm very grateful to you all.
Iain.
Thanks Matt,
These pages no longer exist on my server, so should I create the pages again - each page containing just the redirect code that points to its new version?
Thank you,
Iain.
I've recently changed my old static website to a WordPress one. I'd like to know what to do (if anything) about my old links.
For example a page on my old site was: www.iainmoran.com/corporate-magician.html - now I'm using WordPress, the url is:
www.iainmoran.com/corporate-magician/
My question is: do I need to set up redirects on these old pages (which no longer exist), or will Google eventually re-crawl my site and update the links itself?
I'm using the Yoast SEO Plugin for WP and it creates a sitemap, which of course will include my new pages. But I don't want Google to penalise me for having broken links, etc.
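If redirects are the answer, I'm guessing a single rule in .htaccess could map every old .html URL to its new trailing-slash equivalent in one go (assuming an Apache server, and that all my old pages follow the same pattern - the rule below is just my untested attempt):

```apache
# e.g. /corporate-magician.html -> /corporate-magician/
RedirectMatch 301 ^/([a-z0-9-]+)\.html$ http://www.iainmoran.com/$1/
```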
Many thanks,
Iain.
John, thank you so much for your help with this. It's very much appreciated.
That does explain why I couldn't find the robots.txt file.
Is there another way to resolve the 404 error warning that SEOMoz is telling me about?
404 : UserPreemptionError:
http://www.iainmoran.com/comments/feed/
I understand that if I disable comments on my WP site, that would fix the issue, but don't really want to do that if it can be avoided.
Thanks again,
Iain.
Thanks for your quick response, Johnny,
I've just looked at my root directory via both CPanel and FTP, but there is no robots.txt file there.
I don't understand why this is, as in the Yoast section within each of my pages, there are some which I have set to "nofollow".
Also, within the Yoast CP there is a section titled "Robots.txt", which says the following under it:
"If you had a robots.txt file and it was editable, you could edit it from here."
Iain.
Just joined SEOMoz today and am slightly overwhelmed, but excited about learning loads from it.
I've just received my Crawl Report and there is a
404 : UserPreemptionError:
http://www.iainmoran.com/comments/feed/
This is a WordPress site and I've no idea what the best course of action is. I've done some searching on Google, and a couple of sites suggest removing that URL within the robots.txt file. I'm using the Yoast Plugin, which apparently creates a robots.txt file, but I can't see any way to edit it. Is there another solution for resolving the 404 error?
Many thanks,
Iain.