At first glance, this might have something to do with it.
https://www.dropbox.com/s/nisv0z2fbl5sb2j/Screenshot 2014-01-27 09.56.30.png
A folder will get juice passed from your main domain. Subdomains are essentially treated as domains in their own right.
It all depends on what you want to achieve with your SEO campaign.
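To illustrate the difference (hypothetical URLs):
http://example.com/blog/ - a folder, inherits the authority of example.com
http://blog.example.com/ - a subdomain, largely treated as a separate site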
I'm trying to rank a .im domain in Australia and having a hard time getting any traction at all. GWT won't let me geo-target the domain for Australia. It is currently set for Isle of Man.
Thinking of registering a .com.au (the same domain isn't available, but I can get something close) and mirroring the site completely.
I'd add a canonical on every URL of the .com.au, pointing to the corresponding .im pages - would this make a difference?
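I.e. something like this in the head of each .com.au page, pointing at the matching .im URL (example domains, obviously):
<link rel="canonical" href="http://www.example.im/some-page/" />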
Also, since the link building is being done for the .im domain, I think I'll have to start over on the .com.au domain.
Thoughts, tips and insights highly appreciated, mozzers!
The best thing you can do right now is nothing. Sit tight for about a week or so, and keep an eye on your traffic and SERPs. Give Google some time to sort itself out and have a look when the dust settles. Don't go into panic mode and start changing all sorts of things - that's the worst thing you can do right now.
If you could only optimize for one keyword phrase, SEO-wise, which would you select and why? Going by the data I've attached in the two images.
Looking at your entire backlink profile, I think you've got more to worry about than just paid ads. This, for example, would be a big no-no.
Also, there's heaps of non-relevant footer links floating around on decent pagerank pages. A clear sign of link buying.
*Update - I thought you said the top two images were yours, but your site is the plastic one. Anyway, it seems you're still buying other links, such as this one: http://www.knight-kit.com/kitchen-remodeling/a-handy-instruction-concerning-storage-systems-for-the-residence-storage-area-office
That looks like a spammy link page, and it's not indexed in Google. Perhaps it's part of one of the link farms they're taking down at the moment?
Hi, I think most of us would consider that a small budget. Email me at bob@anseocompany.com.au and I'll see if my services would be a good fit for your company.
Internally redirecting pages with a 301 is fine. The whole point is to tell Google that what was once here, is now there.
However, if you're redirecting to a product that is significantly different from the one you had originally, you might want to keep a placeholder on the old page telling the client that the product is no longer available, with a link to the new product that will achieve a similar outcome for the customer.
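For a single page on Apache, for instance, the redirect in your .htaccess would look something like this (made-up paths):
Redirect 301 /old-product.html http://www.example.com/new-product.html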
The perfect place to find writers is the Problogger Jobs board. I've had several great writers come from there. Another place to look is textbroker.com - you'll be paying for quality though, if that's not an issue.
Your site needs as many pages as it needs. There's not really any other answer for that question. A message board might help you to create a bigger footprint to index, and this is fine as long as it's relevant to your site's topic obviously, in order to attract the right visitors.
I would say yes. If someone wanted to watch the video on your page, but they were using an iPad for example, and the video is in Flash, they'd still be able to read the transcript.
Kinda like having an ALT attribute on an img, I guess, to make sure people will know what is on the page, even if the actual image or video is broken.
Going by the limited information, I'd probably opt for the .co.uk site since you're deploying in the UK. You could transfer the juice from the .com to the .co.uk with a 301.
Are the links to the .com relevant to the new site content/market?
Yeah, the Yoast plugin has more advanced options. It also has far fewer annoying ads in the admin panel.
Nonsense. Search for https://www.google.com.au/search?q=inurl%3Arobots.txt&pws=0
Some of the first results with visible robots.txt I see are:
I refuse to believe that "something is seriously wrong" with any of these sites.
Install Yoast's WordPress SEO plugin. It has all the options you're looking for in one easy tab.
I'd leave the archives crawlable since you use them in your nav bar. Just make sure to have canonical attributes on your posts. Also, use the "strip category base from URL" option - it makes it all look a bit cleaner and shortens your URLs.
I know Google used to have an API for stuff like this, but that's not been available for a few years now. Perhaps they're still allowing people who had access previously to get into the data?
It's breaking guidelines, but not in order to manipulate a site's rankings. I think this sort of thing might have a lower priority for Google. It's something they can't really control because there will always be ways around it.
Interested to hear an official SEOmoz response to this one though
That's me. And yes, I think a competitor ratted me out to Google while I was experimenting with the review stars.
Hmm, not sure in that case. Any recent updates to the code perhaps?
And yes, I'm from Perth too
Were they real reviews or manufactured ones? I've seen a lot of Perth SEO sites that had them, and Google took them away because they were misleading.
Sounds like you're looking for http://tracking202.com/home
As far as I know, the red rectangles highlight all sections of a page that have the keyword or phrase which you used in your search query.
Is the subdomain data stored on the server as directories?
So for example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe
If so, you can simply have one robots.txt on your root domain, blocking those directories
User-agent: *
Disallow: /Moe/
Did you reconnect your GA account a few weeks ago? Here's the announcement: https://seomoz.zendesk.com/entries/21081482-reconnect-your-google-analytics-in-your-campaigns
Not saying that that's the problem, but it's the only thing I can think of right now.
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
Creating a new sitemap and submitting it should get rid of those errors. And might I suggest you set up a development domain so you won't have these problems again in future?
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
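A robots.txt like this in the root of each development subdomain would block all well-behaved crawlers:
User-agent: *
Disallow: /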
Did someone ever get back to you Brent?
I can't imagine Google wouldn't take notice of a link like the one you're describing, if it came from an authoritative site. An example: http://jguide.stanford.edu/directory/url.lasso?id=1809&lang=en&url=http%3A//www.fashioninjapan.com/
The server header is a temporary 302 redirect though in this case, which isn't as valuable as a 301.
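You can check the header yourself with a HEAD request, e.g.:
curl -I "http://jguide.stanford.edu/directory/url.lasso?id=1809&lang=en&url=http%3A//www.fashioninjapan.com/"
The first line of the response will read something like "HTTP/1.1 302 Found".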
Depending on how many pages your site has, I'd either just add canonicals to the pages, or check whether your programmer really didn't use the .htaccess - which you can always mend afterwards.
Also have a read through Dr Pete's post here: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world which is probably the only duplicate content post you will ever need to read to fix any problems.
Google are hoping to use rel=author as a ranking factor (http://www.youtube.com/watch?v=FgFb6Y-UJUI&feature=youtu.be&t=57s)
I think that in the (near) future this will start playing a role, and the attribute will become a standard like nofollow is. If you trust your freelance writers, and think they'll be able to create a trusted, authoritative profile, go for it. If you're not sure about the quality of work they deliver elsewhere, for other clients, you might want to consider creating your own profile attached to your site, and use that as the author attribute. That way you'll have a bit more control.
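The basic markup on each article would be something along these lines (the Google+ profile ID is a placeholder, and the profile needs to link back to the site under "Contributor to"):
<a rel="author" href="https://plus.google.com/YOUR_PROFILE_ID">Author Name</a>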
I'd suspect that if you change the date and ping RSS aggregators, they'll publish the content, which then becomes 99.9% duplicate content.