Posts made by StalkerB
-
RE: Sequence of heading tags (H1, H2, H3, etc) important?
-
RE: My Site PR lost to PR4 ! I worked as per SEOmoz Suggestion - No Traffic Drop, Organic Search is good and higher than referral or Direct Traffic !
Anyone big enough is generally fine. I've nothing against hostgator, rackspace, ukfast, etc. I'd only really stay away from GoDaddy hosting.
I use - http://www.5quidhost.co.uk/ - for a lot of sites.
I believe these guys are pretty good and run by a bunch of goons from the SA forums - http://www.lithiumhosting.com/
-
RE: Cannot 301 redirect, alternatives?
Yeah, sorry, would get rid of the 1-to-1s (can't think of a workaround just now).
See if you can find out why they can't do it. If it is something like MX records then change those and put the 301 in normally.
You could also put a meta refresh in to take customers to the new page; no real SEO value though.
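If you do go the meta refresh route, it's just a tag in the page's head; a minimal sketch (the destination URL is a placeholder):

```html
<head>
  <!-- 0 = redirect immediately; the URL below is a placeholder for the new page -->
  <meta http-equiv="refresh" content="0; url=http://example.com/new-page">
</head>
```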
-
RE: No index.no follow certain pages
User-agent: *
Disallow: /call_backrequest.php*
Should work.
However, if the 'rid' parameter is only used here, you might want this instead:
Disallow: /*?rid
If 'rid' is used elsewhere on URLs that you want indexed, then don't do that.
You could also exclude the 'rid' parameter in WMT - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
-
RE: Cannot 301 redirect, alternatives?
I'd take it out of ITs hands.
Can't think of a reason why they can't 301 it, but you can forward it (no masking) at registrar level.
If they need to keep the MX records or something they can do that there as well.
-
RE: My Site PR lost to PR4 ! I worked as per SEOmoz Suggestion - No Traffic Drop, Organic Search is good and higher than referral or Direct Traffic !
You know what, I tell people to ignore PR so often, I completely glossed over the server issues!
Yes, should definitely change host, though any shared hosting runs the risk of having less than desirable neighbours.
Esaky, can you be sure your site is clean?
Site is not currently listed as dodgy - http://www.google.com/safebrowsing/diagnostic?site=www.animhut.com%2F - which is good.
Worth checking for exploits anyway. Some advice here (pdf) - http://docs.apwg.org/reports/APWG_WTD_HackedWebsite.pdf
-
RE: My Site PR lost to PR4 ! I worked as per SEOmoz Suggestion - No Traffic Drop, Organic Search is good and higher than referral or Direct Traffic !
Does it matter?
If you're ranking better and getting better traffic then why worry about your PR?
Unless you're selling links, in which case you've maybe found your problem.
TBPR is not important on its own for ranking. Don't worry about it and focus on getting more traffic instead.
-
RE: Redirect help
Not tested and I'm very sleepy this Friday afternoon, but the below should fix it.
NB, this will also redirect any subdomains.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^neatstuff\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.neatstuff\.com$
RewriteRule ^(.*)$ http://www.neat-stuff.com/$1 [R=301,L]
EDIT
As per JP above, this is a solution using .htaccess. If you're on a Windows server you'll need to use IIS. The below might help
<rule name="Redirect to neat-stuff" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^neatstuff\.com$" />
  </conditions>
  <action type="Redirect" url="http://www.neat-stuff.com/{R:1}" redirectType="Permanent" />
</rule>
But again, I'm sleepy, so test things first
-
RE: Small Link analysis project - anyone
Aye, all right. Message me what you're after.
If it's massive I might have to pass, but if not I can export it all and send it over no problem.
-
RE: CSVs From Social Export are Blank
Hi Ken,
While staff do browse these forums, it's mostly a peer-to-peer Q&A site.
To get a response from SEOmoz directly please contact help@seomoz.org or submit a ticket here - http://www.seomoz.org/help
-
RE: Change in url structure - added category page
You can literally put anything in the category part of the URL and it will resolve.
Try www.website.com/fhqwhgads/shirts and it will still resolve, heck throw another directory in there and it will still work www.website.com/fhqwhgads/zomg/shirts
As for why it does that, I'm probably not the best person to explain, but WP effectively just looks for the end of the URL, which is sometimes why naming pages and posts the same can cause problems.
I'm not sure what would happen if you 301'd /shirts to /clothes/shirts as it may just be looking at the last part anyway (could quickly try).
I'd consider adding a canonical tag instead.
-
RE: Has anyone used seolinkvine
Not quite up to date with their latest offering, but I've used it in the past along with many other content networks.
Unlike many people here I don't think something should be ignored just because it's against search engine guidelines. If nothing else it's worth testing what's working, then figuring out why it's working.
To answer your question: of the many large and small content networks I tested, SEOLinkVine produced the worst results by far. ALN and 10LAD both produced results, either in combination with other linking or on their own (before being decimated). SEOLinkVine did not.
I see they claim to have updated but are selling the same results as before. If it didn't work then, I'm not convinced it will work now.
Thoughts on proceeding:
Personally curated content networks, or even joining some smaller editor-driven content networks, will still probably get you results if done well. However, when they get found out and discounted you'll be in trouble. Sites that I had propped up by these networks ended up with a penalty from which they've still not recovered.
If you're after some shorter-term success it might be worth doing, but in the long run you'll be better off putting effort into improving the site and relationships.
-
RE: Is google rolling out a huge update this week?
Something's going on but I've not determined what yet. I've seen a fair amount of movement in the SERPs for non-US territories (UK, IE, SE, IN, DE, FR, CH, +).
Assuming Google are denying a Pan-guin update of any sort there are a few theories about what it could be.
Crawl based link devaluation.
I'm inclined to think that this has something to do with linking, but it would make sense that they include Penguin within the main algo if they can do it as they go. It might specifically have an effect on old links as well as spammy links. Alternatively they might be ignoring anchor text on low PR sites, or ignoring some directories specifically.
Disavow data.
I doubt they've had time to sort through the amount of data they've been getting, but they'll have to look at it at some point.
User metrics.
CTR, return to SERPs within the same session, refined searches. I'm not leaning towards this one as most of what I've seen slip has been among the (if not 'the') best results.
Social signal devaluation.
If Facebook can figure out which likes are fake then Google can probably see which 'authors' are fake. They're definitely pushing for authorship, and rolling out something to figure out who's legit could have an impact, but I'm not sure authorship rank is a big enough part of the algo currently to effect this kind of change.
EMD update.
Not keen on this as a theory. I've not seen anything to back it up and there are plenty of EMDs still up there.
Google are telling lies!
Somebody's made a change and they're not letting on, or haven't told the people communicating the changes.
Ultimately I'm shooting in the dark here. Although I'm seeing a lot of changes, not all of my sites have been affected and I'm doing similar promotion for most.
If anybody wants to chime in maybe we can put our heads together. I'm seeing a bit more clustering again with the same domain showing up more often in the results (though not as bad as it was). Anybody else seeing any sort of pattern?
-
RE: Are seomoz ranking in jeopardy?
Very reasonable observation.
I think that some of the exclusive rank tracking software will continue to operate though, so won't really affect me.
-
RE: Updating content on URL or new URL
While I'm here, do you not have hyphens as word separators in your URLs or is it just for these examples that you're not putting them in?
i.e. Why have you gone for www.domainname.nl/eventname2013 vs www.domainname.nl/event-name-2013?
-
RE: Updating content on URL or new URL
Tough one, these annual events; a few paths you may want to consider.
**1) Create a new URL - www.domainname.nl/event-name-2013**
Reasonable idea if the event is searched by year, i.e. they'll search "event name 2013". As you probably can't be sure what people are going to do, I'd suggest not relying on that and keeping the original URL. Make sure to link to all future years from here though (link to 2013, then 2014 when it comes, etc.)
PROS
- You'll now have a naming convention and never have to worry about this problem again
- You don't need to worry about what to do with last year's info
- You build up your site's relevancy for the term with multiple pages on the same topic

CONS
- You lose any authority and link equity the main page has built up
- If the pages are highly similar you may have trouble ranking the newer ones (or older ones, I dunno how Google works it out)
2) Replace it - Simply put up the new content for 2013 and overwrite the 2012 content.
Not great for a number of reasons. Significantly changing the content may lose some of your relevancy and the archived content may still have value to users.
PROS
- You get to keep the same URL and it will always be the most recent information (if you update it)
- You get to keep your authority and link equity (caveat: if the content changes entirely, search engines may strongly devalue previous links to that page)

CONS
- You lose content
- You may lose relevancy
3) You update the content with 2013's schedule and place the older content on a new page - http://www.domainname.nl/event-name-2012
This way you can keep working on the existing URL but don't lose the old content.
PROS
- You build up your site's relevancy for the term with multiple pages on the same topic

CONS
- You may confuse search engines by moving the content they expected to another page
3.1) Canonicalise the 2012 content
As above but you add a canonical tag to the 'archived' page telling search engines that the main page is the one they're looking for
PROS
- Users still have access to the older content

CONS
- The old content no longer counts for much
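For reference, the canonical tag on the 'archived' 2012 page would look something like this (/event-name is an assumed URL for the main event page, not one from the question):

```html
<!-- In the <head> of http://www.domainname.nl/event-name-2012;
     /event-name stands in for wherever the main event page actually lives -->
<link rel="canonical" href="http://www.domainname.nl/event-name" />
```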
4) You add the new content to the main page and keep 2012's underneath
You could simply update the page with a <header> combo in HTML5, or demote the previous year's headings to <h2>s and use the <h1> for this year. You can even somewhat hide the 2012 stuff using CSS, jQuery or JS (maybe AJAX, I dunno); that would mean the page can still pretty much look like you want.
PROS
- Adding more relevant content to a page can improve the page's quality
- All content accessible from one location for the user

CONS
- If it is year specific you may dilute the relevancy
- Shouldn't be seen as hiding content, but if there's a lot of keyword heavy text in the hidden divs it may trigger some sort of alert

What would I do? Depends on the event/type of site I guess. Most likely 3.1 or 4, but as I'm not 100% happy with what canonicalisation does, probably 4.
If anybody wants to jump in with other ideas or other pros and cons, there's probably a lot I've not thought about.
-
RE: Adding twitter-accounts
Hi Alsvik,
While moderated, these forums are primarily peer-to-peer answers, so there's no guarantee anybody from SEOmoz will see them.
Please contact help@seomoz.org with your problem and they'll sort you out
If you have any other problems with something else you can get more info from the contact page - http://www.seomoz.org/about/contact
-
RE: Someone just told me that the Google doesn't read past the pipe symbol. I find that hard to believe. Is this true?
I can't 100% confirm that it does but...
Pipes have been used as separators for years. Many sites still have "Sitename.com | Real title here", so I can't see why it wouldn't read it.
-
RE: What is your best technique of getting .Gov and .Edu links?
Will any .edu or .gov do?
.edu and .gov sites don't really come with any sort of boost beyond the natural strength of the page. The bonus usually comes from the fact they're on a strong page.
-
RE: Wordpress Blog Blocked by Metarobots
There shouldn't be a robots.txt file in the /blog section anyway; it should always be in the root. It was just something to have a look at.
I'm having a look just now and also don't see any problems.
You've nothing in the robots.txt file and nothing in meta-robots for the header.
There are 42 pages in the site: command and a similar number in your sitemap.xml, so I presume that's right. 6 pages in site:/blog, which again looks right.
I've tried using SEOmoz's tools on your site though and it just tells me that your site doesn't resolve. EDIT: Managed to get it to resolve on the 3rd try for a crawl, but the on-page report card checker is still giving me problems.
You're definitely returning a 200 message with a site when I check using any other tool though, so I'd get in touch with SEOmoz directly and see what's wrong with their tool - help@seomoz.org
Just to confirm you're not doing anything tricky server side to prevent scraping are you?
-
RE: Moving a html site into Wordpress
Yes.
In your permalink settings just add .html (or whatever) to the end, so %postname%.html, and that will work... for posts.
To add it to pages you can either use this plugin - http://wordpress.org/extend/plugins/html-on-pages/ - bearing in mind it's rarely updated, or hack wp_rewrite in wp-includes/rewrite.php:
add_action('init', 'change_page_permalink', -1);
function change_page_permalink() {
    global $wp_rewrite;
    if ( strstr($wp_rewrite->get_page_permastruct(), '.html') != '.html' )
        $wp_rewrite->page_structure = $wp_rewrite->page_structure . '.html';
}
bearing in mind you'll have to redo this every time you update WordPress
or add it in your .htaccess
RewriteEngine On
RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]{2,4}$
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ $1.html
I'm not taking any responsibility for messing up that last one
-
RE: Anchor text questions - What are your thoughts?
A blanket answer would look something like:
- 25% Keyword exact and almost exact match anchor text
- 25% Partial match anchor text and longer tail anchor text
- 25% Raw url and domain
- 25% Generic style links (click here, visit site, now, here, etc...)

For something like 50 articles, assuming you're putting them out in article directories and that was all the links that page was getting, I might not be so bothered and would go for a much higher percentage of exact and partial, picking up more generic anchors elsewhere for the page.
It also depends how strong the site and page are.
I'm not quite sure exactly what it would take to trip Penguin, but more than just the anchors it'll depend on where those links are coming from. If they're all good sites then you could go 50/50 on the exact match and should still be fine. If they are, let's say, less authoritative sites then I'd still want a few other anchors in there.
-
RE: Wordpress Blog Blocked by Metarobots
If you're not taking Zach up on his offer, have a look at http://yoursite.com/robots.txt and see if it has
User-agent: *
Disallow: (your blog url in here)
If it does, you'll need to edit your robots.txt file to remove anything you don't want disallowed from the disallow section. You can do this via FTP.
If it's in WP itself there may be another robots.txt file at http://yoursite.com/wp-install/robots.txt which, in theory, could also be preventing crawling if it has anything disallowed in there.
Again, editable via ftp or maybe this plugin - http://wordpress.org/extend/plugins/wp-robots-txt/
As it already says that it should be public probably not WP, but worth a look anyway.
-
RE: Spider Indexed Disallowed URLs
The directives issued in a robots.txt file are just a suggestion to bots - though one that Google does follow.
Malicious bots will ignore them and occasionally even bots that follow the directives may mess up (probably what's happened here).
Google may also index pages that you've blocked as they've found them via a link as explained here - http://www.youtube.com/watch?v=KBdEwpRQRD0 - or for an overview of what Google does with robots.txt files you can read here - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
I'd normally suggest looking at other ways of fixing the problem than just blocking 1,500 pages, but I see you've considered what would be required to fix the issues without removing the pages from the crawl and decided the value isn't there.
If WMT is telling you the pages are blocked from being crawled I'd believe that.
Try searching for a url that should be blocked in Google and see if it's indexed or do site:http://yoursitehere.com and see if blocked pages come up.
-
RE: Brand new domain with a lot of old links.
The FTC? Oh you crazy Americans
Yeah, might all go south, but meh, get nothing if you don't try : )
I'm just being an affiliate, not going to run one (even though there's a billion scripts to do so).
-
RE: Brand new domain with a lot of old links.
That's never going to happen, too many on clearly unmoderated sites.
If I was spamming myself (not that I'd ever advocate such behaviour) I'd just have kept up the velocity until I broke through but the fact they're a year or more old makes me wonder...
I'll probably sort the site out properly, add a bit of decent content and then decide whether to hammer it with more spam and/or build some decent links
-
Brand new domain with a lot of old links.
So I just bought http://penny-auctions.co/ a couple of days ago, ignore the fact it's not set up yet and generally just a mess, and something kept bugging me about why the domain authority kept showing as 25 in the SEOmoz toolbar.
Now initially I'd set it up as www. and with the trailing slash so the PA was 1, however after a bit of exploration it seems that someone has been building links to the non-www version of the domain for at least a year!
The site has never been owned before so I've now made the non-www version the default and have a 5 day old site with PA 35 and DA 25!
SEOmoz shows PA - 45 links, 20 domains and DA - 341 links, 113 domains.
Majestic Historic - 8598 links, 236 domains and Fresh - 160 links, 62 domains.
Brilliant! Except they're all spamtastic.
What do you think this will do for my future attempts at ranking?
Should I create pages that have links to them or just 301 them?
-
RE: Newbie - help me get started, please :)
Hi Daniel, you're getting a lot of good advice in this thread and I know it's a lot to take in at first.
I've never worked in beauty so don't know off the top of my head but the beginners guide does have a section on keyword research.
And here's a timely post on simplifying keyword research which gives a brief introduction to some of the tools you have at your disposal from SEOmoz.
-
RE: Newbie - help me get started, please :)
Could go a lot further wrong than the SEOmoz beginners guide.
However your question is very broad. Perhaps you have a specific thing you would like to achieve first?
-
RE: Graph ad groups by cost in Adwords
If I go into the adgroup I can see the cost by day no problem (just toggle graph options), but I want to do it a level up rather than collate them all individually.
Seems like something that should be possible
-
Graph ad groups by cost in Adwords
Easy one for someone I'm sure, but I can't figure it out.
In Adwords under campaigns "Widget" I have ad groups "Blue", "Green" and "Red" each with 3 ads in them.
How can I produce a comparative report on the cost per day of each ad group?
So I want something that looks like
        1st   2nd   3rd   ...
Blue    £12   £14   £9    ...
Green   £8    £11   £5    ...
Red     £9    £22   £16   ...

Possible?
-
RE: Can someone explain how a site with no DA, links or MozTrust, MozRank can rank #1 in the SERPs?
Well, I don't think there is any SEO going on. That site has not been made to be more accessible to search engines (or users), however still ranks number one.
The exact match title, domain age and legit info on the site appear to be the only significant factors, as I don't see much else going on. I'm not convinced they're doing anything dodgy though, just lucky.
Possibly an anomaly or perhaps as other factors catch up with it the site will drop in rank.
-
RE: Can someone explain how a site with no DA, links or MozTrust, MozRank can rank #1 in the SERPs?
Well, the simple answer would be that SEOmoz isn't Google.
Just because SEOmoz haven't crawled any relevant links, etc doesn't mean that Google hasn't.
It is a horrid little site though, the HTML version has "INSERT TEXT HERE" all over it. I would say activitysuper may be on to something with the domain age, plus the fact that it's for a real company (lots of contact info).
-
RE: Title case or lower case URL folders
If your CMS doesn't do it then you'll need a rewrite rule, but I'd recommend that you try and sort out the CMS in the first instance.
Are you using a common CMS?
-
RE: When providing search results for SEO purposes do you use the exact results in Google Adwords
Do you mean in your pitch?
If you're trying to explain what keywords to go after and why then I'd normally use exact simply to set expectations. Using broad often gives an inflated figure which the client would then expect you to deliver on.
If they understand SEO or are willing to have it explained to them, then both figures have their place in setting goals.
-
RE: When providing search results for SEO purposes do you use the exact results in Google Adwords
I don't really understand how that'd be used in a testimonial, maybe as a case study as to how they chose keywords.
Most testimonials would be of a form "100% increase in traffic" "100% increase in sales from search traffic" or similar. Being first for something with 33,000 searches a month but no relevance or conversions is pointless.
Ultimately testimonials are spin, so do whatever you feel like
-
RE: (not provided) in GA
Hadn't seen that, have a thumbs up
Is anybody actually seeing a large chunk of traffic coming in with not provided though?
I'm around 0.9% - 1% overall and 1.4% - 1.5% for English only sites.
-
RE: Ranking Tracker not correct?
4 point swing?
I just checked manually using proxies, depersonalization and language settings and I see you in 11th too, but the above article might give you some sort of insight.
Unless you're checking at the exact same time it's hard to tell.
I'm not particularly up to date on how Google's datacenters work these days either, but sometimes it can be different depending on which one you hit.
-
RE: SEO All in One Not Showing Up
Have you tried updating to the latest version of both?
Do you have <?php wp_head(); ?> in your header.php file?
-
RE: Keeping our old links
You've made the changes already, without making a note of what you've changed from > to?
Well, a) Doh! b) Alan's suggestion of looking for 404 errors in webmaster tools is a decent place to start. Add redirects for anything that's throwing up an error.
I would imagine it'll be a heavily manual process if you've already changed them, though I'm not sure what your CMS is capable of or if your new URLs were sufficiently thought through to allow you to create a rewrite rule
e.g. if the changes were something like
from http://example.com/?id=123&name=product&cat=stuff
to http://example.com/stuff/name/123
then you could add just one rule for the redirects -
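Not tested, but assuming the old URLs always carry those three parameters in that exact order, a single .htaccess rule along these lines could handle it:

```apache
RewriteEngine On
# Match /?id=123&name=product&cat=stuff (parameter order assumed fixed)
RewriteCond %{QUERY_STRING} ^id=([0-9]+)&name=([^&]+)&cat=([^&]+)$
# Redirect to /stuff/product/123; the trailing ? drops the old query string
RewriteRule ^/?$ /%3/%2/%1? [R=301,L]
```

If the parameters can appear in any order you'd need a separate RewriteCond per parameter instead of one combined pattern.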
RE: SEO All in One Not Showing Up
Version of WP? Version of the plugin? Has it ever worked? Why are you asking here and not on the WP support forums?
I'm going to fire in the dark and say you need to either add <?php wp_head(); ?> to your header or remove whatever's conflicting with it.
-
RE: ROR Sitemap
What does a Ruby site map look like? Surely Ruby just runs something like 'gen_url_list' and you can output it in sitemap.xml, no?
Ignore that, ROR is an XML format, lol.
Eh, no search engines support them as far as I know.
Why not test it by submitting one through WMT and seeing if Google accepts it and let us know the results?
-
RE: Google's weighting of Page Load speed
Well, surely you're now in an excellent position to tell all of us?!
I don't think the issue is whether going from 51 > 92 will help, but more whether you've gone from X seconds to less than 1.5 seconds.
In webmaster tools if you look at labs > site performance, you should see a graph of your time.
Can also use something like - http://tools.pingdom.com/ - to check how fast a page loads.
I would suggest collecting your own data (especially since you've already made the changes) and evaluating whether it's been worth it or not. You should also monitor things like conversions and bounce rate.
-
RE: Google showing malware attack, how to remove ?
Might take a couple of days unfortunately, though it should only be a few hours.
The malware error is not being produced by people coming to the site specifically, as in it's not being checked and generated each time.
Google itself is probably getting the data from stopbadware (I think), and you'll find it's only on certain browsers that subscribe, as well.
Rather than rely on solving the problem through WMT I had better success by tackling a similar problem through the stopbadware (or whatever site it is) channels.
Try accessing the site through bing/google on IE/FF/Chrome/Opera/Safari and see what advice is thrown up during the warning. It should give you a link to the appropriate site to fix things on.
In fact, try going here - http://stopbadware.org/home/reviewinfo - use the tool and then request a check.
-
RE: A Blog Structure Dilemma We're Facing...
Google now treats sub domains as a key element of the site
[citation needed]
Though I know you're talking about - http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
However, as far as I'm aware, there's no information yet as to how Google are changing the weighting of these links (or even if they are), so I'd still be wary of charging ahead with a subdomain
-
RE: Could somebody suggest a GOOD Wordpress XML sitemap generator?
I'm in agreement with Stefano; I've been using it for years and never had any problems that weren't my fault/settings.
Is the problem that it simply can't write to your server? I've had that before. Create the sitemap.xml and sitemap.xml.gz files manually via ftp, then play around with the permissions until the plugin can write to the files.
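If you have shell access, creating the files and setting permissions might look like this (run from the WordPress root; 664 is a starting guess, you may need to go as loose as 666 depending on which user PHP runs as):

```shell
# Create empty sitemap files for the plugin to write into
touch sitemap.xml sitemap.xml.gz
# Start restrictive; only widen to 666 if the plugin still can't write
chmod 664 sitemap.xml sitemap.xml.gz
# Confirm the files exist with the expected permissions
ls -l sitemap.xml sitemap.xml.gz
```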
If you can tell us exactly what the issue is we might be able to get it working