Hi Matt,
In your Crawl Diagnostics area, download the full CSV file. This will contain all of the pages on your site that have been crawled. You can filter out pages that do not have errors, should you need to.
Best,
Matt
Regarding revisit-after: it's widely considered a myth, introduced by one tiny search engine in Canada. There has never been any documented support for, or usage of, it by the major search engines.
Here's a link to another Q&A from a little earlier in the year - http://www.seomoz.org/q/meta-tag-revisit-after-useful
Regarding why your competitor is beating you in the SERPs: it could be any number of things... Site age does help, but it is not the be-all and end-all.
Cheers
Matt
Hi Sean,
Sure it is. Just run the page through SEOmoz's Open Site Explorer - http://www.opensiteexplorer.org/. Put in the exact URL and change the params to "only external", and you can see all of the backlinks to that page, plus each linking domain's authority, etc. It's a very good tool.
Cheers
Matt
Hi Brandon,
I had something very similar with my site very recently. Our main area of business is French property, and we realised that we had four areas of the site all directly targeting the same keywords. Our homepage was historically by far the strongest for the "French property" keyword, but it suddenly started plummeting down the rankings.
Since we re-targeted the other three areas of the site away from "French property", our homepage has surged back up the rankings, so there is definitely a danger in targeting the same keyword on multiple pages/areas.
The difference between your site and mine, however, is that we were targeting almost exact matches. You seem to be targeting slightly different keywords, so I would suggest that you are safe, on the whole. I mean, if you're selling Mars bars, it's going to be pretty difficult not to mention Mars bars on almost every page! One thing I would add: your homepage will more often than not rank highest for your site's overall area of business, so make sure your homepage is optimised for your most valuable keyword.
I hope that helps!
Matt
The easiest way to minimise downtime is to lower the TTL (time to live) on the relevant records with your DNS provider, so that when you switch the domain to point at the new server location, the change takes effect more quickly. The lowest (safe) value for the TTL is 3600 seconds, which in human terms is one hour. Once the switch is successful, I would recommend putting it back to the default, which is usually 86400 (24 hours).
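To illustrate, here's roughly what that looks like in BIND-style zone-file terms (the record name and IPs are placeholders; most DNS providers expose the same TTL field in their control panel):

```
; Before the move: drop the TTL so cached lookups expire within an hour
www    3600    IN    A    203.0.113.10
; Once the move has settled: restore the default 24-hour TTL
www    86400   IN    A    198.51.100.20
```

Remember to lower the TTL at least a day before the move, so the old 86400-second caches have time to expire.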
Hope that helps.
Matt
It's really hard to pin down three things, and all of my tips would cross over one another to some degree, but I'll wade in with mine anyway...
For me, it's roughly the same as yours, but perhaps my priorities would be slightly different:
1. Content. It's king. Write excellent-quality, valuable content and people will want to share it socially, link to you from their websites and, most importantly, users will read it and engage with it. This is the vast majority of your link building effort right here (which is why I won't include a bullet on link building).
2. Architecture, structure and on-page optimisation. If you have a good, flat architecture that is technically sound, marked up well and geared towards SEO best practices, you will reap the benefits.
3. Analysis. To the point of being anal retentive. Probably one of the least-mentioned parts of an SEO's job is to analyse everything. See what your competitors are doing well, see what niches you can fit into, research your own market, analyse technical mistakes/improvements, analyse how users engage with your site, how they navigate, what they do when they're here, how they found you, how else they could find you, and so on. Good research means you can gauge what you need to be doing better, what new things you should do, and when.
That's mine very much in a nutshell. I hope others will come along and share theirs too.
Matt
Hi David,
This does happen occasionally. The simple solution is to put a meta robots tag just before the closing head tag (</head>) in the template's code, as below:
<meta name="robots" content="noodp">
This will tell robots not to show the DMOZ (Open Directory Project) description for your pages.
Hope that helps.
Matt
Hi Daniel,
It's a tough choice, really. I have 80 categories in my classified ads site under 12 headings. I have taken a Gumtree-style approach though, and have spread them across the page (http://www.gumtree.com) rather than just having one long list down the page.
From an SEO point of view, yes, it would be good to cover off all of the potential search phrases, but not at the cost of duplicating categories. You need to think of it from a user's point of view - if you have (for example) "Cars for sale" and "Secondhand cars" as two separate categories, users may not know which section to look in (not to mention the keyword dilution). What's more, coming from someone who manages a classified ads site, it is really annoying (for admins and users) when users put items for sale in the wrong category. If you add multiple similar categories, this will quickly become an issue, in my opinion.
What we did instead was create dynamic, search-engine-friendly titles and headers, which eliminated the need to create multiple categories for different regions, or even for similar products. Whatever language or software you're using, it should be fairly easy to set this up. We also added "similar" searches on search results pages and product pages in order to get links to the more niche search terms. We're in the top three for all of our high-value key phrases with this approach.
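As a rough illustration of the idea - a sketch in PHP with hypothetical parameter names, not necessarily how any particular site is built:

```
<?php
// Hypothetical: build a dynamic, keyword-rich title and header from the
// search parameters, rather than creating a separate category per phrase
$category = isset($_GET['category']) ? htmlspecialchars($_GET['category']) : 'Classifieds';
$region   = isset($_GET['region'])   ? htmlspecialchars($_GET['region'])   : '';
$title    = trim($category . ' for sale ' . $region);
?>
<title><?php echo $title; ?> | Example Classifieds</title>
<h1><?php echo $title; ?></h1>
```

One template then covers "Cars for sale London", "Boats for sale Dover" and so on, without duplicating categories.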
Best,
Matt
Well, I guess my next question in that case is "did you buy or source links from a link farm?!" My understanding is that it's very rare for there to be manual intervention in this fashion, and if you have received a penalty, it's probably for good reason.
Having looked through OSE, there's nothing that's massively obvious (to me), but it will probably be more obvious to you. Bear in mind that the data in OSE could be some weeks old.
Cheers
Matt
Hi Lynn,
I once spoke to an SEOmoz staff member about this, and they told me that there is no logic to the way Roger crawls a website - it's completely random. I have looked through my last 10 or so CSVs and each time I have a different order. The first few links of my latest report are from the second level of architecture, so I would concur that it is at least randomly reported, if not randomly collected.
Matt
Hi Sarah,
The search engines won't index the nofollow pages even if those pages are linked to from elsewhere? - Incorrect. The search engines will index the pages if the crawler reaches them via another link, provided that other link is followed. This assumes the page being linked to doesn't have a meta robots noindex tag of its own.
No link juice will flow from the page with the (nofollow) links on? - Correct. Link juice will only flow through links that are followed.
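To make the distinction concrete, a minimal sketch (the URL is a placeholder) of the link-level attribute versus the page-level meta tag:

```
<!-- Link-level: this particular link passes no link juice -->
<a href="http://www.example.com/some-page" rel="nofollow">Some page</a>

<!-- Page-level: placed in the head, this stops the page itself being indexed -->
<meta name="robots" content="noindex">
```

The rel="nofollow" only affects that one link - other, followed links to the same page can still get it crawled and indexed.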
With regards to your general point about why your rankings have dropped since the redesign - did you change the overall architecture of the site? Did you change the URL structure? Or was it simply a refresh - a few bits moved around, the pages prettied up, etc.?
Matt
Well, this was my question really... Does Google consider the TLD as part of a query match? i.e. would pizza.vegas rank for the search term "pizza vegas", or just "pizza"? Does the .vegas carry any weight when Google looks to match the query? I would argue that pizza.vegas is not an EMD for the search term "pizza vegas".
Matt
Hi Lawrence,
It's hard to tell why that page only has a PageRank of 0. If you do some relevant searches in Google, do you see the page appearing in the SERPs? Having a PR of 0 doesn't necessarily mean it's not worth getting a link from them - the page could be new, they could have poor architecture, etc.
I would suggest that more important factors in link building include: contextually relevant webpages, non-spammy anchor text*, and not going overboard with loads of similar-looking links.
*If you're on a link-building mission across these directory pages, try not to link to the same page with the same anchor text over and over again. Google will spot this and will either give you a slap on the wrist or simply not pass any link juice to your website from those links - either way, it's pointless.
If you're in doubt, it is MUCH better to get only a couple of really valuable links from high-authority websites than it is to get hundreds of links from low-value websites.
Best,
Matt
Actually... Just found the answer here - http://www.seomoz.org/help/ose-terms-metrics
"Tweets: Total tweets and retweets of this URL since March 2010, including tweets of the URL with unique parameters added. Data provided by Topsy."
Best,
Matt
Hi Zora,
In my experience, it is easier to rank well with a subfolder than with a subdomain. With a subdomain, you are largely leaving it to rank on its own merits, and it will need almost as much SEO as your root domain. With a subfolder, more link juice seems to be passed down the line from the root, so it is much easier to rank. Again - this is just from my experience.
I wouldn't question why you have suddenly jumped onto page 1, though I suspect it is because you have moved the content to a subfolder. I would set up a 301 from the subdomain and, fingers crossed, Google will rank your subfolder in place of the subdomain at its next crawl.
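If you're on Apache, a minimal sketch of that 301 in .htaccess might look like this (blog.example.com and the paths are placeholders - swap in your real hostnames):

```
RewriteEngine On
# Hypothetical: send everything on blog.example.com to www.example.com/blog/
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]
```

The (.*) capture preserves the original path, so deep links to the old subdomain land on their equivalent subfolder URL.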
Cheers
Matt
Hi Menachemp,
In a word, no. Google is not stupid. Yes, these link farms do exist, but Google is very clued up and will penalise sites with links from known link farms. If a link comes from a site with absolutely no relevance or value to yours, it can actually have a negative effect on your SEO.
It is becoming increasingly important to get links from valuable sources; it's not just about quantity. I would say that 10 links from very valuable, reputable sources are better than 1,000 links from mediocre sources.
Cheers
Matt
I too wonder whether any of these new TLDs will make any difference to SEO at all. I've never really thought it makes much difference whether you are .com, .org, .net, etc., so I doubt any of these new ones will make any difference either. There are factors of significantly higher importance.
The only benefit I can see from an SEO point of view is if you buy one of these to create an exact-match domain where the .com is already taken.
If you registered best.university, would that have any benefit for the search phrase "best university"? Unlikely... In the same way that if you sold 'the best intercoms' and registered bestinter.com, I doubt very much you'd rank for the search phrase "best intercom" - the "com" isn't going to be treated as part of the domain name.
I do however think that one of the new TLDs could be great for branding, marketing and being memorable to users... Just to come at it from a non-SEO point of view for a second.
I'd love to hear from someone on SEOmoz that has bought one of these new TLDs and can share their experience.
Cheers
Matt
Hi,
I regularly use VB redirects to URLs with ampersands and have seen nothing but value being passed on in doing so.
I'm not a coding expert (so I would suggest you consult one in the first instance), but I would imagine it depends largely on the language you are writing the redirect in. For example, in PHP the & character has several special uses, so you may need to escape it with a preceding \ in a pattern, or use $ to anchor the pattern so it matches the query exactly.
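For what it's worth, a bare-bones PHP 301 to an ampersand URL looks like this (the URL is a placeholder; a sketch, not a drop-in):

```
<?php
// A minimal 301 to a URL containing an ampersand; inside a quoted
// PHP string the & needs no escaping at all
header('Location: http://www.example.com/results?make=ford&region=south', true, 301);
exit;
```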
The best advice I can personally give from my experience is - if the redirect works for a human, it will work for a bot.
Hope that helps.
Matt
Hi Patrick,
On the 27th of September there were two major Google updates - an EMD update and a subsequent Panda update. Seeing as you don't have an exact-match domain, it's possible that it was the Panda update that hit you.
Can I assume that the pages that dropped from the rankings were shopping results pages, or pages with pagination, whereas the pages that remain are static pages? The Panda updates address duplicate content, and paginated pages are a classic source of it, so I can imagine that a site like yours probably has a fair amount of what search engines would consider duplicate content. Do you get duplicate content errors in your SEOmoz crawl diagnostics? If that is what you're seeing, then I would recommend canonicalising your pagination pages. http://www.seomoz.org/learn-seo/canonicalization
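For the canonical tag itself, one common pattern is for each paginated or filtered variant to point back at the version you want to rank, from within its head (the URLs here are placeholders):

```
<!-- On http://www.example.com/widgets?page=2 -->
<link rel="canonical" href="http://www.example.com/widgets">
```

That tells the engines to consolidate the paginated variants into the one canonical URL.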
The above is just an idea, but given that I don't think it's a Penguin penalty (Penguin penalises backlinks - otherwise your above-mentioned fixes should have made some noticeable difference), it will be worth looking into Panda.
Best,
Matt
Hi Olivia,
Maybe Rand has set about you (translation here for all non UK folk) for ripping off the SEOmoz site?
In all seriousness, 1.7 MB is rather excessive for images. I would consider scaling these down.
I've never heard of any association between moving scripts down the page and increased page speed, so I doubt this is affecting it (PS - have you heard of Google Tag Manager?)
Is your new server equivalent in spec to your previous one? Lower capacity and bandwidth can play a small part in load speed, but probably not by that much... Just a thought, though.
Cheers
Matt
Hi Brendan,
Your assumption is correct. The entrance keyword shown for a page within the drilldown doesn't necessarily mean the visitor came directly to that page - it just means they ended up there during some part of their journey.
I'm not aware of any easy way of getting keyword data for direct-to-page entrances, but someone else on here might know... I'm sure there must be a way somehow.
Hope that helps!
Matt
Hi Greg,
I don't know of any way to do exactly what you're asking... I usually use the "inanchor" operator, because people often use the domain name of the website they're linking to as the anchor text. More often than not, I get much better results than with the inurl operator, which usually favours a completely irrelevant, low-level page on large newsy websites, e.g. bbc.co.uk/news/england/south/west/ducks/quack/my-keyword.
Try using:
inanchor:keyword or allinanchor:keywords
It's not perfect, but it's more accurate than inurl. A good crib sheet for operators is provided here - http://www.googleguide.com/advanced_operators.html. I can't confirm whether it is complete or not, though.
Good luck!
Matt
"Top URLs" doesn't mean GWT thinks your category page is more important, or ranks better. It just means that there are more occurances of that keyword on that particular page. The majority of my "Top URLs" are directory pages which, whilst they rank well for what they should, do not feature in SERPs for that keyword - my homepage does. If I shoe-horned in all of the mentions of my top keyword into my homepage, it would certainly be considered keyword stuffing, so would have an adverse effect.
It isn't something to worry about.
Best,
Matt
Hi,
Yes, more than likely. Is there a need for your website to always use a secure connection (https)? Some sites do manage to pull RSS feeds into https pages, but only those that know what they're doing.
Displaying http content (your RSS feed) in an https page can cause browsers to mistrust your site's content and, often, to display mixed-content warnings to users. Does this happen on your site?
Basically, having any http content in an https site is not best practice.
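If you keep the feed, the usual fix is to request it over https so nothing on the page is plain http - a sketch with placeholder URLs, assuming the feed host supports https:

```
<!-- Mixed content: an http resource inside an https page triggers warnings -->
<script src="http://feeds.example.com/widget.js"></script>

<!-- Fixed: request the same resource over https -->
<script src="https://feeds.example.com/widget.js"></script>
```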
Best
Matt
Hi William,
First off, I have never done something similar personally, but until someone either contradicts or confirms my thoughts, I will give you my opinion from a theoretical point of view.
I would do as you suggested first - create subfolders for the specific locations within one national site. Then I would redirect each page on the local domains to the corresponding page on the national site, if that isn't too much work. If it is too much work, I would just 301 the whole of each local domain to the subfolder that roughly correlates, so that every link to the old local domain is redirected to one new location (see the sketch below).
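A sketch of both options in Apache .htaccess on one of the local domains (the hostnames and paths are hypothetical):

```
RewriteEngine On
# Option 1: page-level - map an old local page to its national equivalent
RewriteRule ^plumbers\.html$ http://www.example-national.com/london/plumbers.html [R=301,L]

# Option 2: catch-all - send everything else on the local domain to its subfolder
RewriteRule ^(.*)$ http://www.example-national.com/london/$1 [R=301,L]
```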
Having subfolders on one national domain means that you will gather all of the incoming links from the local websites (and the value from their backlinks). Every one of the subfolders (assuming your architecture is good) will benefit in some way from all of those backlinks. Win-win. Your site should flourish with this approach.
With regards to having separate local sites, each with a link back to the national site: that only passes on a fraction of the link juice, and most of the value is retained by the local domain. I've always likened having lots of local websites to hedging your bets - hoping for local search value.
I think that with good architecture and a good backlink profile, you can easily rank better, even for local searches, with a national website.
I welcome other people's opinions though!
Best,
Matt