Hi David,
This does happen occasionally. The simple solution is to put a meta robots tag just before the closing </head> tag in the template's code, as below:
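The exact snippet didn't survive in this archive, but given that the goal is to block the DMOZ description, the tag would have looked something like this:

```html
<!-- Tell search engines not to use the DMOZ/ODP description for this page -->
<meta name="robots" content="noodp">
```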
This will block robots from showing the DMOZ directory description.
Hope that helps.
Matt
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Hi Mozzers,
We're currently in the process of a website redesign with a new CMS, and have the opportunity to change our URLs and structure. I would love some opinions as to what best practice would be.
A quick bit of background: the website is entirely about France. French property, living, holidays, forum - everything. Therefore, we're unsure how much to use the words France/French in the URLs.
Presently, we're running Classic ASP, which allows for one subfolder then a dynamic article ID. In my examples, I will use our activity holidays URL. At present this is /france-activity-holidays/DisplayArticle.asp?ID=12345. We know that DisplayArticle.asp?ID=12345 will simply become [article-title]; however, it's the preceding subfolders I would like some help with.
Here are our thoughts on the options available. Can you please vote as to which you think is the best?
My gut feeling is either number 2 or 5. Concise, good for UX, OK for SEO. However, there is very little information around that is relevant to our sector.
Thanks in advance!
Matt
Thanks for the responses all.
I've always had the suspicions that subfolders are the way to go, and will incorporate this into our development.
Thanks
Matt
Hi Mozzers,
We're in the process of re-developing and redesigning several of our websites, and moving them all onto the same content management system. At the moment, although the websites are all under the same brand and roughly the same designs, because of various reasons they all either live on a separate domain to the main website, or are on a subdomain. Here's a list of what we have (and what we're consolidating):
My question to you lovely people is: should we take this opportunity through the redevelopment of the CMS to put everything into subfolders of the main domain? Keep things as they are? Put each section onto a subdomain? What's best from an SEO perspective?
For information - the property database was put onto a subdomain as this is what we were advised to do by the developers of the system. We're starting to question this decision though, as we very rarely see subdomains appear in SERPs for any remotely competitive search terms. SEO for the property database is fairly non-existent; it only ever really appears in SERPs for brand-related keywords.
For further info - the forum and classifieds were under a separate brand name previously, so keeping them on separate domains felt correct at that time. However, with the redevelopment of our sites, it seems to make more sense to either put them on subdomains or subfolders of the main site. Our SEO for the forum is pretty strong, though has dwindled in the last year or so.
Any help/advice would be very much appreciated.
Thanks
Matt
Perhaps it made its way into the perks section? Is this what you're looking for? http://moz.com/perks
Best
Matt
Hi Mozzers,
Is anyone aware of a tool that will tell me how many outbound external links there are on my website? Basically, I have a theory that our website is littered with links to other websites, but need to know the approximate figure. As far as I can tell, none of the Moz tools tell me this?
Any help appreciated!
Cheers
Matt
Hi Daniel,
It's a tough choice, really. I have 80 categories in my classified ads site under 12 headings. I have taken a Gumtree-style approach though (http://www.gumtree.com), and have spread them across the page rather than just having one long list down it.
From an SEO point of view, yes, it would be good to cover all of the potential search phrases, but not at the cost of duplicating categories. You need to think of it from a user's point of view - if you have (for example) "Cars for sale" and "secondhand cars" as two separate categories, users may not know which section to look in (not to mention the keyword dilution). Moreover, speaking as someone who manages a classified ads site, it is really annoying (for admins and users alike) when users put items for sale in the wrong category. If you add multiple similar categories, this will quickly become an issue, in my opinion.
What we tried to do was to create dynamic search engine friendly titles and headers... Therefore we eliminated the need to create multiple categories for different regions, or even for similar products. Whatever language or software you're using, it should be fairly easy to set this up. We also added "similar" searches on search results pages and product pages in order to get links to the more niche search terms. We're in the top 3 for all of our high value key phrases with this approach.
Best,
Matt
Actually... Just found the answer here - http://www.seomoz.org/help/ose-terms-metrics
"Tweets: Total tweets and retweets of this URL since March 2010, including tweets of the URL with unique parameters added. Data provided by Topsy."
Best,
Matt
Hi Rich,
I would hazard a guess that the data comes from Followerwonk, since this was a recent acquisition of SEOmoz. Just a guess though...
Best,
Matt
Thanks Shane. My syntax wasn't perfect, but after messing about with a few different ones, and closing with [RP], I managed to get it to work.
Many thanks!
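For anyone hitting the same wall: the exact rule isn't quoted in the thread, but in ISAPI_Rewrite 2 syntax (which httpd.ini uses) a working version would look something like the sketch below - the query string is matched as part of the URL, the `.` and `?` need escaping, and [RP] marks a permanent redirect:

```
[ISAPI_Rewrite]

# Send the old dynamic article URL to its subfolder with a permanent redirect
RewriteRule /subfolder/DisplayArticle\.asp\?ID=12345 http://www.example.com/subfolder/ [I,RP]
```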
Hi folks,
Due to working on a site older than myself, I find myself in a position whereby I need to set up some redirects in the httpd.ini file. I had wrongly assumed this would work the same way as it does in htaccess but, alas, no dice.
There's nothing special about what I'm trying to do, but I think the expressions are what are confusing me.
I'm trying to redirect this example page - http://www.example.com/subfolder/DisplayArticle.asp?ID=12345 - to its subfolder, http://www.example.com/subfolder/.
Here's what I have:
[ISAPI_Rewrite]
# Redirecting old article to new location
RewriteRule ^/subfolder/DisplayArticle.asp?ID=12345 /subfolder/ [R=301,L]
Can someone please point me in the right direction?
Thanks
Matt
I read that as 350 backlinks to the page... If it is 350 out-going links from that page, then completely agree! 350 is mental.
Hi Sean,
Sure it is. Just run the page through SEOmoz's OpenSiteExplorer - http://www.opensiteexplorer.org/. Put in the exact URL, and change the params to "only external" and you can see all of the backlinks to that page, plus the linking domain's authority, etc. It's a very good tool.
Cheers
Matt
Hi Olivia,
Maybe Rand has set about you (translation here for all non UK folk) for ripping off the SEOmoz site?
In all seriousness, 1.7MB is rather excessive for images. I would consider scaling these down.
I've never heard of any association between moving scripts down the page and increased page speed, so I doubt this is affecting it (PS - have you heard of Google Tag Manager?)
Is your new server equivalent in spec to your previous server? Smaller capacities and bandwidth can play a small part in load speed but probably not by that much... Just a thought though.
Cheers
Matt
Hi Patrick,
On the 27th of September, there were two major updates from Google - an EMD update and a subsequent Panda update. Seeing as you don't have an exact-match domain, it's possible that it was the Panda update that hit you.
Can I assume that the pages that dropped from rankings were shopping results pages, or pages with pagination? Whereas pages that remain are static pages? The Panda updates address duplicate pages and pages with pagination, so I can imagine that a site like yours probably has a fair amount of what search engines would consider duplicate content. Do you get duplicate content errors in your SEOmoz crawl diagnostics? If that is what you're seeing, then I would recommend canonicalising your pagination pages. http://www.seomoz.org/learn-seo/canonicalization
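Canonicalising a paginated page is a one-liner in the head of each paginated URL. A sketch (the URLs are made up for illustration), assuming /shop/page-2 should consolidate to the main /shop/ page:

```html
<!-- On http://www.example.com/shop/page-2, point the engines at the main category page -->
<link rel="canonical" href="http://www.example.com/shop/">
```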
The above is just an idea, but given that I don't think it's a Penguin penalty (which penalises backlinks) - otherwise your above-mentioned fixes should have made some noticeable difference - it will be worth looking into Panda.
Best,
Matt
Sandip,
I'm giving you a thumbs up just because of your glorious moustache. Well done, sir.
(Your answer was pretty good too!)
Matt
Hi Dan,
I agree with Jarno - you're not really offering any valuable content to the user. The only unique content is a couple of images and a few changed words. If the server can pull through a couple of different images and change a few words, surely the server can request some unique text that will a) add value to the user and b) reduce the overall amount of templated content?
PS - As a pro member, you can have up to 5 campaigns... Why not set up a campaign for your competitor? I can almost guarantee that they've got a load of duplication errors too, judging by what they're doing.
PPS - Looking at your on-page optimisation, you don't seem to be targeting "DID Numbers"... Your competitors are though, hence why they are displaying in results for that search.
Best,
Matt
Agreed.
I too use a T as a button, similar to yours by the sounds of it! I've been wary of using it though because of their policy. However, we've been going with the T button for about 2+ years and we get 250K unique monthly visitors... I would have thought someone would have given us a slap on the wrist by now if it wasn't allowed, right?
I'm sure it's fine...
Matt
In theory, nope. URLs in these threads are nofollowed, so Google should not use any links here for ranking purposes. Besides, SEOmoz has great authority and reputation, so even if the links were followable, I don't think they would do any HARM. They probably won't help much either, unless your business is related to SEO, but they won't harm.
Best,
Matt
Jarno,
Twitter are actually a bit funny about the use of their imagery... See their policy here - https://twitter.com/logo - just so you don't accidentally get yourself into trouble.
Matt
Well, this was my question really... Does Google consider the TLD as part of a query match? i.e. Would pizza.vegas rank for the search term "pizza vegas" or just "pizza"? Does the .vegas hold any bearing/weight when Google looks to match the query? I would argue that pizza.vegas is not an EMD of the search term pizza vegas
Matt
Hi Matt,
In your Crawl Diagnostics area, download the full CSV file. This will contain all of the pages on your site that have been crawled. You can filter out pages that do not have errors, should you need to.
Best,
Matt
"Top URLs" doesn't mean GWT thinks your category page is more important, or ranks better. It just means that there are more occurrences of that keyword on that particular page. The majority of my "Top URLs" are directory pages which, whilst they rank well for what they should, do not feature in SERPs for that keyword - my homepage does. If I shoe-horned all of the mentions of my top keyword into my homepage, it would certainly be considered keyword stuffing, and so would have an adverse effect.
It isn't something to worry about.
Best,
Matt
Hi David,
It's probably caused by the fact the website has a Spanish language subdomain, yet you only employ a root level (www.) analytics code across the root and subdomain. Within analytics, if you go to the Admin, Tracking Info, you should be able to toggle a setting to allow for subdomains. Generate this new script and place it into the source and that should do the trick.
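With the classic ga.js tracker of that era, the subdomain-friendly snippet generated by that setting looked roughly like this (a sketch - the UA code is a placeholder and example.com stands in for the real domain):

```javascript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
// Share the tracking cookie across www. and es. (and any other) subdomains
_gaq.push(['_setDomainName', '.example.com']);
_gaq.push(['_trackPageview']);
```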
Cheers
Matt
PS - it's minus 5 here in Bath... Brrr.
I too wonder whether one of these new TLDs will make any difference to SEO at all. I've never really thought it could make much difference whether you are .com, .org, .net, etc so I doubt any of these new ones will make any difference either. There are factors with significantly higher importance.
The only benefit I can see from an SEO point of view is if you buy one of these to EMD, and the dotcom is already taken.
If you registered best.university, would that have any benefit on the search phrase "best university"? Unlikely... In the same way that if you sold 'the best intercoms' and registered bestinter.com, I doubt very much you'd rank for the search phrase "best intercom". The com isn't going to be associated with the root domain name.
I do however think that one of the new TLDs could be great for branding, marketing and being memorable to users... Just to come at it from a non-SEO point of view for a second.
I'd love to hear from someone on SEOmoz that has bought one of these new TLDs and can share their experience.
Cheers
Matt
Regarding revisit-after: it's widely considered a myth, developed by one tiny search engine in Canada. There has never been any documented support/usage by the major search engines.
Here's a link to another Q&A from a little earlier in the year - http://www.seomoz.org/q/meta-tag-revisit-after-useful
Regarding why your competitor is beating you in SERPs: it could be any number of things... Site age does help, but it is not the be-all and end-all.
Cheers
Matt
Hi,
I don't think it will hurt per se, but I would like to know why they would want to do that?
The title tag is quite important real estate, so I would be inclined to use those 10 or so characters more effectively. Also, if people see the telephone number in SERPs, that's a click your client isn't going to get through to their website.
Matt
Hi Hugh,
Can you link us an example or two of your duplicate pages? It's quite hard to tell what might be causing it until I understand what your setup is like.
Thanks
Matt
Hi William,
First off, I have never done something similar personally, but until someone either contradicts or confirms my thoughts, I will give you my opinion from a theoretical point of view.
I would do as you suggested first - create subfolders for the specific locations onto one national site. Then I would redirect the local domain pages to the specific page on the national site, if this wasn't too much work. If it is too much work, I would just 301 the whole of the local domain to the subfolder that roughly correlates. This would just redirect any link from the old local domain to one new location.
Having subfolders on one national domain means that you will have all of these incoming links from the local websites (and the value from any backlinks). Every one of the subfolders (assuming your architecture is good) will benefit in some way from all of the backlinks. Win-win. Your site should flourish with this approach.
With regards to having separate local sites with a link back to the national site, this will only pass on a fraction of the link juice value, and most would be retained by the local domain. I've always likened having lots of local websites to hedging your bets, hoping for local search value.
I think with good architecture and a good backlink profile, you can easily rank better, even for local searches, with a national website.
I welcome other people's opinions though!
Best,
Matt
It's really hard to pin down three things, and all of my tips would cross over one another to some degree, but I'll wade in with mine anyway...
For me, it's roughly the same as yours, but perhaps my priorities would be slightly different:
1. Content. It's king. Write excellent quality, valuable content and people will want to share it socially, link to you from their website, and most importantly, users will read it and engage with it. This is the vast majority of your link building effort right here (which is why I won't include a bullet on link building).
2. Architecture, structure and on-page optimisation. If you have a good, flat architecture that is technically sound, marked up well and geared towards SEO best practices, you will reap the benefits.
3. Analysis. To the point of being anal retentive. Probably the least-mentioned part of an SEO's job is analysing everything: see what your competitors are doing well, see what niches you can fit into, research your own market, analyse technical mistakes/improvements, analyse how users engage with your site, how they navigate, what they do when they're there, how they found you, how else they could find you, and so on. Good research means you can gauge what you need to do better, what new things you should try, and when.
That's mine very much in a nutshell. I hope others will come along and share theirs too.
Matt
Hi Lawrence,
Your research is quite interesting. It sounds like there's benefit from adding links to this type of directory in the Netherlands regardless of the SEO benefit - if the Dutch are using these types of sites as their start pages, then you may actually gain a fair amount of traffic.
Although we have a few backlinks from Dutch directory pages, I cannot say that this was part of any strategy similar to yours. I'd be really interested to know if it works for you, and also whether you get a fair bit of traffic from them.
Please post a blog or something about it in a few months - it would be very much appreciated!
Matt
Well, in theory it's easier to create an http page than it is to create an https page (because of all of the added encryption when sending data via SSL). I would suggest that you discuss this with your designer. If you can get away with non-secure pages, especially if you've got RSS content you're looking to push out, then definitely do.
Best of luck,
Matt
Hi,
Yes, more than likely. Is there a need for your website to always have a secure connection (https)? Some clever websites can pull RSS feeds into https pages cleanly, but only those that know what they're doing.
Displaying http content (your RSS feed) in an https page can cause some browsers to mistrust your site's content, and they will often display warnings to users. Does this happen on your site?
Basically, having any http content in an https site is not best practice.
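One common way around this at the time (an illustrative sketch, not from the original exchange - the URL is made up) was to reference the feed or script protocol-relatively, so it inherits http or https from the parent page:

```html
<!-- Loads over https when the page is https, over http otherwise -->
<script src="//feeds.example.com/widget.js"></script>
```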
Best
Matt
Hi Lawrence,
It's hard to tell why that page only has a PageRank of 0. If you do some relevant searches in Google, do you see the page appearing in SERPs? Having a PR of 0 doesn't necessarily mean it's not worth getting a link from them - the page could be new, they could have poor architecture, etc.
I would suggest that the more important factors in link building include: contextually relevant webpages, non-spammy anchor text*, and not going overboard with loads of similar-looking links.
*If you're on a linkbuilding mission across these directory pages, try not to link to the same page with the same anchor text over and over again. Google will spot this and will either give you a slap on the wrist, or just not pass any link juice to your website from the links - either way, it's pointless.
If you're in doubt, it is MUCH better to get only a couple of really valuable links from high-authority websites than it is to get hundreds of links from low-value websites.
Best,
Matt
Hi Gary,
Could be any number of things. Can you tell me what sort of work you are doing on the client's website, especially regarding link building? Do you have access to the client's Webmaster Tools? Is there a notification of a penalty in there? Also, has the drop in rankings only occurred on Google, or have they dropped off of Bing and Yahoo results as well?
If you can give a bit more info, I'm sure a lot of people will be able to help.
Matt
Hi Zora,
In my experience, it is easier to rank well for a subfolder than it is for a subdomain. With a subdomain, you are mostly leaving it to rank for itself, and it will need almost as much SEO as your root domain. With a subfolder, it seems that more link juice is passed down the line from the root, so it is much easier to rank. Again - this is just from my experience.
I would say that you shouldn't question why you have suddenly jumped onto page 1, though I suspect it is because you have moved the content to a subfolder. I would set up a 301 from the subdomain and, fingers crossed, Google will rank your subfolder in place of the subdomain at its next crawl.
Cheers
Matt
Hi Ketan,
One of our brands is "FrenchEntrée" and we rank for all permutations in Google - French Entree, FrenchEntree, French Entrée, etc - even though we never separate the two words. 'French entree' is a term widely used and means a starter (of a meal), so it certainly can be done. I think the reason that we rank for both is because of the general usage of the keyword "French" around the site, and also because of the domain reputation we have built up. If either of the two words in your client's name is individually considered a relevant keyword, then I think you're more than entitled to use it in places on its own.
What will also help over time will be the backlinks from other sites. Some will write "Great Company", some "GreatCompany", and some "Other anchor text, from GreatCompany". This co-occurrence will hopefully build up value for both versions of the brand name in search.
I would suggest that you never de-value the brand by putting a space between the words - stick to the brand's real name and only ever use this. This will be best from a user's point of view, and eventually the search engines will catch up.
Good luck.
Matt
The easiest way to minimise downtime is to lower the "TTL" (time to live) within your DNS provider, so that when you make the switch to point the domain to the new server location, this will happen quicker. The lowest (safe) amount you should have for TTL is 3600, which in human terms is 1 hour. Once the switch is successful, I would recommend putting it back to the default, which is usually 86400.
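In a BIND-style zone file, the change is just the TTL field on the relevant records. A sketch, with example.com and documentation IP addresses standing in for the real values:

```
; Before the move: drop the TTL to 3600 so caches expire within an hour
www  3600   IN  A  203.0.113.10    ; old server

; After the move has settled: restore the default TTL
www  86400  IN  A  198.51.100.20   ; new server
```

Remember to make the TTL change at least one old-TTL period before the actual switch, so resolvers have picked up the shorter value by the time the IP changes.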
Hope that helps.
Matt
Hi,
The diagnostics are still reporting the 404 error because this page - http://www.salustore.com/protocollo-nanogen - still has a link to this page - http://www.salustore.com/capelli/nanogen-acquamatch.html - underneath the product header "Aquamatch". If you remove the broken link from the product page, the 404 error will disappear from your diagnostics.
Matt
Hi David,
Iain is correct that it must be in the head. No ifs or buts.
Can you tell me what language or programme you are using for your website? There are pretty easy ways of making the page title dynamic based on the article title.
Matt
Hi Lynn,
I once spoke to an SEOmoz staff member about this, and they told me that there is no logic to the way Roger crawls a website - it's completely random. I have looked through my last 10 or so CSVs and each time I have a different order. The first few links of my latest report are from the second level of architecture, so I would concur that it is at least randomly reported, if not randomly collected.
Matt
Do you know what, I knew that was going to be your response
The number of times I have heard of "reputable" SEO companies doing the same old things is truly horrifying. If they are like the others I have heard of, the likelihood is that they have made it quite easy for you to disavow the links by adding approx 20+ links across various pages from each domain. Therefore, if and when you disavow, just disavow the whole domain and that's all of them done.
This is, of course, only if you are left with no choice but to disavow.
Best of luck Anthony!
Matt
Hi Anthony,
It sounds as though your client may have used an "SEO Agency" before... But perhaps not a very good one.
It's always quite hard to tell whether backlinks are harming a site or not. Has your client had any drop in rankings/traffic since Google Panda first came onto the scene around 18 months ago? Or with any subsequent Panda update since? A quick way to work this out is by comparing traffic in analytics against SEOmoz's Google algorithm change calendar here - http://www.seomoz.org/google-algorithm-change. If there's no noticeable drop in rankings/traffic, then the site has either: always been negatively affected by these links, not yet been negatively affected by them, or won't be negatively affected by them. This makes your task a little harder!
Instead of running around and removing links, you can disavow links (see SEOmoz blog post here - http://www.seomoz.org/blog/googles-disavow-tool-take-a-deep-breath). Take Dr Pete's advice though, only disavow if you're sure you have been negatively affected by backlinks.
Hope that helps.
Matt
Hi,
I use VB redirects to URLs with ampersands regularly and cannot report anything but noticeable value being passed down in doing so.
I'm not a coding expert (so I would suggest that you consult one in the first instance), but I would imagine it depends largely on the language you are writing the redirect in. For example, in PHP there are quite a few uses for the & character, so you may need to escape the & with a preceding \ or use $ to match the query exactly.
The best advice I can personally give from my experience is - if the redirect works for a human, it will work for a bot.
Hope that helps.
Matt
Hi Nic,
When you say that only 107 pages have been indexed, is this in your SEOmoz Crawl Diagnostics? It can take a week or so for all of your pages to be crawled fully, so don't panic too much.
There are two ways of knowing which pages have been indexed by Google:
1. Do a "site:" search in Google, e.g. site:mydomain.com. This will display results only from your website. Take a look through and make sure all of your major pages are displaying.
2. Google Analytics (only if your site has been around a little while). Change the date range to all time, go to Traffic Sources > Sources > Organic, then change the primary dimension (a link just above the grid of results) to Landing Pages. The number of rows displayed roughly correlates to the number of pages that rank and have received an organic entrance.
Hope that helps.
Matt
Hi Sarah,
"the search engines won't index the nofollow pages even if those pages are linked to from elsewhere?" - Incorrect. The search engine will index the pages if the crawler reaches them via another link, provided that other link is followable (and assuming the page you are linking to doesn't have a meta robots noindex tag).
no link juice will flow from the page with the (no follow) links on? - Correct. Link juice will only flow through links if they are followable.
With regards to your general statement about why your rankings have dropped since the redesign - have you changed the overall architecture of the site? Change the URL structure? Or is it simply a refresh, moved a few bits around, prettied up the pages, etc?
Matt
Well I guess my next question in that case is 'did you buy or source links from a link farm?!' My understanding is that it's very rare for there to be manual intervention in this fashion, and if you have received a penalty, it's probably for good reason.
Having looked through OSE there's nothing that's massively obvious (for me), but will probably be more obvious for you. Bear in mind that data in OSE could be some weeks old.
Cheers
Matt