Best posts made by simon_realbuzz
-
RE: Is it better "nofollow" or "follow" links to external social pages?
I can't see any reason why you would want to nofollow links to your own social networking pages. They are very much related to your site, so why not pass PageRank to them? As Takeshi rightly points out, if you nofollow them, any PageRank they might have received from your home page simply evaporates.
-
RE: Responsive design or mobile website for SEO
I think the best thing to consider is what is best for your users rather than a search engine. If having a mobile version improves their user experience then great.
Google have expressed a preference for responsive over mobile, but again you have to ask: are your users best served by a mobile (and possibly scaled-back) version of your site?
There have been some great blogs on Moz on the whole mobile issue which I think you should check out.
http://moz.com/blog/seo-of-responsive-web-design
-
RE: Can a home page penalty cause a drop in rankings for all pages?
Have you had a look in Webmaster Tools for any issues? Any unnatural links warnings? Have you lost any significant backlinks? There could be any number of reasons, be it Penguin or Panda - without more info, people will struggle to offer suitable solutions.
-
RE: Specific page URL in a multi-language environment
The language used in your URLs should definitely reflect the country/language you are targeting. If, for example, you saw a search result in the US SERPs but the URL was in Spanish, I'm pretty certain most people would be less inclined to click on the result.
-
RE: Should I noindex the site search page? It is generating 4% of my organic traffic.
This was covered by Matt Cutts in a blog post way back in 2007, but the advice is still the same, as Mik has pointed out. Search results could be considered thin content that is not particularly useful to users, so you can understand why Google want to avoid showing search results within search results pages. Certainly I block all search results in robots.txt for all our sites.
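As a minimal sketch of the sort of robots.txt rule I mean (the /search/ path is an assumption - substitute whatever path your site search results actually live under):

```text
# Block all crawlers from internal search result pages
User-agent: *
Disallow: /search/
```
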
You may lose 4% of your search traffic in the short term, but in the long term it could mean that you gain far more.
-
RE: What To Do With Content From SEO Perspective
Personally, I'd focus on your own site and on keeping your content high quality and unique. In the past we have had our fingers burned by syndicating content to other sites and finding that they end up outranking us for our own content. While the referral traffic can of course be useful, you have to weigh up whether its benefits outweigh the negative impact on your own site's ability to rank well.
If you decide that referral traffic would still be an avenue you wish to pursue then perhaps you could consider providing content that is considerably different from the version that you keep on your own site or at least making it just an abridged version. Also make sure you publish the content first on your own site before allowing it to be published elsewhere.
-
RE: Will Google View Using Google Translate As Duplicate?
Google will not view translated content as duplicate; rather, it is likely to view it as spammy, since auto-translations are far from perfect and the translated content often reads as a very 'broken' version of whatever language you are translating into. Ideally, while I appreciate it can be pricey, do not use auto-translate tools but instead have content translated properly by a professional.
-
RE: SEOMoz advice on only buying domain if .com version is available
It really depends which markets your client is trying to target. If their target market is UK only then the .co.uk is perfectly fine. If the .com is available then it would do no harm to purchase it to save a competitor getting hold of it and outranking you for the domain/brand name. You could simply redirect the .com to your .co.uk site.
Alternatively if the target is wider than the UK then it becomes increasingly difficult (though not impossible) to rank with a .co.uk in other countries. Hope this helps.
-
RE: Increase in 404's
Redirecting multiple pages to one page is fine so long as there is relevance; after all, you want to send your users somewhere that is of use to them. What you don't want to do is simply point all 404s to one page such as your home page.
I think the whole redirect issue was expertly covered by Cyrus recently in this great blog post.
-
RE: Duplicate title tags in a pagination case (not search results)
You could differentiate each of the title tags with 'page 2', 'page 3' etc., but in addition, if you really want to capitalise on the strength of the pages, you could make use of the rel="next" and rel="prev" markup to give a strong indication to Google that these are a paginated series. This will consolidate the indexing properties across all the pages.
Look at the following example, lifted from Google Webmaster Central.
Let’s say you have content paginated into the URLs:
http://www.example.com/article?story=abc&page=1
http://www.example.com/article?story=abc&page=2
http://www.example.com/article?story=abc&page=3
http://www.example.com/article?story=abc&page=4
On the first page, http://www.example.com/article?story=abc&page=1, you'd include in the <head> section:
<link rel="next" href="http://www.example.com/article?story=abc&page=2" />
On the second page, http://www.example.com/article?story=abc&page=2:
<link rel="prev" href="http://www.example.com/article?story=abc&page=1" />
<link rel="next" href="http://www.example.com/article?story=abc&page=3" />
On the third page, http://www.example.com/article?story=abc&page=3:
<link rel="prev" href="http://www.example.com/article?story=abc&page=2" />
<link rel="next" href="http://www.example.com/article?story=abc&page=4" />
And on the last page, http://www.example.com/article?story=abc&page=4:
<link rel="prev" href="http://www.example.com/article?story=abc&page=3" />
By all means make each of the titles unique, but also connect them all in the series by implementing rel="next" and rel="prev".
-
RE: Giving Follow Links is good for SEO ?
Links out to quality useful sites that are relevant to your niche will help your SEO efforts. There was a great WBF by Cyrus Shepard from a couple of years ago which is well worth checking out and is still very relevant today.
-
RE: Link Building Tactic Advice
I would be very wary of any such reciprocal link scheme, as this would be a breach of Google's guidelines. Any link exchange of this nature could have a negative impact on your rankings.
Take a look here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66356
-
RE: Why the archive sub pages are still indexed by Google?
No one can say with any certainty, as it varies from site to site and depends on how frequently your site is crawled, so all I can say is that patience is key. I've known some pages on our sites to be removed from the index within a week, while others take far longer.
-
RE: Are sites that "smell of SEO" being demoted?
There definitely seems to be a sea change whereby much of the focus now is on providing good content and a great user experience. Getting your audience to remain on your site for longer, share your content, and potentially return to your site again is what should pay dividends. That's not to say that certain basic SEO principles should be ignored, but user experience is surely the key.
-
RE: 301 vs 302
I'm in complete agreement that a 301 instead of a 302 is best practice here, but I wanted to point out that 302s do not necessarily pass no PageRank at all. Check out this test study by Geoff Kenyon, which dispels the theory that 302s pass no PageRank whatsoever; but clearly a 301 is preferable in most cases.
-
RE: Site for my clients to log in and see their traffic, etc.
This may seem like an obvious answer, but what is wrong with using Google Analytics? All the info about traffic, referring sites and more is in one place. If you want to see ranking info, why not use a Chrome extension such as PageRank Status?
-
RE: Best Way to Use Date in Title
Why not differentiate each of your titles by the actual content so that you include relevant keywords in your titles?
For example, if it's a blog post about 'Beauty Tips for Women over 40', then make that the title rather than calling the post 'Beauty Industry News - today's date'. The page title is an important ranking factor, so make sure your title gives both users and search engines a clue as to what the content of the blog post actually is.
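As a sketch of the markup (the site name here is a placeholder, and you'd keep the whole thing short enough to avoid truncation in SERPs):

```html
<title>Beauty Tips for Women over 40 | YourBlogName</title>
```
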
-
RE: Translating URLs worth it?
I definitely agree that translating the URLs is a must. Having keywords in your URL can aid your SEO efforts, so it makes sense for the keywords to be in the language of the country you are targeting with the translated content. Certainly, if I looked at the SERPs for an English-language search but found a URL written in a foreign language, I would not click on that result, as it instantly looks rather strange or even spammy.
-
RE: Are Back Links King
Back links are just one part of a very big equation. Try to look at it from the standpoint of the quality of your content. Is your content better than your competition's? Would people be more inclined to share your content, and perhaps link to it, than your competitor's page? Would a user clicking on your page actually hang around long enough to view your content instead of hitting the browser back button?
As Highland rightly says, 'build quality content first', and this is good solid advice. Getting hung up on DA and PA isn't where you should be concentrating your attention; instead, concentrate on good-quality unique content and the links should naturally follow.
-
RE: "Hreflang=x" tag and multinational websites
Take a look at this video from Matt Cutts outlining their position on translated content.
In addition, as to the question of canonical element and hreflang you'll see that Google removed this portion from their guidelines with an update saying 'to simplify implementation, we no longer recommend using rel=canonical.' Check out the piece for the current position.
I also had advice direct from Christopher Semturs from Google who said directly to me "the golden rule I personally recommend to everybody using hreflang: In doubt, don't use rel-canonical."
-
RE: Solve duplicate content issues by using robots.txt
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical version on your secondary websites, thereby eliminating the duplication?
For example, on each duplicated page on your secondary website you would add the following to the head to reference the primary page (the URL here is a placeholder for your primary page's URL):
<link rel="canonical" href="http://www.primarysite.com/page-name/" />
-
RE: Modifying Content to Avoid Duplicate Content Issues
I'm presuming that, because you are willing to largely re-write this content, you actually want to rank for it yourselves. As such, a sizeable re-write would be required, especially of the blog titles, and you may even want to target different keywords and long-tail keywords. All the while, bear in mind that quality and uniqueness are essential.
Alternatively, you could just use the content from the US-based blog directly on your Canadian blog but specify the US blog URL as the canonical version. This would avoid any duplication issue and yet give you some useful content on your site, even if you would never rank for it in SERPs.
-
RE: How important are image file names
From my own experience, I would say the images that send us the most search traffic are those whose file names have been labelled properly rather than autogenerated, such as 33456789.jpg. I would certainly suggest there is some correlation between filename and search performance, although of course there are other factors to consider. As a recommendation, I'd certainly suggest file names are important, although I couldn't put a measure on quite how important. There's certainly no reason I can see for us to stop labelling filenames correctly.
-
RE: Modifying Content to Avoid Duplicate Content Issues
You definitely have the right approach. It's amazing how many people think they ought to rank for content even when it is duplicated on other sites. Adding some value to the content and making it more or less unique will only pay dividends.
-
RE: Inconsistent page titles in SERP's
It may be as simple as the fact that Google doesn't always display the page title that you want to display. As pointed out in this article Google can change the title it displays in SERPs depending on the terms actually searched for.
-
RE: Webmaster Tools and Domain registration
Presumably all your domains redirect to abctravel.com and this is the domain presented to users, in which case just add the main domain to Webmaster tools.
-
RE: Canonical tag refers to itself (???)
I agree, this is not an issue. It merely tells search engines which page is the 'canonical' version you want displayed in search. In fact I've seen it recommended that sites can use a self referential canonical in order that it affords some 'protection' for your pages from content scrapers which automatically steal content.
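For illustration, a self-referential canonical is simply the page pointing at its own URL in the head (the URL here is a placeholder):

```html
<!-- On http://www.example.com/widgets/ itself -->
<link rel="canonical" href="http://www.example.com/widgets/" />
```

The thinking behind the 'protection' is that if a scraper copies your markup wholesale, the canonical still points back at your original URL.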
-
RE: Rel alternate implementing with x default
In my view there is no issue with you using your default for en-us. It may not be necessary to reference en-us in the code as Google will know that www.mysite.com is your default for English speakers in the US in any case, but there is certainly no harm in leaving that line in there.
As to your second question, I always reference every alternative version of a page in the hreflang including the page itself, so in the example you have included code on the page on the UK site. This is perfectly fine and is how I have always implemented hreflang when dealing with pages that are similar but intended for different countries/languages.
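As a sketch of what I mean (the domains here are placeholders), every version of the page carries the same full set of annotations, including one for itself:

```html
<!-- Included identically in the head of both the UK and US versions of the page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
```
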
-
RE: Webmaster Guidelines Change History
To my knowledge, there isn't anywhere that changes to the guidelines are documented. You can track many of the changes by referring back to posts on the Webmaster Central blog http://googlewebmastercentral.blogspot.co.uk/ - quite often a blog post will have been published as a result of changes to the Webmaster Guidelines.
-
RE: Redesigning a Site - What Optimizations are "Must Haves"?
If you have access to the site analytics I would advise that you take a look at the habits of users coming to the site. Have a look at the reasons they are coming to the site, what pages are they visiting, and what pages are they exiting on. Only by knowing what the user intentions are can you truly ensure that the site is designed with the user in mind.
Make sure the very things that visitors want to see are as accessible as possible. Of course making a site responsive will pay dividends, but you still have to make sure that the site's structure makes it easy for users to find the things they are seeking.
-
RE: How often should I upload a new sitemap in google webmasters?
Hi Scott, depending on the volume of new pages being created, you could always make use of the 'Fetch as Google' facility, found in the 'Health' tab of your Webmaster account, to make sure that your most important new pages are indexed.
You could periodically add a new sitemap, but on a daily or perhaps weekly basis, submitting the important pages for indexing through your Webmaster account could be the way to go.
-
RE: Domain Authority Droped 4 Points
One of our sites dropped 3 points of DA at the last Mozscape index, so you are not alone. And this occurred at a time when our search traffic has been growing considerably and we are performing better for the majority of our keywords than we have for some time. For this reason I'm not unduly worried, and I consider it just to be a fluctuation which will probably improve in due course.
In fact, having looked at our competitors which I've been tracking for comparison, I've also seen that all of them have also experienced a similar fall in this latest index update.
-
RE: Best Way to Use Date in Title
Hi Philip,
By all means add the last part if you wish, to give you some consistency in the series, but make sure you append it at the end of the title. One thing to be mindful of is not to make the title too long, or it may end up being truncated by search engines. This SEOMoz guide should help.
-
RE: Targeting multiple geo locations with single site
Have you created a sitemap for each of your sub-directories and set a geographic target for each in GWT? In addition, depending on your content, and if you are duplicating it across the three countries, this may be an opportunity to make use of hreflang, so that you can tell Google which pages are alternatives of one another and the correct ones are served in SERPs.
-
RE: A problem with duplicate content
301 redirects are fine. They pass the majority (but not all) of the link authority. It is 302 temporary redirects that you should avoid as these are intended as temporary and therefore link authority is retained by the original URL.
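If you're on Apache, a 301 can be set up with a one-line rule in .htaccess (the paths and domain here are placeholders). Note that the Redirect directive defaults to a 302 if you omit the status, which is exactly what you want to avoid:

```apache
# Permanently redirect the duplicate URL to the canonical one
Redirect 301 /old-duplicate-page/ http://www.example.com/canonical-page/
```
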
-
RE: Hreflang hindering performance?
I'm afraid I can't offer any proof as such. I know there is an alternative way of implementing hreflang through xml sitemaps so you can avoid having to add the extra lines in the code within your pages. It's not something I've ever implemented but you can find more about it in the Using hreflang in sitemaps section of this article. Might be worth looking into.
-
RE: SMO - Author Image
I agree very much with Tom's view - consistency brings you brand awareness and users are drawn to authors they recognise. I know that I generally am. I can't really see any benefit in using different images for your social media profile so would suggest using the same ones.
-
RE: Correctly Dealing With Redirects
Is it possible that some of the pages have inadvertently had a noindex meta tag added to them or that certain pages are blocked in error by your robots.txt? Without your site URL it is very difficult for any of us to ascertain the issue.
-
RE: Which factors are effect on Google index?
Have you submitted your URLs in an XML sitemap to the major search engines? In addition, check your robots.txt - it may be that you are preventing certain pages from being crawled.
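For reference, a minimal XML sitemap is just a list of <url> entries following the sitemaps.org protocol, which you can then submit in Webmaster Tools (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page-to-index/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```
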
-
RE: Url for Turkish, Russian, Chinese, Arabic, Vietnamese and Arabic websites
Definitely go for translating the URLs into their relevant languages. This was discussed in a previous thread which is worth looking at.
-
RE: Solve duplicate content issues by using robots.txt
Using robots.txt is perhaps not the best way of doing it. Using the canonical or a noindex meta tag would likely be best. I think the reasons for this are best summed up in this article which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
-
RE: How to recognize Panda, Penguin or Unnatural Links Penalty ?
I wouldn't consider submitting a reconsideration request until you are satisfied there are no issues with your site. Take a look at your backlink profile and see if there are any links that look suspicious, and work through removing these, even if it means contacting other webmasters to have them removed. Make sure you document all of this in a spreadsheet and submit it as part of your reconsideration request, but only when you have really addressed the issues that Google have highlighted.
-
RE: Should my client copy and paste his blog posts onto other professional sites?
Having your content on another site can be good from a brand awareness point of view and could send you useful referral traffic but you do run the risk of causing duplicate content issues.
If you wish to rank for your own content in SERPs then you run the risk of the Q&A site outranking you for your own content. Much depends on how valuable the exposure on the other site is to you.
You could try to implement a number of strategies to prevent problems, including:
- Having the other site provide a link back to your original blog post (to signal you are the originator, e.g. 'This article first appeared...')
- Having the other site 'noindex' the pages where the blog posts are duplicated (unlikely that they will do it because they probably want to rank for it)
- Have the other site specify a cross-domain canonical to your content URL (again unlikely they will do it)
- Making sure the post is published on your site first before syndicating it to other sites.
The more sites you syndicate to the greater the risk, especially if they have greater authority than your site.
-
RE: Should you change Temporary redirects 302's to a 301 even if page is not important/intended for ranking ?
Choosing to leave a redirect as a 302 is not a major issue, as it's not going to have any major effect other than, as you rightly say, preventing the full flow of link juice to the new page. However, it is worth considering that while you may not wish to rank for this page, you are unnecessarily wasting link juice, however minimal.
In theory, if you're not overly concerned about ranking for this page, you could noindex it. The page, although not indexed, would still accumulate PageRank (if you changed to a 301), which you could pass internally to other pages on your site. A noindexed page can still accumulate and pass PageRank, as this old but still relevant article attests. Really, though, leaving the 302 in place is not going to be a problem if you decide the benefit of changing it would be minimal.
-
RE: Cross linking websites of the same company, is it a good idea
I asked a similar question a few months back so some of the responses on this Moz thread will be of interest to you. Cheers.
-
RE: What About Google Panda Update 22?
Hi, I'm looking into a similar fall experienced across several of our sites since Nov 21. From what I can gather, Panda is all about 'quality', and some of the signals used to indicate quality are bounce rate, time on site, browse rate, and CTR from search, rather than whether the content is well written.
I've identified that some of our pages have high bounce rates, and maybe that is part of the reason we have seemingly been hit. Perhaps reducing bounce rates, improving on-site clicks, improving browse rates etc. will allow your site to bounce back. Personally, I'm still scratching my head wondering whether Panda really is to blame for our fall, but improving 'quality' and the user experience certainly has to be one of our main goals.
-
RE: No google traffic for this site? Help?
I think a large part of the problem is that every page of the site has all of its links nofollowed. If you look at the source code, every page has the following line:
<meta name="robots" content="index, nofollow" />
While this doesn't stop your pages being indexed, it does prevent the links on those pages being followed. I can't see a reason to nofollow the links on your pages.
In addition, page titles such as <title>Air Conditioning Manchester by Manchester Air Conditioning Company | Maintenance Services</title> seem overly spammy and repetitive.
This is just from a quick glance. I'm sure other people with more time will be able to add further advice.
-
RE: Is article syndication still a safe & effective method of link building?
Article syndication may help you build links but often at a cost to your own site's search presence. In the past we syndicated content to many high authority sites and received much referral traffic. However, in the long term this came at a cost to our own site's ability to rank for our own content.
What would often happen is that, even though we had published the content on our site first, a high authority site would outrank us for that content. Very few content partners were willing to specify our version as the canonical version using a cross domain canonical and inevitably our search traffic began to fall.
Since Panda we've realised that unique quality content is a must, and while we may have lost out on the referral traffic we might have received from content partner sites, we figured that having unique content and being an authority in our own area of expertise is what we should be aiming at - not getting masses of referral traffic which is often bounced visits in any case.
Really you need to weigh up what the benefit is to you from syndicating your content and whether this is worth putting your own ability to rank in search for your own content at risk.
-
RE: Same content pages in different versions of Google - is it duplicate?
This is the very situation where Google recommends that you use hreflang in order that you identify what version is intended for what country.
In your situation you need to implement the following in the head of your page (the URLs shown are placeholders for your own article URLs):
<link rel="alternate" hreflang="en-gb" href="http://www.yoursite.co.uk/article/" />
<link rel="alternate" hreflang="en-us" href="http://www.yoursite.com/article/" />
This tells Google that the GB version is intended for a UK audience and the US version is for a US audience. This will help you avoid any duplication issue and should ensure that the correct URL version of your article is served in the right country's SERPs.
-
RE: Best Place to Redirect 301 to?
There was a great Moz blog post on the whole re-direct issue by Cyrus Shepard which is well worth taking a look at.