Panda would be site-wide, not keyword-specific!
It was the homepage.
There is link growth of approximately 1,200 backlinks (but an increase of only 4 domains) over 3 days according to MajesticSEO (although Ahrefs reports a drop in links and referring domains over the same period).
Also, does anyone know which is the more accurate data source, MajesticSEO or Ahrefs? I'm getting wildly conflicting data from the two. MajesticSEO now shows 845 links added in early July (which would indicate a negative link campaign), but Ahrefs shows 345 links lost over the same period!
Thanks Irving
Do you know of any free versions of Link Detox?
How will doing a 'link: www.yourdomain.com' search help, since the results won't highlight site quality, will they?
All Best
Dan
That's a great answer, Robert; I really appreciate you taking the time to comment so helpfully.
I should have added that there was a big rise in backlinks at the beginning of May, which peaked and levelled off throughout June, then dropped from the beginning of July to date (according to Ahrefs data). So in an otherwise natural-looking link growth curve from November last year to date, there is a huge hump or wave in the graph as links rise in May and then drop over July.
So if I had looked into this in June it would, initially at least, have looked like it could well be a negative campaign, but the ranking drop has only occurred recently, correlating with the drop, not the rise, in links. If it were a negative campaign I would have expected the rank drop to occur soon after the spike in link growth, not after a drop in links. Also, the link growth period is spread over a month (as is the period of the link drop), not a few days as the article suggests one should look out for in a negative campaign, hence I'm pretty confident it's not one (which is why I didn't mention it originally, but thought it best to now, just in case).
When you say look at CTR, do you mean purely in regard to traffic from the affected keyword in the run-up to the rank drop? What kind of time period do you recommend, a week or more?
Cheers
Dan
Hi
A friend/client's site has recently dropped 2-3 pages (from an average #2-#3 position on page 1 over the last few months) for a primary target keyword, and he suspects a negative SEO campaign, hence he asked me to look into it.
I checked on Removeem and the keyword does not generate a red (or even a pink) result.
I looked at Ahrefs & MajesticSEO; backlinks and referring domains have dropped over the period the keyword dropped, hence I presume I can be sure it's not a negative campaign, since that would show the opposite pattern (as per articles like this: http://moz.com/blog/to-catch-a-spammer-uncovering-negative-seo )? Also, the site has very few site-wide backlinks.
The keyword is a 3-word phrase with 2 of those words being in the domain and brand name, hence I presume such keywords are relatively safe from negative SEO campaigns anyway.
I would have presumed the backlink/referring-domain drop may well explain the ranking drop, but the site is still in the first field of view of page 1 for the other key phrases, 2 of whose 3 words are the same as the affected key phrase (and also in the domain/brand name), so I would have thought these would have dropped too if it were a negative campaign. Also, many of the anchor texts in the disappeared backlinks are for one of the other partial-match variant key phrases, which are still at the top of page 1.
Anchor text is at 4.35% for the affected keyword, according to MajesticSEO.
I'm pretty confident from the above that I can conclude no negative SEO campaign has occurred, nor any other type of penalty, and it's probably just a 'wobble' at Google that may well right itself shortly.
I would appreciate feedback from others, though, to confirm that I'm concluding correctly.
Many Thanks
Dan
Hi
On Moz there's Open Site Explorer for backlink analysis.
Also look at Ahrefs & MajesticSEO.
All Best
Dan
Hi Matt
My personal opinion is to consolidate everything on the one domain using subfolders (as you suggest) and set up domain forwarding from the old domains to the new subfolders that replace them. It would be interesting to hear what others think, though.
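For example, a hypothetical .htaccess sketch (assuming Apache with mod_rewrite on the old domain's hosting, and made-up domain names) that 301s an old domain into its replacement subfolder:

```apache
RewriteEngine On
# Send every request for olddomain.com (www or not) to the
# matching path under its new subfolder on the main domain
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.maindomain.com/old-domain-section/$1 [R=301,L]
```

A path-preserving 301 like this keeps old deep links working rather than dumping everything on the new homepage.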
All Best
Dan
Thanks for taking the time to respond, Egol.
OK, great, so Panda is theoretically escapable in a few weeks then.
Cheers
Dan
Thanks for taking the time to comment, Kevin.
No, what I'm worried about is this: if more URLs are blocked than indexed, doesn't that mean that all the site's pages are blocked (when at least 50 of them shouldn't be, since we want them crawled/indexed)? In addition, how can more URLs be blocked than exist/are indexed?
cheers
Dan
How do you establish which keywords generated the traffic that resulted in goal completions (in GA)?
I thought this would be Reverse Goal Paths, but that only goes back as far as the originating page. Goal Flow shows the traffic source, which is better, but not the keyword used?
Cheers
Dan
I have a client whose site, after the designer added a robots.txt file, has experienced continual growth in URLs blocked by robots.txt, but now the URLs blocked (approx. 1,700) have surpassed those indexed (1,000). Surely that would mean all current URLs are blocked (plus some extra mysterious ones). However, pages are still listed in Google and traffic is being generated from organic search, so it doesn't look like that is the case, apart from the rather alarming Webmaster Tools report.
Any ideas what's going on here?
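For illustration (hypothetical rules, not the client's actual file): as I understand it, the 'blocked' count includes every URL a rule matched during crawling, not just indexed pages, so a single wildcard pattern can push the blocked count past the index count:

```
User-agent: *
# One wildcard rule can block every parameterised URL variant
# (filters, sorts, session IDs) the crawler has ever discovered
Disallow: /*?
# Internal search results are another common source of
# thousands of blocked-but-never-indexed URLs
Disallow: /search/
```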
cheers
dan
Hi,
Is being hit by Panda as hard to get out of as being hit by Penguin?
Or, if you clean up all your content, should you get out of it relatively quickly?
I have a very old (11 years) and established site (but also a very neglected one, which I'm looking to relaunch), but it's on an ancient shopping cart platform which never allowed for Google Analytics & GWT integration etc., so I can't see any messages in GWT or look at traffic figures to correlate a drop with any Panda updates.
The reason I ask is that I want to relaunch the site after bringing it up to date with a modern e-commerce platform. I originally launched the site in early 2002 and it was perceived well by Google, achieving first-field-of-view SERPs for all targeted keywords however competitive, including 'ipod accessories', 'data storage' etc. These top positions (and resulting sales) lasted until about 2007, when it was overtaken by bigger-brand competitors with more advanced and Google-friendlier e-commerce platforms (and big SEO budgets).
I originally used the manufacturers' descriptions, edited slightly but probably not enough to avoid being considered duplicate content, although the site still managed to obtain good rankings for these pages for a very long time, even ranking ahead of Amazon in most cases. The site is still ranking well for some of the keywords relating to products which still have manufacturer-copied descriptions, so I actually don't think I have been hit by Panda.
So my question is: is there any way of finding out for sure whether the site has even been hit by Panda at all, without looking at analytics & GWT?
And once I find out whether it has or not:
Many Thanks
Dan
My pleasure, thanks Craig. I'm currently on the Isle of Wight, which is lovely!
In that case I would be looking to associate the reviews more closely with the page content.
I would also concentrate on using the blog etc. to generate fresh content, and deep link (where directly appropriate and helpful for the user) from blog posts to the product pages, and of course the category pages too (re your response, and factoring in Chris's good comments). Regular, fresh, unique content (tied to an author and publisher) of a high quality (hence likely to earn social amplification), linked to the relevant category and product pages (where appropriate and helpful), will do far more toward those pages being perceived well by Google than overly concentrating on changing the content of the category and product pages themselves.
Hope this helps
PS: I'm going to Worthing for a 40th tomorrow, coincidentally (just checked your profile).
The problem with products is that they are hard to keep updating content for, since they usually (but not always) have an 'evergreen' description. Hence a great way to keep the content fresh is to enable customer reviews and comments on the product page, and then encourage customers (via your post-sale touch points, such as follow-up emails saying thanks for your order, how's the product?) to leave a review (and incentivise them to do so via loyalty points or future discounts). This means the product page is continually populated with fresh, user-generated content, demonstrating customer engagement and hence showing 'signs of life' from real people too.
Hi Roy
Well, in that case I would be wary of duplicate content and descriptions etc.
I would really focus on developing the content/product descriptions to highlight the differences between the products.
This is almost certainly worth watching: http://moz.com/webinars/ecommerce-seo-fix-and-avoid-common-issues
Cheers
Dan
Hi Anand
Yes, it looks like you have both versions, the non-www and the www, so the site could potentially be perceived as duplicate content (2 websites).
You need to choose 1 version and stick with it; so if you would prefer the www version then yes, 301 the non-www version to the www version (or vice versa).
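For example, on Apache the non-www to www redirect can be sketched in .htaccess like this (hypothetical, assuming mod_rewrite is enabled; substitute your real domain for example.com):

```apache
RewriteEngine On
# 301 every request for the bare domain to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```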
Also, for your info, you have too many characters in your home page title tag, at 89 characters, when you should keep it under 70.
Also reduce the characters in your meta description tag, since it's currently 189 when you want to be under 155.
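As a quick way to check these lengths across pages, a small sketch (the 70/155 limits are the character figures used above; the real cutoffs search engines apply are pixel-based and vary):

```python
TITLE_LIMIT = 70   # character guideline used above
DESC_LIMIT = 155   # character guideline used above

def check_lengths(title, description):
    """Return human-readable warnings for over-long title/description tags."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESC_LIMIT:
        warnings.append(f"Description is {len(description)} chars (limit {DESC_LIMIT})")
    return warnings
```

Running it on the figures above would flag both tags.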
Hope that helps
All Best
Dan
Hi Roy
It depends how similar the products are. If they are simply variations of the same product, then I would list just the 1 product and give multiple options for the different variations. For example, 1 product page for 'Suede Shoes' with selectable options/variations of that product, i.e. colours available (blue, red, pink) and sizes available (8, 9, 10 etc.).
Hope that helps
All Best
Dan
Hi Adam
Here's a really good October 2012 Moz article from Cyrus Shepard, all about G+. Have a look at point 7; it may well be worth having a clearer picture without any background clutter:
http://moz.com/blog/tips-to-use-google-for-seo
All Best
Dan
I think when it first launched some were getting away with it, and possibly still are now, although I wouldn't recommend trying, since it is definitely misleading and not what authorship is about. It doesn't make sense for an author to have a logo, since an author is a person, and the image should therefore be the person's picture and nothing else.
Thanks for your response, Wesley!
The way I understand it, if you put the rel=publisher markup in the head tags then it should show on every major page, shouldn't it?
cheers
dan
Hi
Since the rel=publisher code should be added to the head tags of a website, surely this means it would then show up on every main page of the site, but I'm looking at a few sites using the rich snippet testing tool and it's only showing for the home page. How come?
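For reference, the markup I mean is just a link element in the head template, something like this (placeholder page ID, not a real one):

```html
<head>
  <!-- rel=publisher pointing at the brand's Google+ page;
       if this only appears in the home page template, the
       testing tool will only detect it on the home page -->
  <link rel="publisher" href="https://plus.google.com/YourPageID" />
</head>
```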
Cheers
Dan
Hi Lynn
In this particular case it is not, actually, although that's great info, thanks very much for sharing. Everett is great; I always refer to his posts/advice whenever I have an e-commerce project.
In this case the client I'm talking about is a music education establishment with many different courses, and the site is in WordPress. Any ideas whether it's possible to edit breadcrumbs in WordPress?
Cheers
Dan
Hi Jarno & Lynn,
Thank you both for taking the time to respond !
Yes, I agree; I think this logical structure is best, since it helps search engines AND users better understand the content, as it's associated with other immediately related content, both in terms of semantic relationship and close architectural proximity. This is also reinforced by the good internal linking provided by breadcrumbs (which do help contribute to rankings in part, since they help set the relevance of a page's content and its context).
In the case of a single item of content needing to be in more than one folder, maybe it's better to have the content page 'off the root' and canonicalised, to avoid duplicate content issues from displaying it in the 2 different category folders it will also appear in. Then, so long as you have breadcrumbs (which, from Lynn's comments, it looks like you can edit/customise for the 2 different paths), you still benefit from the logical hierarchy and internal linking, which are beneficial for both users and engines.
Although I must confess, since I'm not that technical, I don't know this for a fact and welcome the views of others to clarify/confirm. So does having the canonicalised page off the root stop engines from seeing the silo structure, thereby defeating the purpose of this suggested solution, OR would they still see the other page instances and associate them with the path, just without penalising them as duplicates (since the page 'off the root' is the canonical version), making it a good solution?
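To sketch what I mean (hypothetical URLs, borrowing the fruit example from my other question): both category-path copies of the page would declare the off-root URL as canonical, so engines consolidate signals there rather than treating the copies as duplicates.

```html
<!-- In the <head> of both hypothetical category-path copies,
     e.g. /red-fruit/apples and /types-of-fruit/apples -->
<link rel="canonical" href="http://www.fruit.com/apples" />
```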
All Best
Dan
Hi
A client hasn't structured their site architecture in a silo-type format, so the breadcrumbs don't follow a topical hierarchy as one would desire (or at least as I think one would prefer).
For example, say the site is called www.fruit.com and it has a category called 'types of fruit' and then sub/content pages called things like 'apples' and 'pears'. In terms of architecture, that should be www.fruit.com/types-of-fruit/apples and www.fruit.com/types-of-fruit/pears etc.
The client has kept it all flat, so instead the architecture is www.fruit.com/types-of-fruit, www.fruit.com/apples and www.fruit.com/pears.
As a result the breadcrumbs follow suit and, since they also don't employ a logical hierarchy, don't reflect the topical & sub-topical structure.
I have seen that some SEOs at least used to think this was better for SEO, since it kept the pages nearer the root, but surely it's better to structure site architecture in a logical topical hierarchy, so long as you don't go beyond say 3 or 4 directories/forward slashes in the URLs?
Also, is it theoretically possible to keep the URL structure as is (flat) and just edit/customise the breadcrumbs to reflect a topical hierarchy in a silo structure, rather than change the entire site architecture and do the required 301'ing etc. (or is that misleading, or just not possible)?
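On that second question: from what I understand, breadcrumb markup can declare a topical trail independently of the URL structure. A hypothetical sketch using schema.org BreadcrumbList microdata (I'm not certain how engines weight this versus real directory structure):

```html
<!-- Flat URL www.fruit.com/apples, but the breadcrumb still
     expresses the Types of Fruit > Apples hierarchy -->
<ol itemscope itemtype="http://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.fruit.com/types-of-fruit">
      <span itemprop="name">Types of Fruit</span></a>
    <meta itemprop="position" content="1" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.fruit.com/apples">
      <span itemprop="name">Apples</span></a>
    <meta itemprop="position" content="2" />
  </li>
</ol>
```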
Cheers
Dan
Thanks Mihai
I thought microsites were well out of the window these days, unless you're fully developing one as a high-quality site (which the budget won't allow).
To confirm then: no harm can come from forwarding keyword-rich domains to the main domain?
cheers
dan
Hi
My client has a clutch of keyword-rich domains that they want to point at the main domain. Apart from being good for promotional reasons, is there any SEO benefit in doing so? (I know there used to be years ago, but I'm under the impression there hasn't been any for a long while.)
Most importantly, though, can any harm come from doing this?
Best Rgds
Dan
OK, thanks Chris, but it's not something to do with Yoast (the platform is WordPress)?
The instruction would just be that we're getting warnings in the rich snippet testing tool and they need to be fixed!
Also, just to confirm, this is nothing to do with authorship (since that tested fine), and it's just other structured data detected on the page?
cheers
dan
Thanks Chris
So does that mean it's a problem with how the website was set up, and I should just send these instructions to the developer, or is it something I (the SEO) can/should be able to do via the Yoast plugin?
Cheers
Dan
Hi
I've just successfully set up authorship for a client according to the rich snippet testing tool, although I'm a bit perplexed, since underneath the results there's a section called 'Extracted Structured Data'. The first section is marked 'hatom feed', and under the field saying 'Author' it says in red:
Warning: At least one field must be set for Hcard.
Warning: Missing required field "name (fn)".
And then under the URL field & the URL it says:
Warning: Missing required field "entry-title".
Any ideas what this means, or even whether it's important? I would have thought the tool wouldn't acknowledge authorship as being set up correctly if this were an issue, but that does beg the question of what it is doing there and what it means.
There's another section after that called 'rdfa node', which seems all fine.
It also says the page does not contain publisher markup, although I know publisher has been added to the home page. Is it best to add publisher to the head section of every page (as I have heard some people say), or just the home page?
Many Thanks
Dan
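For anyone hitting the same warnings: from what I've read, they usually mean the theme outputs hAtom's hentry/author classes without the fields the parser expects. A hypothetical sketch of template markup that would satisfy them (not the client's actual theme):

```html
<article class="hentry">
  <!-- entry-title satisfies the "Missing required field entry-title" warning -->
  <h1 class="entry-title">Post title here</h1>
  <!-- the fn inside the author hCard satisfies "Missing required field name (fn)" -->
  <span class="author vcard">
    By <a class="url fn" href="http://example.com/about">Author Name</a>
  </span>
  <div class="entry-content">Post body...</div>
</article>
```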
Thanks Tom. So just to clarify, re the 301s: link building/growth to the promoted domain should still have a beneficial effect that passes through to the primary hosted domain?
cheers
dan
Haven't had a detailed look, but it all looks good from a quick scan. I would suggest adding an 'about' section, including a mission statement etc., and/or a 'contact us' section with a physical address, for more trust.
Hi
Is it OK, or bad practice, to domain-forward shorter, more memorable, snappier domains used for promoting a website to a longer domain where the website actually lives? Such as:
promoting in social media profiles, emails and offline literature a domain with forwarding set up, like:
To the main website:
www.brandincludingprimaryproductrelatedkeyword.com
And if it is OK (not bad practice): since it's the forwarded domains that are being promoted, they are the links most likely to be shared on social media and other websites, so will they be treated like 301s, and will 'link building' for those pretty much equate to link building for the main domain (or not)?
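As a sketch of what I mean (hypothetical, assuming Apache with mod_rewrite on the short domain's hosting, and a made-up short domain name): a server-side 301 makes the permanent redirect explicit, since registrar-level 'forwarding' can sometimes be a 302 or a frame/masked redirect, which I understand would not pass link signals the same way:

```apache
RewriteEngine On
# 301 the short promo domain (made-up name) to the main site
RewriteCond %{HTTP_HOST} ^(www\.)?shortsnappydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.brandincludingprimaryproductrelatedkeyword.com/$1 [R=301,L]
```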
Many Thanks
Dan
Hi Aleyda
Sorry 1 more question:
I've noticed that the new pages have started to rank well already.
In this case, is it still advisable to 'fetch as Google' the old redirected URLs in GWT, or, since the new pages have been found and are ranking well, is it best not to fetch the old URLs?
Best Rgds
Dan
Hi
I have a client who has set up a Facebook fan page for their physically located local business, and they have checked in to it, but it's not showing on the FB page as having been checked in to, i.e. it says '0 were here' when I would have thought it should show at least 1.
Does anybody know how best to set up an FB fan page for a local business so it is check-in-able? Or rather, how to get the check-in aspect of the existing page working properly?
(Have followed FB help guidelines)
Cheers
Dan
ok great many thanks Aleyda !!
Great thanks Aleyda
I have already done most of those things, except for fetching as Googlebot in GWT, which I'll do next.
Please confirm then that there's no need to remove the old URLs in GWT at any point, and that they should just 'fade away'?
many thanks
dan
Many thanks, Aleyda, for your detailed reply.
In this case, though, it's not a site migration, just a redesign, but thanks for the great info; I'll use it for future reference when tackling any migrations.
In the case of a redesign where the URLs have changed (hence why I set up the 301s), would you recommend that after approximately 2 weeks you 'fetch as Google' the old page URLs in GWT and then, a few days after that, remove the old URLs? (Or should they just disappear eventually?)
Many Thanks
Dan
Many thanks, Stufroguk!!