If the server keeps going down I would also look at page load speed - http://www.webpagetest.org/ - because if it is that flaky I would guess performance is poor all round.
I would get to a stable server as quickly as possible.
If I understand you correctly and you want to stop a particular URL from being indexed, you would be best using the noindex meta tag within the head section of that page.
You probably want the search engines to follow the links on the page even if the page should not be indexed, so add 'follow' as well.
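A minimal sketch of the tag described above - noindex to keep the page out of the index, follow so the links on it still get crawled:

```html
<!-- Place within the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex, follow">
```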
Billy and Dan are spot on. Not only is it feasible to have a new section on the site, you are doing what Google loves: introducing new informative content, growing your site and raising its overall value/authority. Google will have absolutely no issue with anything tightly related to surfing/board activity. Only be wary if the new content is way off track - cake decorating classes or something.
I would just ensure the structure of your site is such that you have something like -
www.islesurfboards.com/paddle-boards/
rather than
www.islesurfboards.com/blah/blah/paddle-boards/
Another way of looking at this - if you have a successful site then where better to test the market, rather than investing time and money in a new site.
Just ensure you update robots.txt with the new sitemap address, and re-submit the new sitemap to the search engines. It should be as simple as that - your URLs have not changed.
Your sitemap does not determine the indexing of all URLs, but it gives the search engines a good idea of what to crawl, on top of their being able to crawl from URL to URL on the site. It is mainly there to help you see what has been indexed in comparison to what you have submitted. Unfortunately Webmaster Tools is not fully up to speed anyway, so it may not show the same results as site:some-domain.com.
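For reference, the robots.txt entry is a single Sitemap directive (the domain and filename here are placeholders - use your actual sitemap URL):

```text
# robots.txt - point crawlers at the new sitemap location
Sitemap: http://www.example.com/new-sitemap.xml
```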
For the directories I think you need to go down the rewrite route. This probably isn't functioning exactly as-is, but you get the principle.
# Redirect everything under /category/ to the new domain, except the two sub-folders
RewriteCond %{REQUEST_URI} !^/category/sub1/
RewriteCond %{REQUEST_URI} !^/category/sub2/
RewriteRule ^category/(.*)$ http://www.newdomain.com/category/$1 [R=301,L]
# Then redirect the sub-folders explicitly
RewriteRule ^category/sub1/(.*)$ http://www.newdomain.com/category/sub1/$1 [R=301,L]
RewriteRule ^category/sub2/(.*)$ http://www.newdomain.com/category/sub2/$1 [R=301,L]
The only way I can see this being a DA of 100 is if the site is a subdomain of wordpress.com or similar, hence the disproportionate link metrics. Is the site self-hosted, and do you have access to the server?
Have you only recently updated your website, maybe the tool has not re-crawled yet?
What results do you get in Google for site:www.yourdomain.com?
What results do you get with https://moz.com/researchtools/crawl-test?
If you know it is backlinks then you should only need to cleanse your profile. Manual - submit a reconsideration request; algorithmic - wait for an update.
If you do use a new domain you have no age and no authority, and ranking will take time. Being local, you have to ensure every source that contains your NAP is updated with the new domain, because you cannot redirect the old one if you suspect there is any infection within its existing link profile.
All I am saying is that I don't know your domain or scenario, but just ditching to a new domain may give you more headaches than you envisage, and it should definitely not be a knee-jerk reaction to a slow recovery.
Then, as mentioned, there is a trust issue with your visitors when changing domain, especially if it is branded. If you do go down that road I would get some business-objective information out there as to why the change in name (on the site, social etc).
The simple answer is we cannot give you confident advice. We don't know what the penalty/ies were, or if they are manual or algorithmic - do they tie in with any particular dates? And that is only the start of real identification.
Does the domain have authority, age and an element of good link profile?
Does the domain have on-site technical issues hampering recovery?
My take on the question is that if you do not know precisely what the problem is, then with a new domain there is every reason to suppose you may fall down the same hole again. If you do know precisely what the problem is, then fix it and wait, build out new content, and create excellent links back to the site.
There are 2 distinct possible issues here:
1. Search results are creating duplicate content
2. Search results are creating lots of thin content
You want to give the user every possibility of finding your products, but you don't want those search results indexed because you should already have your source product page indexed and aiming to rank well. If not see last paragraph.
I slightly misread your post and took the URLs to be purely filtered. You should add a disallow for /catalogsearch/ to your robots.txt, and if any are indexed you can remove the directory in Webmaster Tools > Google Index > Remove URLs > Reason: Remove Directory. This from Google - http://www.mattcutts.com/blog/search-results-in-search-results/
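The robots.txt entry would look something like this (assuming the default /catalogsearch/ path - adjust it to your actual search directory):

```text
User-agent: *
# Keep internal search result pages out of the index
Disallow: /catalogsearch/
```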
If your site has any other parameters not in that directory you can add them in Webmaster Tools > Crawl > URL Parameters > Let Googlebot Decide. Google will understand they are not the main URLs and treat them accordingly.
As a side issue with your search results it would be a good idea to analyse them in Analytics. You might find you have a trend, maybe something searched for or not the perfect match for the returned result, where you can create new more targeted content.
To back up the detail Wesley gave you, you can also add URL parameters in Google Webmaster Tools.
Everybody will direct you to the foot of the page to take a look at recommended companies - http://moz.com/community/recommended
Sponsorship is paid for, so all those links should be no-follow.
Having guest posts has worked, does work and will still work if the primary purpose is to add value, be informative and be honestly relevant rather than promotional - no keyword-link spam in the content. Personally I wouldn't actively promote guest posts on the site; rather let it happen naturally or through your networking.
Are you sure about the direct algorithmic penalty/penalties? Do they tie in with the update release dates, and were there significant drops in traffic to identify which update version/s caused the problems?
I've worked on sites that I would describe as suffering the residue of penalties from linking domains, along with masses of on-site technical issues. Typically organic traffic has declined over a 1-2 year period.
I'm not suggesting you don't clean up your link profile - it is the right thing to do - but you want to be really sure what is causing the immediate problems.
I would install Fiddler and check the header response of your URLs; otherwise Screaming Frog is the way to go. Oleg might be right in that the developer might have set 302 redirects. If you post the URL we could confirm for you.
In .htaccess, if the redirect doesn't explicitly include the R=301 flag (typically [R=301,L]) then the default is a 302 temporary redirect, which doesn't pass on link value.
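A quick sketch of the difference (the paths are hypothetical examples):

```apache
# No status given: Redirect defaults to 302 (temporary), passing no link value
Redirect /old-page /new-page

# Explicit 301 (permanent), which does pass link value
Redirect 301 /old-page /new-page

# RewriteRule equivalent - a bare [R] also defaults to 302, so state 301
RewriteRule ^old-page$ /new-page [R=301,L]
```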
Also for an accurate assessment of links I would use Open Site Explorer.
Are you sure it's just on Mac; have you tried on a PC? Do you have any other rules in include, or perhaps a conflicting rule in exclude? Try running a single exclude rule, and also test on another small site.
Also from support if failing on all fronts:
To be sure - http://www.youtube.com/watch?v=eOQ1DC0CBNs
Your directories have duplication. For example: http://www.titanappliancerepair.com/about-us.html and http://www.titanappliancerepair.com/about-us
You may also need RewriteRule ^(.*)\.html$ /$1 [R=301,L]
It looks like you are redirecting any URL back to itself.
You want something like this, which will remove index.html and enforce a trailing slash:
RewriteRule ^(.*)/index\.html$ /$1/ [R=301,L]
Apart from reaching retirement age before your link is live in DMOZ, and spending a fortune for Yahoo, I wouldn't bother. I would only consider doing these as an 'also' activity.
I would focus on reaching out to high authority, highly relevant sites and seeing if there are ways you can work with each other to get great content on their sites with a natural link back to your domain.
If your rankings are still faltering you may need to go through your link profile with a fine-tooth comb and remove anything that is low quality or spammy. Also ensure all your content is unique and that you don't have multiple pages with thin content.
Thanks. You've confirmed my thoughts on doing a thorough spring clean.
Thanks Chris. Yes, I agree there's no harm. To be honest this is in the mix of getting the "Partial Matches: Unnatural links to your site - impacts links" message: "...so for this incident we are taking targeted action on the unnatural links..."
I don't totally subscribe to the idea that Google just removes link flow from the links they refer to and that's that, nothing to worry about. So in the back of my mind I'm looking at a complete purge, even if it's not technically required.
When you redirect A to B, B does not equal A, but rather inherits some of A's authority.
You will need to further develop the link profile of B with other mixed-authority linking domains in order to improve its own domain authority. I wouldn't get hung up on PageRank; it's never current and is only a contributing factor to overall domain authority.
I would use something like Open Site Explorer to gauge the overall domain authority and work from there.
Going back to your question, the only way to retain the domain authority (and in part PageRank) of A is to not 301 to B. But I guess that if A has been bought out they want it to disappear into the main pot.
Looking at the link profile anchor text of a site I'm working on, new links keep popping up in the reports with, let's say, very distasteful anchor text. These links are obviously spam and point to old forum pages for the site that don't exist any more, so the majority seem to trigger the 404 page.
I understand that the 404 page (404 header response) does not flow any link power, or damage, but given the nature and volume of the sites linking to the "domain" would it be a good idea to completely disassociate and disavow these domains?
As soon as you place a site-wide link on a site, you potentially have to wait months for Google to crawl every page of that site, which is why GWT might still be showing the detail. GWT is not the decider as to whether Google takes action.
As long as you have rectified with either a no-follow or removal you are fine and have done the right thing.
I agree with Chris and I would ignore the error/warning. The canonical does not hurt being there on the homepage but in any case it is set correctly.
What is the URL and what is the canonical tag in your head section?
Would you prefer to browse a site that is flat, just providing static one-dimensional information, or a site that offers external resources and links to further information to give you the best experience possible? Always think of the user experience. Google probably knows that if you add links out within your content, in context, to external authority sites, you are attempting to give value to the visitor. So if you do link out, don't use no-follow - you would be telling Google you don't trust those sites!
As well as parameters mentioned you may possibly have heaps of duplicating categories, tags etc. What I would also do is start searching Google with something like site:www.example.com/directory/ or possibly site:www.example.com/category/directory/directory/ so you are tightly narrowing down the results, switch to 100 results per page and manually look for clues.
Excellent response and advice. My only concern was that, by how I read his title, Jonathon has the impression he "should" link to the pages linking to him.
So my advice, based on that, would be an absolute no as a reaction to gaining the link; then to stand back, consider and follow your advice.
I think you mean the other way round.
If you have a new site and you get a link from an authority site the chances are that site page might (and it is a might) get crawled more often, which then leads Google back to your site and stands you in good stead for Google to crawl and index your site. Other methods include submitting your sitemap to Webmaster Tools.
If good quality sites are linking to yours then that's great as it's telling Google your site is of value and will help with your rankings. You do not want to link back to those pages, you want to go out and find more sites to link to you.
An EMD is not by itself enough to cause Google to take action. The key is the quality of the content and its value to the visitor. Typically Google is after EMD sites with thin or poor content, smothered in affiliate links etc.
there is a good answer to this here - http://moz.com/community/q/reciprocal-links-4
Google will catch up with them, you can be sure of that.
The link operator only shows a snapshot and is in no way representative of the actual links. Open Site Explorer on here will give a more accurate account of your link profile.
You also want to take a close look at your organic traffic stats and compare with the Google update dates here http://moz.com/google-algorithm-change and see if there is a pattern emerging.
I have dealt with sites that have just generally declined organically over a year or two, and usually it comes down to poor on-site SEO (possibly including Panda) and the indirect effects of Penguin.
As I see it, even if it wasn't wrong it's good housekeeping to keep the same format throughout all possible URLs.
In your case the canonical is saying that that URL is the original. Google won't crawl to it and is just concerned with the filename/main URL - and if it did, the 301 kicks in, so there is no link value issue. I would still prefer to keep it tidy by matching the structure you want.
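As a sketch, the canonical in the head section would simply match whichever trailing-slash structure you have settled on (example.com is a placeholder):

```html
<!-- Canonical matches the enforced trailing-slash URL format -->
<link rel="canonical" href="http://www.example.com/some-page/">
```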
Books are OK, but there is nothing better than getting your hands dirty. As Steve says, the Beginners Guide to SEO is a great starting point. Then when you start working on a site you soon realise there is no one-size-fits-all; all sites have their own peculiarities, needs and restrictions. That's when places like Moz will really help, as you can research, absorb opinion, learn proper strategies and put it all into practice.
Those sites are just collections of links to sites.
That doesn't sound good if you mean thin content as you are moving into the realms of Panda. Each and every site needs to be rich with unique and valuable content.
How many URLs are indexed in Google if you use site:yourdomain.com? Has that figure dropped too?
Have you got anything in your robots.txt that could be blocking?
If you move the forum to a sub-domain you are still using a separate domain in Google's eyes. You will have to do all the work you would do if you loaded it on to a completely unique domain - SEO, link development etc to build up the authority of your sub-domain. Your sub-domain is disconnected from your main domain, it is a new website.
There is a list of recommended businesses via a link at the foot of the page. Most SEO guys are now very experienced in dealing with penalty issues as it has become so common.
This blog post should point you in the right direction - http://moz.com/blog/how-to-improve-your-rankings-with-semantic-keyword-research
To back up what Kyle says, yes, there has to be relevance. The primary driver should always be whether it adds value and offers something the visitor is looking for, rather than trying to shoehorn spuriously connected content onto a site. If it's not related in a positive way it will look manufactured, and probably sit on a lower quality site. As soon as that happens you move across to join the penalty crew and take your chances.
What are you comparing to - if not Etsy what do you believe should be first position?
Etsy has a very high domain authority, a large link profile and social footprint. The keyword is in the URL, the meta title and in content so for me it is a perfect candidate for grabbing pole position. It also has a good volume of reviews.
Don't get disheartened. It can take a long time for Google to crawl the URLs your links are on. When you used disavow, did you list the domain or each URL? You are probably best removing the domain so you capture all of its links.
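For reference, the disavow file accepts both individual URLs and whole domains; the domain: form catches every link from that domain (the domain here is a placeholder):

```text
# Disavow a single URL
http://spam.example.com/forum/thread-123
# Disavow the whole domain - usually the safer option
domain:spam.example.com
```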
You cannot do this. If you have an ailing domain or homepage, redirecting just injects the problem into the new location. You need to determine the problem links and deal with them by disavowing - http://moz.com/community/q/link-disavow
From a Google point of view crawling the pages you have/had links on can take hours/days/weeks/months depending on the authority of the domain and how accessible the URL is, so I imagine the moz crawler for example, will take similar times.
If you are getting links like this I would disavow the domains responsible.
It probably depends on the anchor. If the EMD was bluewidgets.com, then 'http://www.bluewidgets.com', 'bluewidgets.com' or 'bluewidgets' type anchors should be OK. If there was a high level of 'blue widgets' I would start to be a little concerned.
You might possibly have to detail the white-space explicitly with \s
RewriteEngine On
RewriteRule ^old\sfoldr/(.*) /newfolder/$1 [R=301,NC,L]