Not a video, but a good step-by-step instruction list:
https://moz.com/help/guides/search-overview/crawl-diagnostics
I have found a video, but it's not an official Moz one so I don't know how useful it will be:
Certainly sounds dodgy, but suddenly removing all of those backlinks might cause you some SEO issues.
Depending on how Google is currently reading your site, it may improve, as your site would seem less spammy without them, or it may really hurt the site (at least to start with); losing that many backlinks might make Google think something is up with your site.
I would take the bullet and remove the duplicate content, but warn your clients that it may take a while for the natural benefits to come through. If your site isn't penalised yet for having that many dodgy backlinks and duplicate content, it soon will be!
Yep, broken for me as well!
Maybe it's just a bug; perhaps not many people click that, so it hasn't been pointed out for a while?
Might be worth raising a support request to make them aware of it.
With the way that most search engine ranking factors work these days, it would suggest that Google looks at your domain first.
Things like the general number of backlinks going to your root domain have a massive impact on SERPs, so that to me suggests it looks at your domain first.
It will obviously still look at the individual pages to determine which one is the most relevant for the specific keyword being searched, but the fact that massive companies like Wikipedia will rank for almost every keyword says a lot about the root domain having the big impact.
I would be more concerned about the user journey than anything! If I were a user on your site and it took me six clicks to get to the page I originally wanted, I would get bored of the site and its content very quickly.
And if you think the content on the pages is similar, chances are Google will too, and it will penalise you for having duplicate content.
I would keep it simple and make it easy for customers to find what they want. Having lots of content on one page is good if it's written correctly, and you'll see better SEO benefits from a good website structure that users can navigate with ease.
Getting a page to rank is not the same as getting a page indexed.
Fetching as Google will not increase your rankings for anything; it will just indicate any errors Google has when crawling the page.
You can do a quick Google search to see if the page in question is being indexed by using the search term "site:yourdomain.com/page.html"; if it doesn't show up in the results then it is not being indexed.
If it is a new page it might take a while to appear in organic search.
It would depend on the type of business you run and how much you use LinkedIn - the same as any other social media platform, I guess.
We have used it and seen some good traffic come through to our website. It's hard to tell how much business it has given us directly, but we use it to interact with other businesses and have had some good feedback from it.
Joining 'Groups' and sharing content to them has proven quite successful, but be specific with the groups that you join, as the content you are sharing will have to be relevant.
The best thing to do in your situation would be to trial it for a couple of months to get a feel for what your target audience on LinkedIn would be and how relevant they would be to the business. Try communicating with the audience in different ways throughout the trial; maybe do some split testing to see what gets the best response.
Be active and share relevant and useful information
It would have been a lot easier if you had used the same URL names as the old site.
If there is a way for you to change your URL slugs to the old ones, you can use this simple bit of code in the .htaccess file to redirect all old URLs to the new domain:
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.newdomain\.co\.uk$ [NC]
RewriteRule (.*) http://www.newdomain.co.uk/$1 [R=301,L]
My guess as to why it isn't working would be that maybe "/" isn't the original home page of your old site. Do you have an "index.html" or similar, and are you using a .htaccess rule on your old site to determine what the root page should be?
Maybe your .htaccess file should look like this:
Redirect 301 /index.html http://www.healthpointe.net
Don't forget to tell Google Webmaster Tools that you have changed the address of the website - https://support.google.com/webmasters/answer/83106?hl=en
I don't know what market your website is in, but maybe there are a few blogs/news websites that would write a small article about your new website launching; maybe you have something unique that people will be happy to share with their readers?
It might be worth coming up with an outreach strategy and targeting website owners directly about how your new website is so amazing, and new, and useful, etc.
Local directories are good to start with, but I don't think it's as easy as there being a specific set of link building tips that people run through when they first launch (although if I'm wrong I would love to know them!).
You could always redirect the whole sub-directory of community pages to whatever page you want using 301s. So after you've deleted the community section, anyone who goes to a page with /community/ in the URL would be redirected to your home page or any other page you wish. That means you won't get any 404 errors.
For example, if you wanted to redirect the /community/ pages to the equivalent paths on the root, you would use the following code in your .htaccess file:
RewriteRule ^community/(.*)$ /$1 [R=301,NC,L]
You'll need access to your .htaccess file; more instructions on how to do the 301s are here - http://coolestguidesontheplanet.com/redirecting-a-web-folder-directory-to-another-in-htaccess/
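If you would rather send every old community URL to one single page (the home page, say) instead of mapping each path across, a rule along these lines should do it - this is just a sketch, and the target path is a placeholder you would change to suit:

```apache
# Send anything under /community/ to the home page with a permanent (301) redirect.
# Change the target ("/") if you want a different landing page.
RewriteEngine on
RewriteRule ^community/ / [R=301,L]
```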
If you set up the ad in Google AdWords, you can just change it in the settings of the ad.
https://support.google.com/adwords/answer/2404246?hl=en-GB
Otherwise, if you don't have access to change the ad URL or you are using a different way to advertise, the best way I can think to do this would be a 301 redirect.
If you have access to the .htaccess file in the root domain it's a fairly simple process, but be careful when you edit the .htaccess file, as you can very easily stop your site from working with some incorrect code.
You would do it by adding the below bit of code to your .htaccess file:
Redirect 301 /oldpagename /newpagename
You can read more here: http://www.rackspace.com/knowledge_center/frequently-asked-question/how-can-i-do-a-301-redirect
I've never read or heard any hard evidence that it would affect your social signals. It should have the same effect, as it would come through in your Analytics as a Social visit. Almost everyone uses link shortener services now, so I would be very surprised if it had a negative effect.
As for the description, it's good practice to get into the habit of writing fresh descriptions and content for your posts, as this will encourage more engagement. Especially if you re-post the topic a couple of times - people will get bored if you just copy and paste the same text each time.
It is not recommended to have more than 1
Depends on how the site is coded to work, but I would say it's safer to have the titles as
My experience with Domain Authority is that it is greatly affected by backlinks to the root domain, so that would be my first comparison between you and your competitors.
Have you experienced low Domain Authority since the website started, or have you only recently had a massive drop?
Moz's latest API update has affected a lot of Domain Authority scores due to the crawl size; a few of our clients saw some really strange fluctuations, which is worth noting if you are only seeing a drop this month.
We did this recently with a client of ours and saw almost instant effects: they dropped off the radar completely for all the old keywords (good, right!?), but then we only saw a slight increase in the new targeted keywords, so their overall search engine presence had dropped massively!
We will hopefully start to see a rise in the new focused keywords, but it's just something to keep in mind. Maybe if we had slowly incorporated the new keyword into the page rather than changing it all at once, it would have been a less dramatic change.
Doing 'No SEO' is probably a bit of a big assumption to make about your competitors. I'd be interested to know how you are coming to that conclusion.
Are they using the same PR channels as you, or different ones?
It could be that Google just thinks your competitors' content is more relevant to the users searching for those terms. There are a lot of articles around at the moment suggesting Google focuses on user behaviour to decide SERPs.
We know how frustrating it can be knowing that you are doing everything correctly in terms of SEO guidelines and still not performing above competitors who look to be using black-hat techniques.
If you haven't done so already, I would do a more thorough investigation into what might be causing your competitors to rank highly. Use Moz and other well-known SEO tools to compare metrics like backlinks, etc.
If the website you are linking to is relevant to the content and of good quality, then it shouldn't do your website any harm.
I would review the sites you are linking to and make sure they follow best practice on quality; maybe do a quick Open Site Explorer review to make sure they don't have a high Spam Score.
It could even give you a good SEO benefit if you remove the no-follow tag on some of them.
It seems extremely weird that ALL of your sites have stopped recording and not just the odd one!
We often get this issue with our clients, as they feel the need to go into their WordPress sites and add the Analytics code themselves, often duplicating the code and stopping Analytics from recording the data properly!
First, I would 'view source' on the web pages and check that there is only one set of Analytics code on the site.
Then check to see if there are any filters on the Analytics account that are causing it to block the data.
I'm not convinced these will solve your issue, as it's weird for something to affect multiple sites all at once, and you may have already checked the above! But it's worth a shot.
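As a rough illustration of the 'view source' check, here's a small Python sketch that counts how many times the classic analytics.js loader appears in a page's source. The sample HTML and the UA-12345-1 tracking ID are invented for the example; in practice you'd paste in your real page source:

```python
import re

# Example page source with the tracking snippet accidentally pasted in twice.
# In practice, replace this with the HTML from "view source" on your page.
html = """
<head>
<script src="https://www.google-analytics.com/analytics.js"></script>
<script>ga('create', 'UA-12345-1', 'auto');</script>
<script src="https://www.google-analytics.com/analytics.js"></script>
</head>
"""

# More than one loader usually means the tracking code was added twice.
loader_count = html.count("google-analytics.com/analytics.js")
tracker_ids = sorted(set(re.findall(r"UA-\d+-\d+", html)))

print(loader_count)  # 2, i.e. the snippet appears twice
print(tracker_ids)   # ['UA-12345-1']
```

If the loader count comes back higher than one, that's your duplicate; if multiple tracker IDs show up, two different Analytics properties are fighting over the same pages.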
It's best practice to only have one home page variation; otherwise Google won't be able to tell which is your main page, and you will also get duplicate content issues, etc. If you haven't already, I would redirect the /default.asp pages to the root domain.
As for your original question, it does seem a bit odd that they have different scores, but I would imagine it's because your main root domain has more backlinks to it compared to your /default.asp pages.
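One way to consolidate the /default.asp variation onto the root is a rule like the one below in the site's .htaccess - this assumes an Apache server, and the domain is a placeholder:

```apache
# Permanently redirect /default.asp to the root of the domain.
# Replace www.example.com with your own domain.
RewriteEngine on
RewriteRule ^default\.asp$ http://www.example.com/ [R=301,NC,L]
```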
From your original post I presumed you had not wanted Google to index your pages?
If you want Google to index your pages, it can take some time to happen naturally. You might want to submit a sitemap and ask Google to crawl your site within Webmaster Tools.
Robots.txt is normally only used to block crawlers, so you do not need to put any code in there to allow Google to crawl.
I wouldn't have thought so; you'd need to include this line of code as well:
Disallow: /
That will stop anything from crawling the site.
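For reference, a minimal robots.txt that blocks all well-behaved crawlers looks like this (an empty Disallow line, by contrast, allows everything):

```text
# Block all crawlers from the whole site
User-agent: *
Disallow: /

# To allow everything instead, leave Disallow empty:
# User-agent: *
# Disallow:
```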
Not that I know of!
I think our customer is just a bit cautious that they aren't getting relevant traffic to their site.
I can't see why Google would think our website is relevant to anyone outside of the UK but I am finding it hard to see what search terms international visitors would be using. Any tips on this?
Thanks, at first I thought it was spam, but looking at the sources the international traffic is coming from, most of it is from Google, which makes me think there isn't much we can do about it.
The site isn't optimised for local listings as such, as it is a service available across the whole of the UK, but I don't see why we would be appearing in search results from other countries.
I have heard that some people may use IP address masking so that it looks like their traffic is coming from somewhere else in the world. Is it popular enough that a fair number of people are using it to visit our site?
Thanks Christian,
The IP address is UK-based.
The company is a service provider rather than an e-commerce store.
I'll check into the audit and try to narrow it down via traffic sources etc.
Regards
Dave
Hi All,
One of our clients has asked why they are getting so much traffic from other countries when they solely operate in the UK.
See the attached image for the percentage of visitors coming to the site from different countries.
Is this a normal ratio of traffic when we are only targeting UK customers?
Thanks
Thanks for the responses!
We only submit to decent niche or local directories, but have found a few of the smaller directories have this problem.
The site is set to redirect to the HTTPS version, so it sounds like it is just the directories using old scripts to validate the URL.
Cheers for the help though, it has cleared it up
Hi Everyone,
Whilst trying to submit some client sites to directories, we have been struggling to post any of our clients' sites that use the HTTPS protocol. When trying to submit, they usually come up with an error stating that the website could not be found, etc. It doesn't accept it if I remove the HTTPS or replace it with the HTTP version either.
Any ideas, or has anyone else had any issues with this? I couldn't find much information about it online anywhere.
Thanks
We are having the same issue with one of our sites. We have tried all of the suggestions we can find on the internet.
We have submitted our site to Bing Webmaster Tools, submitted the sitemap, checked the robots.txt...
It is now crawling the site but still not indexing any of the pages! The only thing left for us to do is wait and hope that Bing will eventually index it, but it seems weird, as it's being indexed fine in other search engines and generally getting good rankings for its keywords.
The website is www.thriveinlife360.com for reference.
We have a client base of well over 20 websites and all of them are being indexed in Bing, except from this one.
Would be interesting to know how many websites this affects and if there is a pattern within those that Bing doesn't like.
If you can cut and paste the text, then as a rule Google can index it.
Check out this link for more information straight from the horse's mouth: http://googlewebmastercentral.blogspot.co.uk/2011/09/pdfs-in-google-search-results.html
I would definitely add rel canonical tags to the website pages to let Google know which is the original page, as Robert has suggested.
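For anyone unsure of the syntax, a canonical tag is a single line in the head of the duplicate page pointing at the original - the URL below is just a placeholder:

```html
<!-- Goes in the <head> of the duplicate page, pointing at the preferred URL -->
<link rel="canonical" href="https://www.example.com/original-page/" />
```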