I've just been running a report on a site and noticed that while it has a .co.uk domain, it is hosted on a server in the United States. I just wondered whether anyone knows if the physical location of a server matters to search engines for ranking purposes, especially with local search?
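For anyone who wants to check where a site is actually hosted, here's a minimal sketch; example.co.uk is a placeholder domain, and ip-api.com is just one of several free geo-IP lookup services:

```python
# Rough sketch: resolve the domain, then ask a public geo-IP service
# where that address sits. example.co.uk is a placeholder, and
# ip-api.com is one free lookup service among several.
import json
import socket
from urllib.request import urlopen

domain = "example.co.uk"
ip = socket.gethostbyname(domain)
with urlopen(f"http://ip-api.com/json/{ip}") as resp:
    geo = json.load(resp)
print(domain, ip, geo.get("country"), geo.get("city"))
```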
ben_dpp
@ben_dpp
Job Title: IT Manager
Company: DP Publicity
Favorite Thing about SEO
The ever evolving nature of SEO and the way it encourages you to keep learning.
Latest posts made by ben_dpp
- Does the physical location of a server affect the local rankings of a site?
- RE: Keyword Stuffing - Image Alt
Did you get this sorted?
Looking at your page now, it appears that while you have a lot of occurrences of the same keywords (track 43, curtain 29, and curtain track 22), their density is quite low: 4% or less.
This wouldn't be classed as keyword stuffing as far as I'm aware, as you have plenty of other relevant content to dilute their frequency.
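For reference, here's a rough sketch of the kind of check I mean; page.txt stands in for a plain-text dump of your page copy, and this naive word count is only an approximation of how a search engine would weigh things:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of a phrase as a percentage of total words on the page.
    A naive approximation -- real tools weigh titles, alt text, etc."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    body = " ".join(words)
    # Whole-word matches only, so "track" doesn't count inside "tracks".
    hits = len(re.findall(r"\b" + re.escape(phrase.lower()) + r"\b", body))
    return hits / len(words) * 100

page_text = open("page.txt").read()  # placeholder for a dump of the page copy
for kw in ("track", "curtain", "curtain track"):
    print(kw, round(keyword_density(page_text, kw), 1), "%")
```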
- RE: Stock lists - follow or nofollow?
Thanks for the response.
The two routes I was looking at are both for the user: I'm looking at either not allowing search engines to serve the content that can expire, or redirecting them to similar vehicles/relevant content within the site.
I was purely wondering which would have additional benefits with Google, as the first option is the easier of the two development-wise.
- Stock lists - follow or nofollow?
A bit of a catch-22 position here that I could use some advice on, please!
We look after a few car dealership sites that have daily (some three times a day) stock feeds that add and remove cars from the site, which in turn creates/removes pages for each vehicle.
We all know how much search engines like sites whose content is updated regularly, but the frequency it happens on our sites means we are left with lots of indexed pages that are no longer there.
Now my question is: should I nofollow/disallow robots on all the vehicle detail pages, meaning the list pages will still be updated daily for "new content", or allow Google to index everything and manage the errors so they redirect to relevant pages?
Is there a "best practice" way to do this, or is it really personal preference?
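For illustration, here's a rough sketch of the two routes in a toy Flask app; the stock dictionary and URLs are stand-ins, not anything from our actual sites:

```python
# Sketch of both routes. STOCK stands in for the live stock feed;
# the URLs are placeholders, not our production setup.
from flask import Flask, abort, redirect

app = Flask(__name__)
NOINDEX_DETAIL_PAGES = True  # route A if True, route B if False

STOCK = {"ab123": "<h1>Ford Fiesta</h1>"}  # stand-in for the stock feed

@app.route("/vehicles/<vehicle_id>")
def vehicle_detail(vehicle_id):
    page = STOCK.get(vehicle_id)

    if NOINDEX_DETAIL_PAGES:
        # Route A: keep every detail page out of the index, so expired
        # stock can simply return 410 without leaving stale results behind.
        if page is None:
            abort(410)
        resp = app.make_response(page)
        resp.headers["X-Robots-Tag"] = "noindex, follow"
        return resp

    # Route B: let everything be indexed, and 301 expired vehicles to the
    # nearest still-live content (here a generic used-car listing page).
    if page is None:
        return redirect("/vehicles/used", code=301)
    return page
```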
- RE: URLs appearing twice in Moz crawl
Hi Jon,
Just to clarify: is your report showing the domains with http and without, or with www. and without, as the duplicate content?
I have seen this issue on a couple of my sites, where it reports with and without www., and have just implemented rel=canonical to point towards one or the other, as I am very confident that this will resolve the issue.
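As a rough illustration (not our exact implementation, and example.com is a placeholder), the tag can be generated like this, here normalising everything onto the www. host:

```python
# Minimal sketch: emit a canonical tag that always points at the www.
# host, whichever variant the page was actually served on. Dropping the
# query string is a deliberate choice here, not a requirement.
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(requested_url: str) -> str:
    parts = urlsplit(requested_url)
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    canonical = urlunsplit(("https", host, parts.path, "", ""))
    return f'<link rel="canonical" href="{canonical}" />'

print(canonical_tag("http://example.com/page?utm_source=x"))
# -> <link rel="canonical" href="https://www.example.com/page" />
```

The same approach works in reverse if you prefer the bare domain; the important part is that every variant declares the same canonical URL.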
I'll let you know the outcome.
Regards,
Ben
Proud to be the largest automotive retail advertising agency in Europe, specialising in the management of some of the UK's biggest car dealer groups representing the world's best-known brands.
A full-service facility with around seventy colleagues, we know what success looks like and take the risk out of marketing. Gone are the days of putting your finger in the air: our success comes from applying a scientific approach to every strategy and demonstrating the effect of every pound spent, an approach few are able to match.