Content Delivery Network
-
Hi,
I have a question: my site is hosted on a server in the same city as my target audience.
Should I use a Content Delivery Network anyway?
Even if the closest edge location of the CDN is in a neighboring country?
Thank you
Paulo
-
Paulo, you'll have to test the speed of your site yourself, but you can do so with a tool like: http://www.uptrends.com/aspx/free-website-server-network-monitoring-tool.aspx
It will display results from cities and countries around the world and help you decide whether or not you need a CDN.
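A multi-location monitoring tool is the easiest option, but you can also get a rough number yourself from any machine you can log into. Here is a minimal sketch in Python (standard library only; `https://example.com` is a placeholder for your own site, and a handful of sequential fetches is only a rough indicator, not a substitute for a real monitoring service):

```python
import time
import urllib.request

def median_fetch_time(url, runs=5):
    """Fetch `url` several times and return the median total time in seconds.

    Each sample times the full request: DNS lookup, connection,
    and reading the entire response body.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # read the whole body, not just the headers
        samples.append(time.perf_counter() - start)
    samples.sort()
    return samples[len(samples) // 2]  # median of the runs

# Hypothetical usage, run from servers in different cities:
#   median_fetch_time("https://example.com")
```

Run it from a machine near your audience and from one abroad; if the numbers are already low everywhere that matters to you, a CDN with a distant edge location is unlikely to help much.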
-
Ryan,
Thank you for your answer, but what I would like to know about is speed: will a CDN make the site load faster in my situation?
Paulo
-
If the content of your site is location-specific, written in your target audience's language, and served on a TLD aimed at that audience, you shouldn't have to worry much about where the CDN's edge location is, especially if the CDN speeds up the user experience in all locations.
You can also use advertising that is city- or neighborhood-specific via Google AdWords and Facebook to help drive more initial traffic and possibly gain more links from other local sites.
If applicable, also add your site to location services such as Yelp, Google Local, etc.
All of these things will help your target-city rankings more than the CDN's server location will.
Related Questions
-
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of size charts that we use as pop-ups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if pages have similar content, the search engine spider won't know which one to index? Also, what is the best way to handle the URLs returning 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac260
-
Content not being spidered
I've got a site with some serious content issues. The builder of the template doesn't understand what I'm asking (they're confusing spidering with indexing). If the page is run through a spider simulator (web confs won't work on this site for some reason), it shows that the content is not being seen by Google. The template is Momentum, on Joomla. Most other sites I've found on the web have a similar issue: basically it's reading the text in the header and footer, but nothing in the body. Any thoughts? www.rocksolidroof.com
Technical SEO | GregWalt0
-
Mirrored content/images
We are currently in the process of creating a new website in place of our old site (same URL, etc.). We've recently created another website which has the same design, layout, pictures, and general site architecture as our new site will have. If I were to add alt text to images on only one site, would we still be penalised by Google because the sites 'look' the same, even though they will have completely different URLs and different focuses on a similar topic? Content will be different also, but both sites will focus on a similar subject. Thanks
Technical SEO | onlinechester0
-
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally built the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.
In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.
Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and ignore the dev site content. I also checked the robots.txt file on the dev site, and this was blocking search engines too. But the dev site is still being found in Google wherever the live site should be found. When I do find the dev site in Google, it displays this:
Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.
This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain: when I visit Remove URLs and enter dev.rollerbannerscheap.co.uk, it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a subdomain, not a page. Can anyone help please?
Technical SEO | SO_UK0
-
Is anyone using Canonicalization for duplicate content
Hi, I am trying to find out if anyone is using canonicalization for duplicate content on a Joomla site. I am using Joomla 1.5 and am trying to find either a module for this, or how to do it manually, as I have over 300 pages of duplicate content because I am not using this technique. Any help and advice would be great.
Technical SEO | ClaireH-1848860
-
Rel=canonical for similar (not exact) content?
Hi all, We have a software product, and SEOmoz tools are currently reporting duplicate content issues in the support section of the website. This is because we keep several versions of our documentation, covering the current version and the previous 3-4 versions as well. There is a fair amount of overlap in the documentation. When a new version comes out, we simply copy the documentation over, edit it as necessary to address changes, and create new pages for the new functionality. This means there is probably an 80% or so overlap from one version to the next. We were previously blocking Google (using robots.txt) from accessing previous versions of the software documentation, but this is obviously not ideal from an SEO perspective. We're in the process of linking up all the old versions of the documentation to the newest version so we can use rel=canonical to point to the current version. However, the content isn't all exact duplicates. Will we be penalized by Google because we're using rel=canonical on pages that aren't actually exact duplicates? Thanks, Darren.
Technical SEO | dgibbons0
-
Question about content on ecommerce pages.
A long time ago we hired an SEO company to do SEO on our website, and one of the things they did was write long text on the category pages of our products. Example here: http://www.theprinterdepo.com/refurbished-printers/wide-format-laser-refurbished-printers Now my marketing person is asking whether it's possible to put the text below the items. Technically, I will find out how to do it, but from your SEO experience, is it good or bad? What if we shorten those texts to one paragraph only? Thanks
Technical SEO | levalencia10
-
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com, and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel=canonical, but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name, and using robots.txt, be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
Technical SEO | mreeves0