
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hey guys. I have a server-related question. One of our websites is hosted with a nasty slow company, and we want to make a change. The problem is that the site is 6 months old: it started on one server, the client moved it to this slow host about 2 months ago, and now we want to move it again. Will this negatively affect search engine rankings? As ever, thanks in advance 🙂

    | Nextman
    0

  • Need help: Do canonical tags do the exact same thing that WordPress already does with its permalink function, or are these two separate things? Thank you.

    | bonnierSEO
    1

  • Hi, I have a domain with no keywords in it, and I've been using it for years. Now I have bought another domain with the keyword in it. I want to work on SEO for the second domain, the one with the keyword. What is the best way to work this out? A 301? Duplicate the site? Redirect in some other way?

    | mgfarte
    0

  • Hello, I have a doubt. In my Webmaster Tools my sitemap is showing like this: | /sitemap.xml | OK | Images | Nov 27, 2011 | 2,545 | 1,985 |. I am not sure why the type is showing as Images. I have one blog attached to the same Webmaster account and it is showing correctly: | /blog/sitemap.xml | OK | Sitemap | Nov 28, 2011 | 695 | 449 |.

    | idreams
    0

  • I have a client that wants to set up WordPress for their business site. However, they are really concerned about the load time a WordPress install creates on their site in the root. So, they want to set up WordPress on a subdomain on a separate server, for example: abc.domain.com. As far as SEO for the main site is concerned, what are the advantages and disadvantages of using a separate server with a subdomain?

    | VidenMarketing
    0

  • My company has an international website, and because of a technical issue visitors in one of our main countries cannot visit the "www" version of our site. Currently, the www version is our preferred domain, and the non-www redirects to it. To solve this problem, I was thinking of proposing the following and would greatly appreciate any feedback! (Note: If you answered my www vs. non-www question, thanks - this is a follow-up.)
    1. Set the non-www site as the preferred version
    2. Redirect from www to non-www
    3. Contact our current links and ask them to change to the version without "www"
    4. Change canonical URLs to the version without "www"

    | theLotter
    0

  • Hi there Mozzers! I have a subdomain with duplicate content and I'd like to remove these pages from the mighty Google index. The problem is: the website is built in Drupal and this subdomain does not have its own robots.txt. So I want to ask you how to disallow and noindex this subdomain. Is it possible to add this to the root robots.txt?
    User-agent: *
    Disallow: /subdomain.root.nl/
    User-agent: Googlebot
    Noindex: /subdomain.root.nl/
    Thank you in advance! Partouter

    | Partouter
    0
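One wrinkle in the question above: robots.txt is fetched per host, so directives in the root domain's file cannot block subdomain.root.nl, and `Noindex:` is not an officially supported robots.txt directive. If the subdomain shares the root site's document root, one common workaround is to serve it a separate robots file via mod_rewrite; a sketch, where the filename robots_subdomain.txt is a hypothetical choice:

```apache
# .htaccess sketch, assuming both hosts share one document root and
# mod_rewrite is enabled: answer requests for the subdomain's
# robots.txt with a dedicated file instead of the root one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.root\.nl$ [NC]
RewriteRule ^robots\.txt$ /robots_subdomain.txt [L]
```

robots_subdomain.txt would then contain a plain block-all rule (User-agent: * / Disallow: /), while the root robots.txt stays untouched.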

  • I have 300 URL-specific websites that rank well in Yahoo and Bing. Unfortunately I don't have access to the websites due to a previous marketing agreement (before my time). I do have access to the application that is iframed into the websites. I was thinking about adding a paragraph below the application with a link to the primary website. How does Google look at these links? If I add the link, there will be an additional 300 links showing up at the same time - not what they want to see, from my personal knowledge base. At the same time, it's not black-hat SEO; I am just trying to link to the other websites which I own, which are related. What are people's thoughts?

    | FidelityOne
    0

  • The canonical URLs (and all our link-building efforts) are on the www version of the site. However, the site is having a massive technical problem and we need to redirect some links (some of which are very important) from the www to the non-www version of the site (for these pages the canonical link is still the www version). How big of an SEO problem is this? Can you please explain the exact SEO dangers? Thanks!

    | theLotter
    0

  • Sort of a strange situation I'm having and I wanted to see if I could get some thoughts. Here's what has happened... Monday morning, I realized that my website, which had been showing up at the bottom of page 2 for a specific result, had now been demoted to the bottom of page 6 (roughly a 40 spot demotion). No other keyword searches were affected. I immediately figured that this was some sort of keyword-specific penalty that I had incurred. I had done a bit of link building over the weekend (two or three directory type sites and a bio link from a site I contribute to). I also changed some anchor text on another site to match my homepage's title tag (which just so happened to be the exact phrase match I had dropped in) - I assumed this was what got me. I was slowly beginning to climb up the rankings and just got a bit impatient/overzealous. Changed the anchor text back to what it originally was and submitted a reconsideration request on Tuesday. This morning, I get the automated response in Webmaster Tools that no manual action had been taken. So my question is, would this drop have been an automated deal? If that's the case, then it's going to be mighty hard to pinpoint what I did wrong, since there's no way to know when I did whatever it was to cause the drop. Any ideas/thoughts/suggestions to regain my modest original placement?

    | sandlappercreative
    0

  • Hi, I have a UK website which was formerly ranked 1st in Google.co.uk and .com for my keyword phrase and has recently slipped to 6th in .co.uk, but is higher, in position 4, in Google.com. I have conducted a little research and can't say for certain, but I wonder if it is possible that too many of my backlinks are US-based and therefore Google thinks my website is also US-based. I checked Google Webmaster Tools and we are geo-targeted to the UK. Our server is also UK-based. Does anyone have an opinion on this? Thanks

    | tdsnet
    0

  • Good afternoon. I've tried searching for an answer to the following question, but I believe my circumstance is a little different than what has been asked in the past. I currently run an Australian website targeted at a specific demographic (50-75) and we produce a LARGE number of articles on a wide variety of lifestyle segments. All of our focus up until now has been on Australia and our SEO and language are dedicated to this. The next logical step in my mind is to launch a mirror website targeted at the US market. This website would be a simple mirror of a large number of articles (1000+) on subjects such as Food, Health, Travel, Money and Technology. Our current CMS has no problems in duplicating the specific items over and sharing everything; the problem is that we currently use a .com.au domain and the .com domain is unavailable and not for sale, which means we would have to create a new name for the US-targeted domain. The question is: how will mirroring this information, targeted at the US, affect us on Google, and would we be better off getting a large number of these articles re-written by a company on freelancer.com etc.? Thanks,
    Drew

    | Geelong
    0

  • Need a little bit of help on this one. I have a product page which actually has 3 products on it: www.example.com/products. I thought it would be best for each product to have a page of its own: www.example.com/product-1, www.example.com/product-2, www.example.com/product-3. However, my question is: where should the 301 for the old page with the 3 products, www.example.com/products, point? Can you do a 301 to all the new product pages? Hope that makes sense. Kind Regards,

    | Paul78
    0

  • Hello, what does the following command mean?
    User-agent: *
    Allow: /
    Does it mean that we are blocking all spiders? Is Allow supported in robots.txt? Thanks

    | seoug_2005
    0
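For reference, that directive is the opposite of blocking: it permits everything. Allow is not part of the original robots.txt standard, but the major engines honour it. Side by side:

```
# Permits all spiders to crawl everything:
User-agent: *
Allow: /

# Blocking all spiders would instead be:
# User-agent: *
# Disallow: /
```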

  • As stated in the question, we have 2 subdomains that contain over 2,000 reported errors from SEOmoz. The root domain has a clean bill of health, and I was just wondering if these errors on the subdomains could have a negative effect on the root domain in the eyes of Google. Your comments will be appreciated. Regards, Greg

    | AndreVanKets
    0

  • Hi, I am looking to promote my home page, which is a lifestyle magazine, www.in2town.co.uk, and I am not sure what keywords I should be using to promote it. I am doing OK for the keyword lifestyle magazine, but I am struggling with what other keywords I should be using to get people to the home page of the magazine. The magazine is nearly finished and we still have a couple of finishing touches to do, but the basics of the magazine are as follows: holiday and travel news, soap gossip, celebrity gossip, product reviews, lingerie brands, gastric band hypnotherapy, health, fashion and beauty, and holiday reviews. I want the home page to be the main page where everyone visits, but I am not sure what I should be doing to accomplish this. Any ideas would be of great help.

    | ClaireH-184886
    0

  • Hi all, we have a toys website that has several categories. It's set up such that each product has a primary category amongst the categories within which it can be found. For example, Addendum's primary URL is http://www.brightminds.co.uk/childrens-toys/board-games/addendum.htm but it can also be found here: http://www.brightminds.co.uk/learning-toys/maths-learning/addendum.htm. Hence, in the head for the second URL it has a rel=canonical that points to the first URL. For some reason, though, SEOmoz ignores this and reports duplicate page content. It doesn't seem to record the canonical tag either. Any ideas what's going on? Thanks, Josh.

    | joshgeake_gmail.com
    0

  • I have received lots of warnings because of long URLs. Most of them are because my website has many attributes to FILTER out products, and each time the user clicks on one, it is added to the URL. Please see my site here: www.theprinterdepo.com. The warning is: "Although search engines can crawl dynamic URLs, search engine representatives have warned against using over 2 parameters in any given URL." The question to the community is: what should I do? These attributes really help the user find the products more easily. I could remove some of the attributes, but I am not sure if my ecommerce solution (Magento) allows me to change this behaviour so that it does not use query-string parameters.

    | levalencia1
    0

  • I am thinking of implementing Coral CDN on my site, however I have serious doubts. It says you just add .nyud.net to the end of your URLs. Now that is easy, but as I read it, it totally messes up my URL structure. How do you think Google handles this? Does it handle it at all? I would like to implement a free CDN for that particular page. Does anybody know any good free service? Anybody? Experience?

    | sesertin
    0

  • I see that on sites like Scriptlance, webmasters post projects for SEO services and bidders offer to get you to the top of Google in 30 days. Is that out of the question, or is it reasonable because they use a service like SEOmoz?

    | webtarget
    0

  • I was thinking of using 301 redirects from trailing slashes to no trailing slashes for my URLs, e.g. www.url.com/page1/ 301 redirects to www.url.com/page1. I already have a redirect for non-www to www. Just wondering, in my case would it be best to continue using htaccess for the trailing slash redirect, or just go with canonical URLs?

    | upick-162391
    0
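If the htaccess route is chosen, a minimal sketch of the trailing-slash removal, assuming Apache with mod_rewrite and that real directories should keep their slash:

```apache
RewriteEngine On
# Leave genuine directories alone so their URLs still resolve:
RewriteCond %{REQUEST_FILENAME} !-d
# 301 anything else ending in "/" to the slash-less version:
RewriteRule ^(.*)/$ /$1 [R=301,L]
```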

  • I have recently created a page and added expires headers, non-configured ETags and gzip to the htaccess code, and just after that, according to Pingdom Tools, my page load time has doubled, although my YSlow points went from 78 to 92. I always get a little bit lost with this technical issue. I mean, obviously a site should not produce worse results after adding these parameters, and this increase in page load time should rather be due to bandwidth usage. I suppose I should leave this stuff in the htaccess. Then what is an accurate way to know whether you have made a real improvement to your site, or whether your load time has really gone up? This question is also relevant to CSS sprites, as I always read that sometimes spriting every picture is a waste of resources. How can you decide when to stop?

    | sesertin
    0

  • Hello, we're a new member of SEOMOZ and love it but have a problem. Obviously the reason for joining is to learn more about SEO and hopefully get our website ranked a lot better than it currently is. However, one particular page we've chosen to optimise (based on your tips) has since lost ranking and we can no longer find it in the searches. Is there a reason for this? We've only made the on page changes you suggested and also added more external and internal links so I can't understand why it would no longer be listed in the searches? I look forward to your reply/feedback. Many thanks Peter

    | mybabyradio
    0

  • Hello; I have a domain which was registered in 2006, and I opened the website 1 month ago and started to do some SEO: I bought 50 PR1-PR7 links, 2,500 social bookmarks, 2,000 blog links and also some wiki links. Am I doing good or bad?

    | Sadullah
    0

  • Can I verify an IP address in Google Webmaster Tools to search for any 404s? Or maybe I could do it with SEOmoz tools? Thanks!

    | tylerfraser
    0

  • How can I check a URL to see if there are links going to it (internal and external)? And how can I check a large number of URLs to see if there are any links going to them? Thanks!

    | tylerfraser
    0

  • Does anyone have much experience implementing Schema.org metadata for reviews? I run and operate a website that reviews study abroad programs and we've started the process of implementing this code to receive rich SERP snippets. We're going to use the framework used here: http://schema.org/Review My main question is how long does it generally take to see the results? I would also like to hear from people who implemented this code, but ran into problems, and how they overcame them. Any other tips and advice would be greatly appreciated! Cheers, Andrew

    | dunklea
    0
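For anyone following along, a minimal sketch of Review markup in microdata form; the program name, author and rating values below are placeholders, not taken from the site in question:

```html
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed">Example Study Abroad Program</span>
  reviewed by <span itemprop="author">A. Reviewer</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  <p itemprop="reviewBody">A short summary of the review.</p>
</div>
```

There is no guaranteed timeline for rich snippets to appear in the SERPs, but Google's Rich Snippets Testing Tool can at least confirm the markup parses before you wait on a recrawl.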

  • I have hundreds of domains that I have purchased over the years that aren't going anywhere except GoDaddy's Cash Parking system, which returns very little revenue, if any at all. I wonder if it would make more sense to just point these domains to actual e-commerce sites that I own. If so, how best to point these domains so that SEO credit is given properly? Most of these available domains don't have anything to do with the e-commerce stores, so I'm not sure it would help. Furthermore, if I were to purchase new domains that were more relevant to the keywords of our e-commerce sites, how best to set them up so we can generate traffic on them and point them over to the actual domains? Many thanks.

    | findachristianjob
    0

  • Hi, is this the correct code for redirecting the www to the non-www version on an Apache server?
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www.example.com
    RewriteRule (.*) http://www.example.com/$1 [R=301,L]
    Thanks

    | seoug_2005
    0
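As posted, the rule would redirect the www host back to a www URL, which loops rather than landing on the non-www version; the unescaped dots in the host pattern are also loose. A corrected sketch, with example.com standing in for the real domain:

```apache
RewriteEngine On
# Match only the www host (escaped dots, case-insensitive):
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
# Redirect to the bare domain, preserving the path:
RewriteRule (.*) http://example.com/$1 [R=301,L]
```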

  • Hi folks, here is our situation: we have an old brand domain, www.asia-hotels.com, that was redirecting to www.asiahotels.com. By mistake, we let that domain expire and only noticed the drop a month later. We lost all our pages, and this lasted for several weeks. I'm not sure of the exact date, but it was approximately around the 24th of December - what a merry Xmas! 😞 Since then we have repurchased the domain, put back all the pages as they were, and re-instated all the 301 redirects as they were. Since that date we haven't seen any uplift in our visits or visibility score. Did we do something wrong with our 301 redirects? I know for sure we used the ISAPI rewrite mod for the non-www domain, although I am not entirely sure how the www version has been handled. Is there something we should do at a DNS level to flag that the site is back? Should we present a reconsideration request? Any help would be greatly welcomed. Thanks for your help. Cheers, Freddy. More info: I placed a bit more detail and the visits graph on my blog: http://www.inmarketingwetrust.com.au/seo-effect-of-domain-expiry-on-301-redirects/. I am not sure if this is due to the fact that some information is cached, but when I looked at the site on Open Site Explorer I found the data still shows the sites as non-redirected: http://www.opensiteexplorer.org/asia-hotels.com/www.asia-hotels.com/a!comparison

    | Gus_Martin
    0

  • My site crawl diagnostics are showing a high number of duplicate page titles and content. When I look at the flagged pages, many errors are simply listed from multiple pages of product category search results. This looks pretty normal to me and I am at a loss to understand how to fix this situation. Can I talk with someone? Thanks, Gary

    | GaryQ
    0

  • I thought my site was fine until I joined SEOmoz and had my site crawled; it took a couple of hours to crawl the site. I received an e-mail: "This is a friendly notification that we successfully completed a Starter Crawl of your campaign. No need to stress, we made an easy-to-understand report so you can fix any errors we found." Hmmm. I was surprised at the number of errors that my page has. I went over the crawl diagnostics and found out that I have a large amount of duplicate page content. That can't be, since I placed the content on the site myself - all the pages from the back-end are single. I would appreciate it if someone from this wonderful forum educated me on this topic and, if possible, provided me with a straightforward solution to this problem. Looking forward to hearing some comments.

    | Tolod
    0

  • I have a page and I want to remove the .html ending from the URLs. What should I write in the htaccess?

    | sesertin
    0
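A common two-step sketch, assuming Apache with mod_rewrite: externally 301 /page.html to /page, then internally serve the .html file for the extensionless URL so nothing 404s:

```apache
RewriteEngine On
# 301 direct requests for *.html to the extensionless URL.
# Matching THE_REQUEST (the raw request line) avoids a rewrite loop:
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
# Internally map the clean URL back onto the real .html file:
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```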

  • I've got a client who verified their Google Places listing years ago, and no one knows who did it, so I can't access it. The business is now moving and I need to update the address. What should I do? Thanks

    | garymeld
    0

  • Guys, I am looking for a related-posts script or tool that can read my sitemap and post related articles under each of my articles. There are plugins like YARPP and LinkWithin, but they are for WordPress. I need something I can use with a normal HTML website.

    | Emeka
    0

  • So I was just about to add a whole heap of CMS-related folders to my robots.txt file to exclude them from search, and thought "hey, I'm publicly telling people where my admin folders are"...surely that's not right?! Should I leave them out of the robots.txt file, and hope for the best that they never get indexed? Should I use noindex meta data on every page? What are people's thoughts? Thanks, James PS. I know this is similar to lots of other discussions around meta noindex vs. robots.txt, but I'm after specific thoughts around the security aspect of listing your admin folders in a robots.txt file...

    | James-Distinction
    0

  • <iframe style="border: 2px #CCCCCC solid;" src="http://www.cpsc.gov/cgi-bin/javascripts/cpscrss.aspx" title="CPSC RSS Feed" frameborder="0" scrolling="auto" width="224" height="258"></iframe> That is the code my client wants to add to an internal page where we can keep updated news on a specific subject. The only problem is that this widget has links within it, and these links are "followed". Should I worry about these links being followed? There are quite a few; does anyone know if they will be counted if within an iframe, or is there a way to add a "nofollow" attribute to them? Can I somehow tell the htaccess to add nofollows to all links on specific pages? Any thoughts or solutions are greatly appreciated.

    | waqid
    0

  • I have some URLs on my site due to a rating counter. These are like:
    domain.com/?score=4&rew=25
    domain.com/?score=1&rew=28
    domain.com/?score=5&rew=95
    These are all duplicate content of my homepage and I want to 301 redirect them there. I have tried so far:
    RedirectMatch 301 /[a-z]score[a-z] http://domain.com
    RedirectMatch 301 /.score. http://domain.com
    RedirectMatch 301 /^score$.* http://domain.com
    RedirectMatch 301 /.^score$.* http://domain.com
    RedirectMatch 301 /[a-z]score[a-z] http://domain.com
    RedirectMatch 301 score http://domain.com
    RedirectMatch 301 /[.]score[.] http://domain.com
    RedirectMatch 301 /[.]score[.] http://domain.com
    RedirectMatch 301 /[a-z,0-9]score[a-z,0-9] http://domain.com
    RedirectMatch 301 /[a-z,0-9,=,&]score[a-z,0-9,=,&] http://domain.com
    RedirectMatch 301 /[a-z,0-9,=&?/.]score[a-z,0-9,=&] http://domain.com
    None of them works. Anybody? Solution? It would be very much appreciated.

    | sesertin
    0
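The reason none of those patterns match is that RedirectMatch (like Redirect) only ever tests the URL path - the query string after "?" is invisible to it, so no character class around "score" can reach it. mod_rewrite can test the query string explicitly; an .htaccess sketch, with domain.com standing in for the real host:

```apache
RewriteEngine On
# Match "score=" at the start or after "&" in the query string of a
# request for the site root (the path is empty in per-directory context):
RewriteCond %{QUERY_STRING} (^|&)score= [NC]
# The trailing "?" drops the old query string from the redirect target:
RewriteRule ^$ http://domain.com/? [R=301,L]
```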

  • What is the proper way to redirect any URL containing a given word (anywhere in the URL) to another specified URL? Is it like this?
    RedirectMatch 301 ^thisword$ http://domain.com/newlocation

    | sesertin
    1
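Not quite: RedirectMatch applies a regular expression to the URL path, and since every path starts with "/", the anchored pattern ^thisword$ can never match. Leaving the anchors off matches the word anywhere in the path:

```apache
# Redirect any path containing "thisword" (the word and target URL are
# the question's own placeholders):
RedirectMatch 301 thisword http://domain.com/newlocation
```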

  • Hi all, I have an issue whereby print versions of my articles are being flagged up as "duplicate" content / page titles. In order to get around this, I feel that the easiest way is to just add them to my robots.txt document with a disallow. Here is my URL make-up - normal article: www.mysite.com/displayarticle=12345; print version of my article: www.mysite.com/displayarticle=12345&printversion=yes. I know that having dynamic parameters in my URL is not best practice, to say the least, but I'm stuck with this for the time being... My question is, how do I add just the print versions of articles to my robots file without disallowing the articles too? Can I just add the parameter to the document like so? Disallow: &printversion=yes I also know that I can add a meta noindex, nofollow tag into the head of my print versions, but I feel a robots.txt disallow will be somewhat easier... Many thanks in advance. Matt

    | Horizon
    0
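One hedge on the idea above: a bare `Disallow: &printversion=yes` would be read as a literal path prefix and match nothing. Wildcards are a non-standard extension to robots.txt, but Google and Bing honour them, which makes a pattern like this a reasonable sketch:

```
# Non-standard "*" wildcard, honoured by Googlebot and Bingbot:
User-agent: *
Disallow: /*printversion=yes
```

A caveat worth noting: pages blocked in robots.txt can still appear in the index if external links point at them, which is one reason the meta noindex route is often preferred despite being more work.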

  • Hi, okay, we have 1 main site. A few years back we went down the road of subdomains and generated about 10. They have PageRank and age, but we wish to move them back to the main website. What is the correct or best way to achieve this? 1. Copy all content to the main website, creating duplicate pages, and then use redirects from the sub pages to the new duplicate pages on the main domain; or 2. Write new content on the main domain for the subdomain pages and redirect to the new content. The problem with 2 is the amount of work involved...

    | NotThatFast
    0

  • I'm building a site for a manufacturer where all products will be entered as blog posts. Then I have to build some awareness for the site. What persona am I best off doing this with? The company name, my name, a fictitious name that works for the company, or some other? Any Mozzers want to share thoughts or past strategies that have been successful?

    | waynekolenchuk
    0

  • For example, take a look at http://www.dueds.com and scroll all the way to the bottom of the page.  See the link in the bottom left?  Does the fact that it is pushed all the way down to the bottom make the link worth less than if it was directly under the social media buttons?

    | adriandg
    0

  • Hi, we sell Joomla addons here: http://www.joomclub.org/joomla-extensions/. This site was started in September 2011 and it's converting very well from other sources, but I need it to rank high and get organic traffic as well. We have over 295 pages and almost all are indexed, I guess, but the site doesn't come up anywhere in the SERPs, so I thought of investigating it and found that all the product pages' h1 tags were messed up and gone. I just fixed them right now. So, how do I tell Google to refresh, recrawl and reindex my pages?

    | qubesys
    0

  • We know that Google now gives a penalty to a whole site if it finds content it doesn't like or duplicate content, but has anyone experienced a penalty from having duplicate content on their site which they have added noindex to? Would Google still apply the penalty to the overall quality of the site even though it has been told to basically ignore the duplicate bit? The reason for asking is that I am looking to add a forum to one of my websites, and no one likes a new forum. I have a script which can populate it with thousands of questions and answers pulled direct from Yahoo Answers. Obviously the forum will be 100% duplicate content, but I do not want it to rank anyway, so if I noindex the forum pages, hopefully it will not damage the rest of the site. In time, as the forum grows, all the duplicate posts will be deleted, but it's hard to get people to use an empty forum, so I need to 'trick' them into thinking the section is very busy.

    | Grumpy_Carl
    0

  • I want to have a page that describes a specific property and/or product. The top part of the page has media options such as video and photos, while the bottom includes the description. I know I can set up the media in tabs separated by JavaScript, so that everything resides on one page and there are no duplicate content issues. Example: http://www.worldclassproperties.com/properties/Woodside BUT what if I need the photos and the videos to have separate URLs so I can link to them individually? For example, for a real estate site blog, I may want to send visitors to the page of the home tour. I don't want to link them to the version of the page with the photos, because I want them to arrive on the video portion. Example: http://www.worldclassproperties.com/properties/Woodside?video=1 Is there any way to get around the duplicate content problem that would result from the repeated product/property description? I do not have the resources in the budget to make two unique descriptions for every page.

    | WebsightDesign
    0

  • Hello there, can anyone refer me to a document that will explain htaccess files from beginning to end? I have done some SEO projects, so I know the benefits, but I don't know the code for the files. Thanks

    | sesertin
    0

  • Hi all. I purchased a domain name two years ago with the idea of offering a wide range of services. I also created a subdomain providing a specific service for a highly competitive keyword. Sadly, plans went wrong and I didn't use the root domain name at all, just the subdomain providing that service. There aren't many links to that subdomain, but all are quality links; until recently I managed to keep positions between 5 and 7 without any effort, but yesterday I saw that it has dropped to 9. The question is, before I start to build links and write articles to get my domain back up: is it worth moving that subdomain to my original root domain? As I said, there aren't many links to that subdomain, and it only has PageRank 1. Also, for the last year the original root domain was redirected (301) to the subdomain to not lose traffic, and I'm scared that if I reverse this procedure and redirect my subdomain to the root domain, Google will get confused. It's a tricky question, I know 🙂

    | VasilTasev
    0

  • Hi, I checked our original RSS feed - added it to Google Reader and all the links go to the correct pages - but I have also set up the RSS feed in FeedBurner. However, when I click on the links in FeedBurner (which should go to my own website's pages), they all go to spam sites, even though the title of the link and the excerpt are correct. This isn't a WordPress blog RSS feed either, and we are on a very secure server. Any ideas whatsoever? There is no info online anywhere and our developers haven't seen this before. Thanks

    | Kerry22
    0

  • Hi there, I have a domain which is built in Drupal 1.5. We managed to redirect all nodes to the actual SEF URL. The one issue we have now is redirecting the taxonomy URLs to the SEF URL. The obvious answer is to do a manual 301 redirect in the htaccess file, but this will be a long process as there are over 500 URLs affected. Is there a better way to do this automatically within Drupal? Your thoughts and ideas are welcome.

    | stefanok
    0
