
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • We have an e-commerce site and an official blog, hosted under our domain, that gives advice about our products. Usually we build links directly to our site. Recently our rankings started going down. We have also been experiencing backlash over spam based on our link building (we are working on this, including a change of staff, but we cannot be sure it will not happen again). This backlash has come through our social networking outlets (Facebook) in the form of very negative posts to our pages. One of our "SEOs" has devised a plan to use secondary blogs that we would build links for. Each blog would contain links back to our website. The idea is that the blog acts as a gate, in a sense, so that backlash is either posted on the blog or directed at the blog. We would also attempt to raise the page authority of these secondary blogs so that, in essence, they act as high-page-authority links back to our website. The concern is that these secondary blogs may undermine the legitimacy of the official primary blog, which is still in its early stages as far as ranking and authority go. We are also concerned that this technique would further undermine the legitimacy of the website itself by creating a larger spam-like presence, since visitors may see through the use of the secondary link-through blogs.

    | ctam
    0

  • We have an e-commerce website, our own homegrown one :-). We recently visited Google Webmaster Tools and saw that Google reports duplicate meta tags for some main categories and subcategories. Each product category on our site has subcategory sub-URLs: "Bestseller", "On Sale", "Just Arrived". These sub-URLs are not really real categories, so we cannot write totally unique descriptions and titles for those URLs: domain.com/category
    domain.com/category/bestseller
    domain.com/category/on-sale
    domain.com/category/just-arrived We are thinking about two solutions: 1. A canonical URL on each subcategory pointing to the main category.
    2. Adding the word "bestseller", "on sale" or "just arrived" in front of the meta title/description; we can do this from code. I personally opt for option 1, but I am a little unsure which is the best way to go. Thanks in advance for your advice.

    | areygie
    0
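A minimal sketch of option 1 above, using the asker's placeholder domain: each sub-URL declares the main category page as its canonical in the `<head>`.

```html
<!-- On domain.com/category/bestseller (and /on-sale, /just-arrived):
     point the canonical at the main category page -->
<link rel="canonical" href="http://domain.com/category" />
```

With this in place, engines consolidate the duplicate titles and descriptions onto the main category URL, so option 2's extra wording becomes optional.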

  • We've noticed that the unit # is no longer showing on a client's Places profile page. Any thoughts on why, and whether it is relevant to rankings? Places page - https://skitch.com/kyegrace/8fimr/vuppie-real-estate-team or http://maps.google.ca/maps/place?hl=en&qscrl=1&nord=1&rlz=1T4GGNI_en-GBCA461CA461&gs_upl=&ion=1&bav=on.2,or.r_gc.r_pw.r_cp.,cf.osb&biw=1366&bih=641&wrapid=tlif133037839755610&um=1&ie=UTF-8&q=realtor&fb=1&gl=ca&hq=realtor&hnear=0x548673f143a94fb3:0xbb9196ea9b81f38b,Vancouver,+BC&cid=5594900399034912659&ei=o_ZLT4XHM6ObiQKZnbCdDw&sa=X&oi=local_result&ct=placepage-link&resnum=8&ved=0CIMBEOIJMAc and as it appears in the backend: https://skitch.com/kyegrace/8fimk/google-places-analytics Any insight greatly appreciated!

    | kyegrace
    0

  • Yesterday my homepage had a PageRank of 4 according to the Google toolbar (which I know isn't accurate, but bear with me), and the majority of my other pages were PR2. Today every page on my site is PR N/A, or PR -1. I also used a couple of free PageRank checkers to make sure it wasn't just my browser. The only link I have ever purchased is on botw.org. I submitted my site to 6 free directories about two weeks ago, but only to the highly recommended ones, and with different anchor text at each. I have no messages in my Google Webmaster Tools account, nor do I have any significant crawl errors. My SERPs and resulting traffic have not (yet) been affected. Any ideas what's going on? I did start a Google AdWords campaign about 10 days ago that ran for 3 days and then got suspended due to the nature of my website. Could that be it?

    | PatrickGriffith
    0

  • Hi everybody, I am having kind of an issue with the results Google is showing for my site. I have a multilingual site whose main language is Catalan. But of course, if I am looking at results in Spanish (google.es) or in English (google.com), I want Google to show the results with the proper URL, title and description. My brand is "Vallnord", so if you type this into Google you will be shown the result in Catalan (which is not optimized at all yet), but only if you search "vallnord.com/es" will you be shown the result in Spanish. What do I have to do in order for Google to read this the way I want? Regards, Guido.

    | SilbertAd
    0

  • Hi, there is a page I get when I do proper menu navigation: Caratlane.com > Jewellery > Rings > Casual Rings > http://www.caratlane.com/jewellery/rings/casual-rings/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html When I do a site search in my search box for the product code "JR00219", the same page appears with a different URL: http://www.caratlane.com/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html So there is duplicate content. How can we resolve it? Regards, Kathir caratlane.com

    | kathiravan
    0

  • Hello SEO Mozzers, Today I am wanting your feedback on a site that I recently went live with. My Google rankings for the main keywords are doing very well, considering the site has been live for 3 weeks now. I of course have a list of items that I'm still working on: completing meta description tags, title tags, adding copy content to category pages, updating h1 tags, working on our backlinking campaign, etc. The site is www.profitness-supplies.com. Let me know what you think, Mozzers.

    | seohive-222720
    0

  • I have a client that has been 301'ing Googlebot to the canonical page. This is because they have cart_id and session parameters in URLs. It mainly happens when Googlebot comes in on a link that has these parameters in the URL, as they don't serve these parameters to Googlebot at all once it starts to crawl the site.
    I am worried about cloaking, and I wanted to know if anyone has any info on this.
    I know that Google has said that anything where you detect Googlebot's user agent and treat it differently is a problem.
    If anybody has had any experience with this, I would be glad to hear it.

    | AlanMosley
    0
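One cloaking-safe alternative, sketched here as a hypothetical Apache .htaccess fragment: 301 every client, not just Googlebot, to the parameter-free URL, so all user agents are treated identically. The parameter names come from the question; note this simple rule drops the entire query string, so keeping other parameters would need more careful rules.

```apache
RewriteEngine On
# If the query string contains cart_id or session parameters...
RewriteCond %{QUERY_STRING} (^|&)(cart_id|session)= [NC]
# ...301 to the same path with no query string (the trailing ? drops it),
# for every visitor, so there is no user-agent detection at all
RewriteRule ^(.*)$ /$1? [R=301,L]
```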

  • I have restricted around 1,500 links, which are links to retailers' websites and affiliate links, according to Webmaster Tools. Is this the right approach? I thought it would affect the link juice. Or should I take the nofollow out of the links restricted by the robots.txt file?

    | ocelot
    0

  • If you embed a YouTube video on your page, does Google count that as part of their site speed calculation? Since it is in an iframe, I would think that it is not counted.

    | ProjectLabs
    0

  • Good afternoon from 13 degrees C and totally sunny Wetherby, UK 🙂 Am I right in thinking that the only way to get images appearing like this in your SERPs: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/innovia-merchant-immages-serpscopy.jpg is to be hooked up to Google Merchant? Which kind of means that if the site you're working on has no images, this type of enhancement is out of bounds? Thanks in advance, David

    | Nightwing
    0

  • Hi, I have a client's site that posts job openings. There is a main list of available jobs, and each job has an individual page linked to from that main list. However, at some point the job is no longer available. Currently, the job page goes away and returns a 404 status once the job is no longer available. The good thing is that the job pages get links coming into the site. The bad thing is that as soon as a job is no longer available, those links point to a 404 page. Ouch. Currently Google Webmaster Tools shows 100+ 404 job URLs that have links (maybe 1-3 external links each). The question is what to do with the job page instead of returning a 404. For business purposes, the client cannot display the content after the job is no longer available. To avoid duplicate content issues, the old job page should have some kind of unique content saying the job is no longer available. Any thoughts on what to do with those old job pages? Or would you argue that it is appropriate to return a 404 header plus an error page, since this job is truly no longer a valid page on the site? Thanks for any insights you can offer.
    Matthew

    | Matthew_Edgar
    1

  • Hi, I have been approached to do some SEO work for a site that has been hit badly by the latest Panda update (3.3). They also had a warning in their Google Webmaster Tools account saying they had unnatural-looking links to their site; they received this on 26 Feb, and it prompted them to stop working with their existing SEO company and look for a new one. Apparently their rankings for the keywords they were targeting have dropped dramatically, but it looks like just those they were actively building backlinks for; other phrases do not look affected. Before I take them on, I want to be clear that it is possible to help them reclaim their rankings. I have checked the site: the on-page SEO is good and the site build is good, with just a few errors to fix. But the links built by the SEO company are low quality, with a lot of spun articles and the same anchor text, so I see what the Google Webmaster Tools message is referring to. I do not think these links can be removed, as there are no contact details on the sites I checked. I have not checked all of them, but a random sample does not show promise; they are from low-authority domains. So if I am to take them on as a client and help them regain their previous rankings, what is the best strategy? Obviously they want results yesterday, and from our phone call they would rather someone else did the work than do it themselves, so my initial suggestion of adding better-quality content that others in their industry would link to as a reference did not go down well. To be fair, I think it is a time issue: there are only three people in the company and they are not technical at all. Thanks for your help, Sean

    | ske11
    0

  • For example, I'm thinking of running people through a squeeze page when they come from search engines (first-time visitors only, cookied)... example: http://www.whitehouse.gov/ Is it going to hurt SEO? Because basically you are serving a page that is different from the one displayed in the SERPs.

    | achilles13
    0

  • I had an issue where I was getting duplicate page titles for my index file. The following URLs were being viewed as duplicates: www.calusacrossinganimalhospital.com www.calusacrossinganimalhospital.com/index.html www.calusacrossinganimalhospital.com/ I tried many solutions and came across rel="canonical". So I placed the following in my index.html: I did a crawl, and it seemed to correct the duplicate content. Now I have a new message, and I just want to verify whether this is bad for search engines or normal. Please view the attached image (i9G89.png).

    | pixel83
    0

  • Is there a way for me to do a "view source" for an entire website without having to right-click every page and select "view source" for each of them?

    | SmartWebPros
    0
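For bulk "view source", a crawler such as wget in recursive mode is one option; another is a small stdlib-only Python sketch like the one below, which saves the raw HTML of every page in a URL list. The URLs and filenames here are placeholders.

```python
"""Bulk "view source": fetch and save the HTML of every page in a URL
list, instead of right-clicking each page by hand. Stdlib only."""
import urllib.request
from urllib.parse import urlparse


def filename_for(url):
    """Map a URL to a safe local filename, e.g.
    http://example.com/about/team -> example.com_about_team.html"""
    parts = urlparse(url)
    path = parts.path.strip("/").replace("/", "_") or "index"
    return f"{parts.netloc}_{path}.html"


def save_sources(urls):
    """Download each URL and write its raw HTML source to disk."""
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        with open(filename_for(url), "w", encoding="utf-8") as f:
            f.write(html)
```

Calling `save_sources(["http://example.com/"])` would write that page's source to `example.com_index.html` in the current directory.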

  • I have a client who wants to publish the same information about a hotel (summary, bullet list of amenities, roughly 200 words + images) to two different websites that they own. One is their main company website where the goal is booking, the other is a special program where that hotel is featured as an option for booking under this special promotion. Both websites are pulling the same content file from a centralized CMS, but they are different domains. My question is two fold: • To a search engine does this count as duplicate content? • If it does, is there a way to configure the publishing of this content to avoid SEO penalties (such as a feed of content to the microsite, etc.) or should the content be written uniquely from one site to the next? Any help you can offer would be greatly appreciated.

    | HeadwatersContent
    0

  • Hello, is it possible for a website to get penalised by Google because your hosting company blocked you from sending emails? Basically, I got a message from my hosting company saying that they were blocking me from sending emails from our server and domain because too many had mistakes or were complained about. The same day, we dropped from 2nd on a keyword to about 600th while still being ranked for other keywords. The drop was for our main keyword. Can the fact that we sent "bad emails" be related to a rank drop? For the record, those were confirmation emails for account creation; they were legit, not spam. That's off-topic, though.

    | EndeR-
    0

  • Hi everybody, my client owns a lot of domains related to his website, and I redirected them to the website. His website is www.vallnord.com, but if you type vallnordski, vallnordsnow, etc., they will go to the website without changing the URL: the address bar keeps vallnordski or vallnordsnow instead of showing vallnord.com. (Not very clear, actually; if you have 20 seconds to type them, you will see it clearly.) I was wondering whether this is good practice, or whether it is better to redirect someone completely (if they type vallnordski.com, take them to vallnord.com)? Is redirecting a good SEO practice? Regards, Guido.

    | SilbertAd
    0
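What the asker describes sounds like masked (frame) forwarding, where the alternate domain stays in the address bar. A true 301 redirect, sketched below as a hypothetical Apache .htaccess fragment for the domains named in the question, sends both users and link equity to the main domain.

```apache
RewriteEngine On
# Any request arriving on an alternate domain...
RewriteCond %{HTTP_HOST} ^(www\.)?vallnordski\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^(www\.)?vallnordsnow\.com$ [NC]
# ...is permanently redirected to the same path on the main domain
RewriteRule ^(.*)$ http://www.vallnord.com/$1 [R=301,L]
```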

  • Hi Mozzers, I have a site that has returned 5,168 duplicate content issues. Where would you start? I started sorting by page authority, from the highest (28) all the way down to 1. I did want to use the rel=canonical tag, as the site has many redirects already. The duplicates are caused by various category and cross-category pages and search results, such as ...page/1?show=2&sort=rand. I was thinking of going down the route of a URL rewrite and changing the search anyway. Is it worth redirecting everything, in terms of results, versus the effort of fixing all 5,168 issues? Thanks, sm

    | Metropolis
    0

  • Hello Mozzers, one of my client's sites is "domain.co.uk", and they are looking to rank in the USA with the same domain. They are looking to change host (for unrelated reasons), and I think it may be beneficial for them to get hosting in the USA. Essentially the business is moving to the USA, but they want to retain their domain name, as they cannot get their hands on a .com / .net / .org domain containing their company name. I know that the .co.uk domain will adversely affect click-through rates in the States, but there seems to be no way around this if they want to retain the company name as their domain name. Would American-based hosting help them rank better for searches from the USA, or is the benefit negligible? Net66

    | net66
    0

  • When I check keyword rankings directly in Google or Bing, I find relatively high rankings. When my client checks at his location, he gets another, much lower ranking. What accounts for the difference?

    | TexasBebe
    0

  • Howdy Mozzers! If I do a 301 redirect from a domain that has, say, 200 linking root domains to a fresh domain, will the fresh domain (when updated) then have a linking root domain count of 200? Also, is it beneficial or detrimental to 301-redirect an unrelated website, e.g. a garden hose website, to a children's playground equipment website in order to capture the link juice? Best

    | clickfactory
    0

  • I help develop an online shopping cart, and after a request from management about some products not showing up in the SERPs, I was able to pinpoint it mostly to a duplicate content issue. It's a no-brainer, as sometimes new products are inserted with text copied from the manufacturer's website. I recently stumbled across an odd problem, though. When we partially rewrote the content to seem unique enough, it seemed to remedy the issue for some keywords and not others. A) If you search the company name, our category listing shows as #1, ahead of the manufacturer's website. We always did rank for this term. B) If you search the product name, our product page is listed #3, behind two other listings which belong to the manufacturer. C) If you search the keywords together as "company product", we are still being filtered out as duplicate content. When I allow the filtered results to show, we are ranking #4. It's been a full month since the changes were indexed. Before I rewrite the content even further, I thought I would ask to see if anyone has any insight as to what could be happening.

    | moondog604
    0

  • My company is based in Brighton. We run courses in London. If you search 'london business writing' in Google UK, you get this: http://i39.tinypic.com/35me3qs.jpg Lolwut. Google is placing a link for a map to our Brighton offices beneath the second result. For a London-related keyword that links to a page for our London courses that contains an address for our London venue. We are registered on Google maps as being based in Brighton; we also have a map of our Brighton office on our contact page. But obviously, this is not relevant to this search. How do I get rid of this map for this keyword?

    | JacobFunnell
    0

  • Working on a site for a dentist.  They have a long list of services that they want us to flesh out with text.  They provided a bullet list of services, we're trying to get 1 to 2 paragraphs of text for each. Obviously, we're not going to write this off the top of our heads.  We're pulling text from other sources and trying to rework. The question is, how much rephrasing do we have to do to avoid a duplicate content penalty?  Do we make sure there are changes per paragraph, sentence, or phrase? Thanks! Eric

    | ericmccarty
    0

  • Hi there, today I had the following question from a client: why are my pages only listed with a title (but no snippet) in the search results? What could be the possible reason? Can anyone help me answer him? Thanks

    | nyanainc
    0

  • Hello again. After posting the problem below, I received this answer and changed the sitemap name. I still see many duplicate titles and metas, as Google still compares the old URLs to the new ones and sees duplicate titles and descriptions. We have redirected all pages properly, we have changed the sitemap name, and the new sitemap is listed in Webmaster Tools; the old sitemap includes ONLY the new sitemap files. The answer I received: "When you deleted the old sitemap and created a new one, did you use the same sitemap XML filename? They will still try to crawl old URLs that were in your previous sitemap (even if they aren't listed in the new one) until they receive a 404 response from the original sitemap." If anyone can give me an idea why, after 3 months, Google still lists the old URLs, I'd be more than happy. Thanks a lot. The original post: Hello, we have changed the CMS for our multiple-language website and redirected all old URLs properly to the new CMS, which is working just fine.
    Right after the first crawl, almost 4 weeks ago, we saw in Google Webmaster Tools and SEOmoz that, for almost every single page, Google indexes the old URL as well as the new one, and reports duplicate meta tags for them.
    We deleted the old sitemap and uploaded the new one, thinking that Google would then stop indexing the old URLs, but we still see a huge number of duplicate meta tags. Does anyone know what else we can do so that Google no longer indexes the old URLs, only the new ones? Thanks so much, Michelle

    | Tit
    0

  • I'm getting over 400 crawl errors for duplicate content, looking like this: http://www.mydomain.com/index.php?task=login&prevpage=http%3A%2F%2Fwww.mydomain.com%2Ftag%2Fmahjon http://www.mydomain.com/index.php?task=login&prevpage=http%3A%2F%2Fwww.mydomain.com%2Findex.php%3F etc., etc. So there seems to be something with my login script that is not working. Does anyone know how to fix this? Thanks

    | stanken
    0
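Assuming the login URLs should simply stay out of the index, one sketch is a robots.txt prefix rule (Disallow matches URL prefixes, so the literal ? works here); fixing the login link so it doesn't generate a unique URL per referring page would address the root cause.

```text
User-agent: *
# Block every login URL, whatever the prevpage parameter contains
Disallow: /index.php?task=login
```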

  • We recently (2 weeks ago) deleted all the nofollow values on our website, but the number of pages indexed by Google is the same as before. Do you have any explanation for this? Website: www.probikeshop.fr

    | Probikeshop
    0

  • Currently, for every page, I automatically append my brand name, i.e. "product xxx - brand name", "product yyy - brand name". Is this considered good or bad practice?

    | AsafY
    0

  • How hard is it to grab expired domains? I have my eye on a domain that is expiring in 3 days, but I don't think it's quite that simple. Doesn't it go through months of waiting to become available? Is there an easy way to grab domains that are set to expire? Are the services that offer this any good? And who do you guys recommend?

    | applesofgold
    0

  • One of my clients is looking to start a new company, and they are thinking of SEO right from the get-go. While this is great for me, there are a few issues that I have never really encountered before. For instance, my client knows that she will be expanding into a different city in the future but wants to generate local traffic to start with. She will initially start with CITY-A before moving to CITY-B one year later. Which of the following would be a better solution?
    1) Target CITY-A on the root domain for one year, build links and grow the site for CITY-A, then in one year create two subdomains targeting CITY-A and CITY-B (i.e. CITY-A.companyname.com and CITY-B.companyname.com), and make the root domain a generic company site with no mention of location (or mentions of both locations).
    2) Create the two subdomains now, begin with CITY-A.companyname.com, and have the root domain be a general overview of the company and our services without being location-specific.
    3) Create the root domain (companyname.com) targeting CITY-A, keep it targeting the initial city, then create a subdomain in a year to target CITY-B.
    I keep going back and forth between these solutions and seem to have hit a mental block. What are your thoughts? Any other ideas are more than welcome! Thanks, Net66

    | net66
    0

  • Hello everybody, I have the following errors after my first crawl. Duplicate page content:
    http://www.peruviansoul.com
    http://www.peruviansoul.com/
    http://www.peruviansoul.com/index.php?id=2
    Duplicate page title:
    http://www.peruviansoul.com
    http://www.peruviansoul.com/
    http://www.peruviansoul.com/index.php?id=2
    Do you think I could fix them by redirecting to http://www.peruviansoul.com with a couple of 301s in the .htaccess file? Thank you all for your help. Gustavo

    | peruviansoul
    0
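Yes, in principle. A hypothetical .htaccess sketch for the parameterised duplicate: 301 index.php?id=2 to the root. (Note that http://www.peruviansoul.com and http://www.peruviansoul.com/ are the same URL once a browser or crawler requests them; crawl tools sometimes report the bare and slashed forms separately.)

```apache
RewriteEngine On
# Only when the query string is exactly id=2...
RewriteCond %{QUERY_STRING} ^id=2$
# ...301 index.php to the homepage; the trailing ? drops the query string
RewriteRule ^index\.php$ http://www.peruviansoul.com/? [R=301,L]
```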

  • If you do a search for my own company name or products we sell, the inner pages rank higher than the homepage, and if you search for exact content from my homepage, the homepage doesn't show in the results. My homepage does show when you do a site: search, so I'm not sure what is causing this.

    | deciph22
    0

  • We host a few Japanese sites, and Japanese fonts tend to look a bit scruffy at larger sizes. I was wondering if image replacement for the H1 is risky or not? E.g., in short, spiders see "Some header text optimized for SEO", then in the CSS:
    h1 { text-indent: -9999px; }
    h1.header_1 { background: url(/images/bg_h1.jpg) no-repeat 0 0; }
    We are considering this technique, but I thought I should get some advice before potentially jeopardising anything, especially as we are dealing with one of the most important on-page elements. In my opinion any attempt to hide text could be seen as keyword stuffing; is it a case that it is acceptable in moderation? Cheers

    | -Al-
    0

  • A site we are working on currently gives no indication of the subfolders in the URL. E.g. the site uses: www.examplesite.com/brand-name Rather than: www.examplesite.com/popular-products/brand-name There are breadcrumbs on the site to show users what part of the site they are in and how they navigated there. We are building a new site and have to decide which route to take: since the site is already performing relatively well in the SERPs and the URLs are nice and short this way, is it a good idea to keep them like this, or is it better for usability to include the subfolders? This post suggests that we would be best off keeping the URLs as they are, particularly since less would be changed: http://www.seomoz.org/blog/should-i-change-my-urls-for-seo Thanks in advance for your opinions! Liz @lizstraws

    | oneresult
    0

  • Our client has two almost identical sites targeting Australia (www.mysite.com.au) and the rest of the world (www.mysite.com). Currently they have a splash page on www.mysite.com asking users to select either Australia or Rest of World (which redirects to www.mysite.com/home). I'm thinking they should get rid of the splash page and simply auto-detect whether a user on www.mysite.com is based in Australia, then serve a message: "Do you want to visit our .com.au site?" It's not helped by the fact that the .com site appears to be served ahead of .com.au in Australia, as both sites are hosted in the US. Looking to change this! Thanks in advance for your help!

    | steermoz7
    0

  • I manually listed my site in a few hundred free directories, two paid directories (JoeAnt, $40, and DirMania, $12), and 50 directories that require a reciprocal link (I paid for a cheap service that gets around having to do the reciprocal). I made the big mistake of using the same, or very nearly the same, title and description for all of these... is this a huge problem? Should I have my site removed from the free directories, or just let it go? I've since stopped focusing on directories and am considering saving up to get into the Yahoo directory. I'm now working on getting legitimate and relevant links from .edu sites.

    | eugenecomputergeeks
    0

  • I have a fairly large FAQ section and every article has a "print" button. Unfortunately, this is creating a page for every article which is muddying up the index - especially on my own site using Google Custom Search. Can you recommend a way to block this from happening? Example Article: http://www.knottyboy.com/lore/idx.php/11/183/Maintenance-of-Mature-Locks-6-months-/article/How-do-I-get-sand-out-of-my-dreads.html Example "Print" page: http://www.knottyboy.com/lore/article.php?id=052&action=print

    | dreadmichael
    0
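Since the print URLs are ordinary pages with an action=print parameter, one sketch is to emit a robots meta tag only when the print template renders, which should keep those copies out of Google's index (and therefore out of Google Custom Search results):

```html
<!-- Output only on the article.php?...&action=print template -->
<meta name="robots" content="noindex, follow">
```

A robots.txt Disallow on the print URL pattern is an alternative, but noindex actively removes pages that are already indexed, rather than just preventing future crawls.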

  • Hi, I have been wondering for a few weeks whether the order of keyword usage in a long-tail keyword makes a difference. Today I ran an on-page report here for a new page, which is a review of a product. The report warned about the keyword usage in the URL, which made me question my knowledge of this. Let's say the page is titled "Razer Mouse Review" and my URL is www.example.com/review/razer-mouse. I thought it was a bad idea to repeat the same word in a URL; that's why I categorized all my reviews under the review directory and avoided using the word "review" more than once. Should I modify this URL and make it www.example.com/review/razer-mouse-review? Note: I see the report listed this under "moderate importance factors" and still gave the page an A grade. Any ideas appreciated!

    | Gamer07
    0

  • Our site, IrishCentral.com, has been experiencing issues with GA since 6:00 AM ET on 3/15. Our "real-time" analytics within the new GA interface have been fine, and no changes have been made to the site code at all. I'm wondering if anyone else is experiencing these issues and if there is a resolution. We are fine without them as long as we know that the aggregation of the data is delayed and not forgotten. We are reaching 1 million uniques this month, and it would be a shame to lose this data. Any help is greatly appreciated. Joe

    | Irishcentral
    1

  • Hi, let's say you have a cleaning company with a services page covering window cleaning, carpet cleaning, etc., and the content on this page adds up to around 750 words. Now let's say you would like to create new pages targeting location-specific keywords in your area. The easiest way would be to copy the services page and just change all the tags to the location-specific term, but now you have duplicate content. If I wanted to target 10 locations, does this mean I need to generate 750 words of unique content for each page, which is basically the services page rewritten? Cheers

    | activitysuper
    0

  • Hi guys, maybe a weird question, but how would you advise using Schema.org for product listings, or, if you prefer, on a subcategory page with products listed in it? Thanks, Walid

    | walidalsaqqaf
    0
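Schema.org markup can be applied per product wherever the product appears, including on a subcategory listing page: each listing gets its own itemscope. A hypothetical microdata sketch with placeholder values:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```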

  • We are currently looking at a site for a client where, instead of featuring a standard file structure, every page is buried two folders deep by the CMS. So the homepage is: www.domain.com.au/folder/folder And a subpage is: www.domain.com.au/folder/folder/subpage Is this necessarily an SEO problem? Would it be positive for rankings to pull out the two redundant folders? Any insights are appreciated! Cheers

    | MarketingResults
    0

  • Hi there, I know many people might ask this kind of question, but nevertheless... 🙂 In our CMS, one single URL (http://www.careers4women.de/news/artikel/206/) has been produced nearly 9,000 times with strings like this: http://www.careers4women.de/news/artikel/206/$12203/$12204/$12204/ and this: http://www.careers4women.de/news/artikel/206/$12203/$12204/$12205/ and so on and so on... Today, I asked our IT department to either a) delete the pages with the "strange" URLs or b) redirect them via 301 to the "original" page. Do you think this was the best solution? What about implementing rel=canonical on these pages? Right now, only the "original" page is in the Google index, but who knows? And I don't want users on our site to see these URLs, so I thought deleting them (they have existed for only a few days!) would be the best answer... Do you agree, or do you have other ideas in case something like this happens next time? Thanks in advance...

    | accessKellyOCG
    0

  • Hello, is there a way to allow a certain child directory in robots.txt but keep all others blocked? For instance, we've got external links pointing to /user/password/, but we're blocking everything under /user/.  And there are too many /user/somethings/ to just block every one BUT /user/password/. I hope that makes sense... Thanks!

    | poolguy
    0
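It does make sense, and major crawlers support exactly this: an Allow rule for the one child directory alongside the Disallow on the parent. A sketch for the paths described:

```text
User-agent: *
# The more specific Allow wins for /user/password/...
Allow: /user/password/
# ...while everything else under /user/ stays blocked
Disallow: /user/
```

Googlebot resolves Allow/Disallow conflicts by the most specific (longest) matching rule, but listing Allow first also keeps simpler first-match parsers happy.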

  • I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example: http://www.discoverafrica.com/js/global_functions.js?v=1.1
    http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
    http://www.discoverafrica.com/js/global.js?v=1.2
    http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
    http://www.discoverafrica.com/js/json2.js?v=1.1 Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy javascript hacks. We're just trying to power our content and UX elegantly with javascript. What do you guys say: Obey Matt? Or run the javascript gauntlet?

    | AndreVanKets
    0
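If you do decide to run the gauntlet, the robots.txt side is one line; the trade-off is exactly the one the linked video warns about, since Google then cannot see how the JavaScript shapes the page.

```text
User-agent: *
# Keep crawlers out of the versioned JS files entirely
Disallow: /js/
```

An alternative that avoids the 404 noise without blocking anything: keep the legacy ?v= URLs serving the current file, since the version suffix is only a cache-buster.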

  • I have 4 pages for a single product. Each of the pages links to the main page for that product. Google is indexing the secondary pages above my preferred landing page. How do I fix this?

    | Bucky
    0

  • What precautions should be taken when redesigning a website? Does it affect link building? I am planning to redesign my website; most of my keywords already rank in Google, and I have built many backlinks to the site. After redesigning my website, will those rankings be affected? Kindly answer my question.

    | PrasanthMohanachandran
    0
