Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi guys, I have a (newbie) question. Until recently I didn't have my robots.txt written properly, so Google indexed around 1,900 pages of my site, but only 380 are real pages; the rest are all /tag/ or /comment/ pages from my blog. I have now set up the sitemap and robots.txt properly, but how can I get the other pages out of Google? Is there a trick, or will it just take a little time for Google to drop the pages? Thanks! Ramon
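
    A minimal robots.txt sketch for the situation described, assuming the unwanted pages really do live under /tag/ and /comment/ (paths taken from the question; adjust to the blog's actual structure):

        User-agent: *
        Disallow: /tag/
        Disallow: /comment/

    One caveat: Disallow only stops future crawling. Pages already in the index tend to drop out faster if they return a noindex meta tag (which requires letting Google crawl them) or are submitted through Webmaster Tools' URL removal tool; otherwise it simply takes time.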

    | DennisForte
    0

  • Hi everyone. Two questions regarding keyword domains (e.g. "widgets.com"). If we have to choose a domain with an extra word, does it make a difference whether the added word comes before or after? E.g. "my-widgets.com" vs "widgets-now.com". And does it make a difference if the extra word is generic vs a 'real' word? E.g. "my-widgets.com" vs "japanese-widgets.com". Thanks a lot for your feedback!

    | hectorpn
    0

  • Hi. So, I have a term that I would bounce around #3 to #5 for. I made a page some months ago that is solely targeted to that term. And, voila! Google sees the new page as the best result instead of the home page, and the new page ranks, but at #9 or #10. Of course the homepage is a stronger page, but the new page is better targeted to that term. Using the handy SEOmoz toolbar, the homepage has a page authority of 59 and the newer interior page has 32. Both are equally functional for my purpose. Part of me just wants to 301 the interior page to the homepage and forget about it. It would take forever to get that interior page up to a similar page authority through the magic of links. What do you think I should do? I feel like I'm channeling Wile E. Coyote. Thanks! Best...Mike

    | 94501
    0

  • Hi, I'm having some confusion with the rel=canonical tag. A few months ago we implemented it because many errors, specifically duplicate page content, came up in the SEOmoz web app (mostly because we use tracking code), and I was advised by the web app to implement the rel=canonical tag. However, when I'm working in the Keyword Optimizer Tool, it always flags that I'm using the rel=canonical tag improperly, and when I go into our site's CMS for that page and uncheck "Use Canonical URL", the tool raises my grade as if I've made an improvement. So my question is: if the page I'm working on is the one I want search engines to find, should I not be using the canonical URL tag? Should the canonical URL tag only be used on URLs with the tracking code?
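
    For reference, a sketch of the usual pattern for tracking parameters: the tracked variant declares the clean URL as canonical (the URLs below are hypothetical):

        <!-- On http://www.example.com/widgets?utm_source=newsletter -->
        <link rel="canonical" href="http://www.example.com/widgets" />

    On the clean page itself, a self-referencing canonical (or no tag at all) is harmless; the tag only causes trouble when it points somewhere other than the page you want ranked.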

    | aircyclemegan
    0

  • Hi. First, to be upfront: these are not my websites; I'm asking because they are trying to compete in my niche. Here are the details, then the questions... There is a website that is a few months old with about 200 indexed pages and about 20 links; call this newsite.com. There is a website that is a few years old with over 10,000 indexed pages and over 20,000 links; call this oldsite.com. newsite.com acquired oldsite.com and set a 301 redirect so every page of oldsite.com is redirected to the front page of newsite.com. newsite.com and oldsite.com are on the same topic, and the 301 occurred in the past week. Now oldsite.com is out of the SERPs and newsite.com is pretty much ranking in the same spot (top 10) for the main term. Here are my questions: 1. The 10,000 pages on oldsite.com had plenty of internal links; those pages no longer exist, so I imagine when the dust settles it will be like oldsite.com is a one-page site that redirects to newsite.com... How long will the ranking boost last? 2. With the redirect set up to completely ignore the structure and content of oldsite.com, it's clear to me that it was set up to pass the 'link juice' from oldsite.com to newsite.com... Do the major SEs see this as a form of spam (manipulating the rankings), or do they see it as a good way to combine two or more websites? 3. Does this work? Is everybody doing it? Should I be doing it? ...or are there better ways for me to combat this type of competition (e.g. we could make a lot of great content for the money spent buying oldsite.com, but we certainly wouldn't get such an immediate increase in traffic)?

    | RR500
    0

  • We have a site with certain navigational links that exist solely for the human user. These links help the user experience and lead to pages we don't need crawled by Googlebot. We have these links in JavaScript, so if you disable JavaScript the links are invisible. Will these links be considered cloaking, even though our intention is not to cloak but to save our Google crawl budget for pages we do want indexed?
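
    If the worry is crawl budget rather than hiding content, one lower-risk sketch is to leave the links in plain HTML and block the destination pages instead (the /member-tools/ path is hypothetical):

        <!-- Link visible to users and bots alike -->
        <a href="/member-tools/">Member tools</a>

        # robots.txt keeps the destination out of the crawl
        User-agent: *
        Disallow: /member-tools/

    This keeps the rendered page and the crawled page identical, which is the core of what cloaking rules care about.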

    | CruiseControl
    0

  • So here is the problem... We have set up a 301 redirect for our client's website. When you search the client's name it comes up with the old .co.uk website. We have made this redirect to the new .com website. However, in the SERPs, when it shows the .co.uk it shows the old page titles, which currently say 'Holding Page'. When you click on that link it takes you to the fully functioning .com website. My question is: will the title tags in the SERPs which show the .co.uk update to the new ones from the .com? I'm thinking it will just be a case of Google catching up on things and it will sort itself out eventually. If anyone could help I would REALLY appreciate it. Thanks Chris

    | Weerdboil
    0

  • Hello everybody, my question focuses on special parameters in URLs. I am working for a website that uses a lot of URL-encoded entities in its URLs. For instance: www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1.html I am about to write 301 redirect rules for all these URLs to clean ones, i.e. www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1
    would become:
    www.mydomain.com/mykeyword1-mykeyword.html I just wanted to know if anybody has already done this kind of "cleanup", and whether I could expect a positive boost or not. Thanks
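
    A sketch of the redirect rule, assuming Apache with mod_rewrite and URLs shaped like the example above; mod_rewrite matches against the decoded path, so the %2C arrives as a literal comma:

        RewriteEngine On
        # 301 anything with an encoded-comma suffix to the clean slug
        RewriteRule ^([^,]+),.*\.html$ /$1.html [R=301,L]

    A ranking boost is hard to promise; the safer claim is that clean URLs consolidate whatever links the encoded variants have attracted, provided each old URL 301s to exactly one clean equivalent.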

    | objectif-mars
    0

  • Hi there, I would be very grateful if you could provide me with an explanation of the following so I understand them better; what do these headings mean? Domain Authority (out of 100), Domain MozRank, Domain MozTrust, Total Links, Ext. Followed Links, Linking Root Domains, Followed Linking Root Domains, Linking C-Blocks. Thanks very much guys, much appreciated. Thanks Gareth

    | GAZ09
    0

  • Hello, I'm a little confused on this matter. From a website architecture and content standpoint, what is the difference between a category page and a content page? So let's say I was going to build a website around tea. My home page would be about tea. My category pages would be: White Tea, Black Tea, Oolong Tea and British Tea, correct? (I would write content for each of these topics on their respective category pages, correct?) Then suppose I wrote articles on organic white tea, white tea recipes, how to brew white tea, etc... (Are these content pages?) Do I then link FROM my category page (White Tea) TO my content pages (i.e. organic white tea, white tea recipes, etc.), or do I link from my content pages to my category page? I hope this makes sense. Thanks, Bill

    | wparlaman
    0

  • I run a baseball site (http://www.mopupduty.com) that is in a very good link neighbourhood: ESPN, The Score, USA Today, MSG Network, The Toronto Star, Baseball Prospectus, etc. New content has not been getting indexed on Google ever since the last update. The site has no dup content, 100% original. I can't think of any spammy links; we get organic links day after day. In the past Google has indexed the site in minutes, and it currently has expanded sitelinks within Google search. Bing and Yahoo index the site in minutes. Are there any quick fixes I can make to increase my chances of getting indexed by Google, or should I just keep pumping out content and hope to see a change in the near future?

    | mkoster
    1

  • Hi all, I have a site that has a great deal of duplicate content because my clients list the same content on a few of my competitors' sites. You can see an example of the page here: http://tinyurl.com/62wghs5 As you can see, the search results are on the right. A majority of these results will also appear on my competitors' sites. My homepage does not seem to want to pass link juice to these pages. Is it because of the high level of dup content, or because of the large number of links on the page? Would it be better to hide the content from the results in a nofollowed iframe to reduce the duplicate content's visibility, while at the same time increasing unique content with articles, guides, etc.? Or can the two exist together on a page and still allow link juice to be passed to the site? My PR is 3, but I can't seem to get any of my internal pages (except a couple that appear in my navigation menu) to budge off the PR0 mark, even if they are only one click from the homepage.

    | Mulith
    0

  • So I have this client who's got a killer blogger blog—tons of inbound links, great content, etc. He wants to move it onto his new website. Correct me if I'm wrong, but there isn't a single way to 301 the darn thing. I can do meta refresh and/or JavaScript redirects, but those won't transfer link juice, right? Is there a best practice here? I've considered truncating each post and adding a followed "continue reading…" link, which would of course link to the full post on the client's new site. It would take a while and I'm wondering if it would be worth it, and/or if there are any better ideas out there. Sock it to me.
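
    Blogger doesn't expose server-side 301s to an external domain, so the usual template-level workaround is a meta refresh paired with a JavaScript redirect; a sketch (destination URL hypothetical):

        <meta http-equiv="refresh" content="0;url=http://www.newsite.com/blog/" />
        <script type="text/javascript">
          window.location.href = "http://www.newsite.com/blog/";
        </script>

    Neither is guaranteed to pass equity the way a 301 does, though an instant (0-second) meta refresh is often reported to be treated much like a redirect. The truncate-and-link approach trades that uncertainty for a lot of manual work.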

    | TheEspresseo
    0

  • I'm looking to move 5 of my sites from Hostgator's shared servers to Media Temple's dedicated virtual servers. Anyone have experience with (mt)? I'm planning on adding a few more sites this year and several things they offer are attractive to me: A (virtually) dedicated environment: Faster websites, better user experience, plus I like having some control over my site's resources Scalability: I can add more resources easily (although not super cheap) Unique control panels for each site: More control for my tech savvy clients. Unique IPs for $1 a month: More linkjuice between my related sites. $50/month is a big jump from my $12/month Hostgator account but I'm thinking it will be worth it. Am I on the right track or is this a fool's errand?

    | AaronParrish
    0

  • Noticed that a high-profile site uses a very flat structure for their content. It essentially places most landing pages right under the root domain folder. So a more conventional site might use this structure: www.widgets.com/landing-page-1/ www.widgets.com/landing-page-1/landing-page-2/ www.widgets.com/landing-page-1/landing-page-2/landing-page-3/ The site in question, a successful one, deploys the same content like this: www.widgets.com/landing-page-1/ www.widgets.com/landing-page-2/ www.widgets.com/landing-page-3/ So when you click deeper into the nav options, the clicks always roll up to the "top level." Top-level pages are given more weight by SEs, but conventional directory structures are also seen as ideal. Why would a site take the plunge and organize content in this way? What was the clincher?

    | DisneyFamily
    1

  • We use CSS (absolute positioning) to arrange our content so it's easier to crawl. I am using your On-Page Keyword Optimization tool and other tools to check our pages (i.e. http://www.psprint.com/gallery/invitation-cards) to make sure it works. The On-Page Keyword Optimization tool gives a pretty good grade (I guess it sees the text in the body). However, when I use another tool to test the page (e.g. http://tools.seobook.com/general/spider-test/) it cannot see the text in the body. Did we do something wrong? Thanks Tom

    | tomchu
    0

  • Hi, Operating from the Netherlands with customers throughout Europe, we have the same content for some countries. In the Netherlands and Belgium Dutch is spoken, and in Germany and Switzerland German is spoken; for these countries the same content is provided. Does Google see this as duplicate content? Could it be possible that a German customer gets the Swiss website as a search result when googling on the German Google? Thank you for your assistance! Kind regards, Dennis Overbeek Dennis@acsi.eu
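
    One explicit way to disambiguate same-language sites aimed at different countries is rel="alternate" hreflang annotations, placed on every variant and listing all of them; a sketch with hypothetical domains:

        <link rel="alternate" hreflang="nl-NL" href="http://www.example.nl/" />
        <link rel="alternate" hreflang="nl-BE" href="http://www.example.be/" />
        <link rel="alternate" hreflang="de-DE" href="http://www.example.de/" />
        <link rel="alternate" hreflang="de-CH" href="http://www.example.ch/" />

    Same-language content targeted at different countries is generally not penalized as duplicate content, but without geotargeting signals (hreflang, ccTLDs, Webmaster Tools country targeting) the wrong country's variant can indeed outrank the right one.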

    | SEO_ACSI
    0

  • My site takes double the time per KB compared to my competitors. It is hosted on shared hosting with Godaddy.com. Any ideas why this may be happening?

    | atohad
    0

  • Hi, We have recently made some changes to our agency site. Looking in Webmaster Tools we have identified a number of old pages with existing link juice. Not a great deal, mostly around 32/100 PA. There is a mixture of URLs, "meet the team" and people pages, etc. The anchor text on the majority of pages is our brand name. Could we now 301 all these pages to one page, or is this a no-no in the eyes of Google? Any help greatly appreciated. Best Regards Sean

    | Yozzer
    0

  • I run a blog/directory site. Recently I changed directory software and, as a result, Google is showing 404 Not Found crawl errors for about 750 non-existent pages. Some have suggested that I implement 301 redirects, but I can't see the wisdom in this, as the pages are obscure, unlikely to appear in search, and have been deleted. Is the best course simply to manually enter each 404 error page into the Remove Page option in Webmaster Tools? Will entering deleted pages into the removal area hurt other, healthy pages on my site?
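
    If the pages are gone for good and have no inbound links worth preserving, a 410 Gone is a more direct signal than either a 301 to an irrelevant page or manual removals; with Apache's mod_alias, a one-line sketch (directory name hypothetical):

        # Old directory software's pages are permanently gone
        Redirect gone /old-directory/

    The URL removal tool is intended for urgent cases and won't hurt healthy pages, but 750 manual requests scale poorly; letting Google re-crawl and see the 404/410 usually clears these out on its own.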

    | JSOC
    0

  • Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and as I expected, they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never did), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove pages from the Google index, NOT from the crawl errors, and that if they keep returning 404s they will automatically be removed. Well, I don't know how many times they need to get that 404 in order to drop a URL and link that haven't existed for 18-24 months?!! Thanks.

    | RiceMedia
    0

  • I am working on removing unnecessary meta tags that have little impact on SEO, and I have read so many mixed reviews about using the meta 'cache' tag. I need some solid information on whether or not this tag should be used.

    | ImagetecLP
    0

  • Hi - We're switching eCommerce platforms, and naturally we're worried about losing organic search ranking. From what I've read on the message boards, I understand it's important to minimize 301 redirects as much as possible. Here's my problem: our product URLs look like this (ex: http://www.stupid.com/fun/TOLMG.html). On the new platform, URLs cannot contain capital letters. 😞 According to the new eCommerce platform's design team: "Google and other search engines do not see that as a change in URL; they are not case sensitive and it will not affect search listings." How accurate is this? And how come, on our current platform, if I use an all-lowercase URL, I get a 401? (ex: http://www.stupid.com/fun/tolmg.html) Will we be fine switching our product URLs to lowercase on the new platform? One thing also to note: our category URLs will remain the same. Are there any other areas of a typical eCommerce store where I should avoid changing URLs if I want to prevent SEO loss? Thanks! -Justin
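
    On the case question: URL paths are case-sensitive by spec, and Google does treat /fun/TOLMG.html and /fun/tolmg.html as distinct URLs, which is exactly why the lowercase variant errors on your current platform. If the new platform sits on Apache, a sketch that 301s any uppercase path to its lowercase twin; note that RewriteMap must be defined in the server or vhost config, not .htaccess (the map name is arbitrary):

        RewriteMap lowercase int:tolower
        RewriteEngine On
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule (.*) ${lowercase:$1} [R=301,L]

    With that in place, every old mixed-case product URL resolves with a single 301 to its new lowercase home, which is about the least lossy migration available.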

    | JustinStupid
    0

  • Hi, What's the best/easiest way for a client to grant access to his Google Webmaster Tools to me? Thanks! Best...Michael

    | 94501
    0

  • I just found something weird I can't explain, so maybe you guys can help me out. In Google http://www.google.nl/#hl=nl&q=internet, the number 3 result is a big telecom provider in the Netherlands called Ziggo. The ranking URL is https://www.ziggo.nl/producten/internet/. However, if you click on it you'll be directed to https://www.ziggo.nl/#producten/internet/. HttpFox in FF, however, is not showing any redirects, just a 200 status code. The URL https://www.ziggo.nl/#producten/internet/ contains a hash, so the canonical URL should be https://www.ziggo.nl/. I can understand that. But why is Google showing the title and description of https://www.ziggo.nl/producten/internet/ when the canonical URL clearly is https://www.ziggo.nl/? Can anyone confirm my guess that Google is using the bulk SEO value (link juice/authority) of the homepage at https://www.ziggo.nl/ because of the hash, but using the relevant content of https://www.ziggo.nl/producten/internet/, resulting in a top position for the keyword "internet"?

    | NEWCRAFT
    0

  • Today's sitemap webinar made me think about the disallow feature. It seems the opposite of sitemaps, but it also seems both are somewhat ignored in varying ways by the engines. I don't need help semantically; I got that part. I just can't seem to find a contemporary answer about what should be blocked using the robots.txt file. For example, I have folders containing site comps for clients that I really don't want showing up in the SERPs. Is it better not to have these folders on the domain at all? There are also security issues I've heard of that make sense: simply look at a site's robots.txt file to see what it is hiding, which makes it easier to hunt for files once you know the directory they're in. Should I concern myself with this? Another example is a folder I have for my XML sitemap generator. I imagine Google isn't going to try to index this or count it as content, so do I need to add folders like this to the disallow list?
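
    For client comps you genuinely don't want discovered, access control sidesteps the "robots.txt advertises my hidden folders" problem entirely; a sketch with Apache basic auth (paths hypothetical):

        # .htaccess inside /client-comps/
        AuthType Basic
        AuthName "Private"
        AuthUserFile /home/example/.htpasswd
        Require valid-user

    A folder like a sitemap generator's, by contrast, only needs a Disallow if bots actually request and index things from it; robots.txt is for crawl control, not secrecy.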

    | SpringMountain
    0

  • I'm changing titles and descriptions several times throughout 1-2 months... will Google penalize me? I'm worried that testing several title and description ideas will affect the site's rankings on Google. Has anyone had a problem with that, or can I continue testing as many titles and descriptions as I want?

    | mosaicpro
    0

  • Hello, I want to create a new website and have to choose between: 1) using one of my former domain names, which dates back about 10 years and has a lot of backlinks (PageRank 4) but no anchor text for the new area of activity I want to launch, or 2) purchasing a new domain name containing the two keywords I most want to rank for :P Which method is better for good and quick results? Thank you for your help

    | Moncer
    0

  • I never thought of this question before. Maybe because I didn't focus on creating content, only on optimizing existing content from clients. So how do you measure the content on a specific page?

    | mosaicpro
    0

  • I know this can be an "it depends" answer, so I'll try to explain; qualifications on your answers would be great. I use the WordPress architecture for myself and clients on sites and blogs. Almost every business site we create has a blog, and I'm always working to improve results on them. My strategy has been the following: Categories: general, main content types, general keywords; index, follow. Tags: very specific, post specific, may only be used once for one post. My categories have descriptions that are displayed on the category pages with excerpts. Tags rarely have a description but are displayed with excerpts on the page. My idea has been to index the categories so the content gets crawled, and they have unique content because the category description is shown. Tags shouldn't be indexed because they may be all over the place and may have only 1 post and no tag description. I'm trying to reduce duplicate content, but I don't want to limit results for my clients and myself. Should I set tags to noindex,follow, or should I have them indexed? The only thing making me consider indexing the tags is that I may be able to get additional traffic through the more specific ones (i.e. tag = meta tags, category = SEO).
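
    For what it's worth, the noindex,follow pattern described here is the common one: tag archives stay crawlable and pass link equity but stay out of the index. The tag templates would emit something like this in the head (most WordPress SEO plugins can output it per archive type):

        <meta name="robots" content="noindex,follow" />

    Individual tags that start earning real search traffic can always be flipped back to index later.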

    | JaredDetroit
    0

  • My company will undergo a domain change in the next few months. Other than implementing 301 redirects, what else can I do to help the search engines realize the site has moved? What kind of impact on rankings can I expect to see? Thanks in advance!

    | raylau
    0

  • We're working on a tool using the SEOmoz API... for domains we always get the right values, but for longer URLs we're having trouble... Example: http://www.seomoz.org/blog/6-reasons-why-qa-sites-can-boost-your-seo-in-2011-despite-googles-farmer-update-12160 won't work, while http://www.seomoz.org/blog works. Any idea what we might be doing wrong?

    | gmellak
    0

  • We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com, and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel=canonical, but Google does not always seem to honour it. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin

    | mreeves
    0

  • Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered until now, it was also completely deleted and the credit was not "refunded"; apparently there was some data loss involved with your restructuring. Can you check whether you can still find the last question and answer it quickly? I need the answer 🙂 Here is one more question: I bought a website that has a huge forum with loads of pages of user-generated content, overall around 500,000 threads with 9 million comments. The complete forum was noindex/nofollow when I bought the site, and now I am thinking about the best way to unleash the potential. The current system is vBulletin 3.6.10. a) Shall I first update vBulletin to version 4 and use the vBSEO tool to make the URLs clean and more user- and search-engine-friendly before I switch to index/follow? b) Would you recommend having the forum in the folder structure or on a subdomain? As far as I know a subdomain draws less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum, but it puts my TLD at risk. c) Would you release all forum pages at once or section by section? I think section by section looks rather unnatural not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once. d) Would you index only the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of a thread contain different body content but the same title and possibly the same h1. Looking forward to hearing from you soon! Best Fabian

    | fabiank
    0

  • Quick question: I want some pages that will have canonical tags to show up in the internal results of a Google site search that's built into the site. I'm not finished with the site, but is it correct to assume that pages with canonical tags will NOT show up in internal site search results when powered by Google?

    | EricPacifico
    0

  • My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap indicating that every URL on the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap generators I use for other client sites can be set to use the server's reported modification time or not. What is the best way to generate the sitemap: lastmod from the actual time modified, or everything set to one date and time?
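
    For context, lastmod is optional in the sitemaps protocol, so the honest choices are a real modification time or no value at all; identical timestamps on every URL mostly dilute the signal rather than trigger anything punitive. A sketch of both options (URLs hypothetical):

        <url>
          <loc>http://www.example.com/page.html</loc>
          <lastmod>2011-05-10</lastmod> <!-- actual modification time -->
        </url>
        <url>
          <loc>http://www.example.com/other.html</loc>
          <!-- lastmod omitted when the true date is unknown -->
        </url>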

    | ShaMenz
    0

  • We have just had an outside SEO agency report on our site. One of the things brought up was broken links, and how they classify them. Could anybody tell me whether this statement holds true, please, as I am not aware of this: "Our latest intelligence shows that Google is downgrading rankings for sites that feature 301 redirects within the internal link structure". Any help would be greatly appreciated. Regards

    | Yozzer
    0

  • As most people know, it's common for the main menu to come after the small top-right links in the HTML. My questions are: How does Google tell which link is more important than another when passing juice on? If the top-right links come before the main menu in the HTML, do they get more link juice than the main menu? Should I work on a better HTML structure while keeping the same look (reorder the HTML code but keep the same look through CSS)? Any suggestions?

    | mosaicpro
    0

  • Hello, I am trying for the first time to implement a canonical tag on a page and would really appreciate it if someone could tell me whether this was done correctly. I am trying to canonicalize: -from http://www.diamondtours.com/default.aspx -to http://www.diamondtours.com/ As you will see in the source code of the default.aspx page, the line of code written is: <link rel="canonical" href="http://www.diamondtours.com" /> Is this correct? Any guidance is greatly appreciated. Jeffrey Ferraro

    | JeffFerraro
    0

  • Hello, We rank fairly high for a lot of terms but Google is not indexing our descriptions properly. An example is with "arnold schwarzenegger net worth". http://www.google.ca/search?q=arnold+schwarzenegger+net+worth&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a When we add content, we throw up a placeholder page first. The content gets added with no body content and the page only contains the net worth amount of the celebrity. We then go back through and re-add the descriptions and profile bio shortly after. Will that affect how the pages are getting indexed and is there a way we can get Google to go back to the page and try to index the description so it doesn't just appear as a straight link? Thanks, Alex

    | Anti-Alex
    0

  • I have a smaller site that has a Google penalty; Webmaster Tools says there might be a doorway page. I'm wondering: once the site is fixed, is it necessary to file a reconsideration request, or, if the site is no longer in violation, will the site eventually be included again by itself, without a reconsideration request?

    | serpent2011
    0

  • Hello everyone, My name is Davys and I'm what you call a newbie... so the question may sound stupid, but here we go. In this campaign I will be targeting around 130 to 150 keywords for my store. So here is the technical question: what is the right way of targeting 150 keywords? Should I attempt to have all 150 pointing to www.mysite.com, or should I break it down into smaller pages, if I can put it that way, like www.mysite.com/pages/bikiniwax for example? Even if I break it down, how many keywords should I optimize per page? Or do I make 150 pages? Please help! 🙂 Thanks a lot for your help.

    | Davys
    0

  • I use z-indexing for a floating bar that scrolls vertically along the side of my page.  I'm not hiding anything.  Is this safe or not?

    | BradBorst
    0

  • This may sound like a stupid question; however, it's important that I get this 100% straight. A new client has nearly 6k duplicate page titles/descriptions. To cut a long story short, this is mostly the same page (or rather a set of pages); however, every time Google visits these pages they get a different URL, hence the astronomical number of duplicate page titles and descriptions. Now the easiest way to fix this looks like canonical linking. However, I want to be absolutely 100% sure that Google will then recognise that there is no duplicate content on the site. Ideally I'd like to 301, but the developers say this isn't possible, so I'm really hoping the canonical will do the job. Thanks.

    | RiceMedia
    0

  • I have 2 pages showing as errors in my Crawl Diagnostics, but I have no idea where these pages have come from; they don't exist on my site. I have done a site-wide search for them and they don't appear to be referenced or linked to from anywhere on my site, so where is SEOmoz pulling this info from? The two links are: http://www.adgenerator.co.uk/acessibility.asp http://www.adgenerator.co.uk/reseller-application.asp The first link has a spelling mistake and the second link should have an "s" on the end of "application".

    | IPIM
    0

  • Hi. One week ago I created a blog on WordPress and added the URL of my blog to Google, Bing and Yahoo. In that blog I put a link to my webshop (the site I'm doing SEO for), but when I checked the backlinks of my webshop (with SEOmoz tools and Yahoo Explorer), the link from the blog still doesn't show. How many days does it take for a backlink to be registered? Thanks

    | nipponx
    0

  • OK, ok... the SEOmoz report card told me it's actually better NOT to have the meta keywords tag on my page, because my competitors can then look at my page to see what words I am trying to target... That makes sense, but it is also painfully counterintuitive. I thought I would just double-check and make sure... no meta keywords at all? And if so... what (if anything) should I have in the meta tags?

    | damon1212
    0

  • Let's say you have www.example.com. On this website, you have www.example.com/example-image.jpg. When someone links externally to this image, like below: <a href="www.example.com/example-image.jpg"><img src="www.example.com/example-image.jpg"></a> the external site would be using the image hosted on your site, but the image is also linked back to the same image file on your site. Does this have any value even though the link is back to the image file and not the website? Also, how much value do you guys feel image links have in relation to text links, in terms of passing link juice and adding to a natural link profile? Thanks!

    | qlkasdjfw
    1

  • Hi all, I'm taking over a site that has some redirect issues that need to be addressed, and I want to make sure this is done right the first time. The problem: our current setup starts with us allowing both non-www and www pages. I'll address this with a proper rewrite so all pages will have www. Server info: IIS, and the site runs PHP. The real concern is that we currently run browser language detection at the root and then do a 302 redirect to /en, /ge or /fr. There is no page at www.matchware.com; it's an immediate redirect to a language folder. I'd like to get these to a 301 (permanent) redirect, but I'm not sure if a URL can have a 301 redirect that goes to 3 different locations. The site is huge and a site overhaul is not an option anytime soon. Our home page uses this:

    <%
    lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")
    real_lang = Left(lang, 2)
    'Response.Write real_lang
    Select Case real_lang
        Case "en"
            Response.Redirect "/en"
        Case "fr"
            Response.Redirect "/fr"
        Case "de"
            Response.Redirect "/ge"
        Case Else
            Response.Redirect "/en"
    End Select
    %>

    Here is a header response test:

    HTTP Request Header
    Connect to 87.54.60.174 on port 80 ... ok
    GET / HTTP/1.1
    Host: www.matchware.com
    Connection: close
    User-Agent: Web-sniffer/1.0.37 (+http://web-sniffer.net/)
    Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7
    Cache-Control: no-cache
    Accept-Language: de,en;q=0.7,en-us;q=0.3
    Referer: http://web-sniffer.net/

    HTTP Response Header
    Status: HTTP/1.1 302 Object moved
    Connection: close
    Date: Fri, 13 May 2011 14:28:30 GMT
    Server: Microsoft-IIS/6.0
    X-Powered-By: ASP.NET
    Location: /ge
    Content-Length: 124
    Content-Type: text/html
    Set-Cookie: ASPSESSIONIDQSRBQACT=HABMIHACEMGHEHLLNJPMNGFJ; path=/
    Cache-control: private

    Content (0.12 KiB):
    <title>Object moved</title>
    Object Moved
    This object may be found <a href="/ge">here</a>.

    To sum it up, I know a 302 is a bad option, but I don't know if a 301 is a real option for us since it can redirect to 1 of 3 pages. Any suggestions?
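
    A single URL can return a 301 whose Location varies by Accept-Language; in classic ASP that means setting the status before the Location header. A sketch of the same select with a permanent status (the Vary header is an addition worth considering so caches don't pin one language):

        <%
        lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")
        real_lang = Left(lang, 2)
        Response.Status = "301 Moved Permanently"
        Response.AddHeader "Vary", "Accept-Language"  ' hint that the answer differs per language
        Select Case real_lang
            Case "fr"
                Response.AddHeader "Location", "/fr"
            Case "de"
                Response.AddHeader "Location", "/ge"
            Case Else
                Response.AddHeader "Location", "/en"
        End Select
        Response.End
        %>

    One caution: Googlebot typically sends no Accept-Language header, so it would always land on /en, which is one reason some sites keep the root redirect as a 302 and make sure each language folder is directly linked and indexable.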

    | vheilman
    1

  • We need to move our domain from one account to another at our domain registrar (which is Moniker). Both the "from" account and "to" account will be at Moniker. The "from" account currently has privacy settings enabled, and we'd also put these in place for the "to" account. Has anyone done this and seen any impact on SEO? Are there any big or common mistakes that I should be aware of? Thanks all!

    | evoNick
    0
