
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • I'm redesigning a client's site, breaking the content up into separate pages and moving the site to a managed server while keeping the same domain name. Will I lose any link juice or keyword rankings for the site even if I 301 redirect the old pages to the new pages? (A minimal redirect sketch follows below.)

    | sirmarkthomas
    0
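
    A minimal .htaccess sketch for the question above, assuming an Apache server; the old file names and new paths are placeholders, not the client's actual URLs:

        # Map each old page to its new, more specific page with a permanent (301) redirect.
        Redirect 301 /old-services.html /services/web-design/
        Redirect 301 /old-about.html    /company/about-us/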

  • Forgive me for the novice question, but I was recently looking at Open Site Explorer and checking out my site www.visualawards.com. I know we have a redirect to www.visualawards.com/home.php. After checking both URLs, I found that I have links pointing to both. Is this bad, am I diluting the links? If so, which one should I point future links to, and is there any way to recover the current links? (A consolidation sketch follows below.) Thanks again for your help!

    | RENDEV
    0
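
    A hedged sketch of one way to consolidate the two addresses above, assuming Apache with mod_rewrite and assuming the existing root-to-/home.php redirect is removed at the same time (otherwise this would loop):

        RewriteEngine On
        # Send /home.php permanently to the site root so all links resolve to one page.
        RewriteRule ^home\.php$ http://www.visualawards.com/ [R=301,L]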

  • I was recently looking at a blog post here, or a webinar, that showed a website where you could see all of the local sites (Yelp, Google Places) where your business has been submitted. It was an automated tool. Does anyone remember the name of the site?

    | beeneeb
    0

  • I just have a quick query, and I have a feeling about what the answer is, so I wanted to see what you guys thought. Basically, I am working on a client site. This client has a few other websites that are divisions of their company, but these divisions/websites are no longer used. They want to delete the websites but redirect the domains to their main website. They believe this will pass on SEO benefit, as these old division sites have good PR and history. I'm not sure for definite; which way is correct? (A domain-redirect sketch follows below.)

    | Weerdboil
    0
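
    A hedged sketch of the domain-level redirect being considered above, assuming Apache; the host names are placeholders for the retired division site and the main company site:

        RewriteEngine On
        # Forward every request on the retired division domain, path included, to the main site.
        RewriteCond %{HTTP_HOST} ^(www\.)?old-division\.com$ [NC]
        RewriteRule ^(.*)$ http://www.main-company-site.com/$1 [R=301,L]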

  • Hello, I am brand new to SEO and I'm learning on the go every day. I am having issues with Google and getting any sort of ranking, analysis, or even just traffic reports. I understand the site has never really been optimized, so it might genuinely not have any reports. So basically my real question is: what helpful tricks or hints do you guys have that I can implement? Anything and everything helps. So far I have run the crawl diagnostics and I'm working on fixing the errors. Thanks for your help.

    | Future13
    0

  • I'm trying to find the best tool to check for broken links on our site. We have over 11k pages and I'm looking for something fast and thorough! I've tried Xenu and LinkChecker. Any other ideas?

    | CIEEwebTeam
    0

  • When our Canadian users search on google.ca for our brand (e.g. Travelocity, Travelocity hotels, etc.), the first few results are from our US site (travelocity.com) rather than our Canadian site (travelocity.ca). In Google Webmaster Tools, we've adjusted the geotargeting settings to focus on the appropriate locale, but the wrong country TLD is still coming up at the top via google.ca. What's the best way to ensure our Canadian site comes up instead of the US site on google.ca? (A markup sketch follows below.) Thanks, Tory Smith
    Travelocity

    | travelocitysearch
    0
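
    A hedged markup sketch for the question above: rel="alternate" hreflang annotations on equivalent US and Canadian pages are one commonly suggested complement to Webmaster Tools geotargeting. The URLs are illustrative, not Travelocity's actual pages:

        <link rel="alternate" hreflang="en-us" href="http://www.travelocity.com/" />
        <link rel="alternate" hreflang="en-ca" href="http://www.travelocity.ca/" />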

  • I have looked around and only found older, contradicting responses to this question: what effect does having a domain like VALUABLE-KEYWORD.com forward to MAINSITE.com, or COMMON-MISSPELLING.com forward to MAINSITE.com, have in terms of SEO? And is it considered spammy or looked down upon?

    | treytt
    0

  • Okay, so I'm working on my site, minding my own business, and everything is just great. Many different terms on pages 1-4, all inching ever upward. Then suddenly, one term that got all the way to #11 gets blasted to oblivion... #75! Is there some kind of landmine one could step on that would cause that? All the other terms are fine. Nothing on the page has changed since it first went up two months ago. Thanks!

    | 94501
    1

  • Hi, our site was originally created with a very flat folder structure: most of the pages are at the top level. Because we will be adding more content, I want to tidy up the structure first, and I just wanted to check the best way to go about it. Is it best to: 1) first configure all the new 301 redirects to point to the new pages, while leaving the actual links on our site pointing to the old pages, then change the links on the site after a few weeks; or 2) configure the redirects and change the actual links on my website to the new locations at the same time? My thinking is that if I go with option 1, I give Google a chance to process all the redirects and update the locations in its index before I start pointing internal links at the new locations. But does it make any difference? What is the best way to go about making this sort of change to minimize any loss in rankings, PageRank, etc.? (A redirect sketch follows below.) Thanks for the help.

    | Maximise
    0
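
    A hedged sketch of the redirect side of the move above, assuming Apache: a single pattern rule can map a whole group of flat, top-level pages into a new folder rather than listing each page one by one. The path pattern is a placeholder, not the site's real structure:

        RewriteEngine On
        # e.g. /blue-widget-guide.html  ->  /guides/blue-widget-guide/
        RewriteRule ^([a-z0-9-]+)-guide\.html$ /guides/$1-guide/ [R=301,L]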

  • Hello, here's something interesting. I'm using Rank Tracker from SEOmoz and Link-Assistant's Rank Tracker as well. I need to track google.com and google.co.ve (Venezuela), so I did, and here are my results for my keyword. 1. Keyword A at google.com (United States): SEOmoz Rank Tracker = position 6; the other Rank Tracker = position 6; manual query on google.com = position 9 (I used the exact URL SEOmoz tells me it's using). 2. Keyword A at google.co.ve: SEOmoz Rank Tracker = position 8; the other Rank Tracker = position 7; manual query on google.co.ve = position 8. So why is that? So far I think that google.com for me down here (it actually says "Español") is a different index, perhaps for Latin America, only Spanish pages? Maybe it's because there's a couple of minutes between looking with one tool and the other... Any help would be great. Dan

    | daniel.alvarez
    0

  • Okay, so yesterday I asked a question about setting up custom error pages in IIS 6.0 to properly do a 24-hour 503 Service Temporarily Unavailable, with no answer (not to the despair of the community, as the question has no simple or easy answer 🙂). So after a night of dreaming about solutions 🙂 I realized that we have the ability to just clone the site, so it would basically become a redundant server or mirror site for 24 hours. With all that being said, the question is: what SEO pitfalls might I encounter from this, if any? (A 503 sketch follows below.) I suspect none, as load balancing and redundancy are a fact of life in the web world, especially since it will be a max of 24 hours of downtime for maintenance.

    | Jinx14678
    0
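
    The original question was about IIS 6.0, which isn't quoted in the thread, so purely as an illustration of the 503-plus-Retry-After pattern being discussed, here is a hedged Apache .htaccess sketch (assumes a reasonably recent Apache with mod_headers; the maintenance page name and retry interval are placeholders):

        RewriteEngine On
        # Answer every request except the maintenance page itself with a 503.
        RewriteCond %{REQUEST_URI} !=/maintenance.html
        RewriteRule ^ - [R=503,L]
        ErrorDocument 503 /maintenance.html
        # Tell crawlers roughly when to come back (24 hours, in seconds).
        Header always set Retry-After "86400"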

  • I currently have a site that, when run through a sitemap tool (Screaming Frog or Xenu), returns a 404 error on a number of pages. The pages are indexed in Google, and when visited they 301 to the correct page. Why would the sitemap tool be giving me a different result? Is it not reading the pages correctly?

    | EAOM
    0

  • My site is www.optionmonster.com. It has a few different areas of "free" and "premium" content. 1. We have a First Click Free program for Google visitors, where they can read the first news article in full. 2. Other visitors, and subsequent news article views, get a content preview, which includes the headline and first paragraph of content, and then asks for a name and email address in order to read the rest of the free news. 3. We have premium (paid) content, and all of those folders are currently in our robots.txt file. My questions are as follows: a. How much content is needed in front of the "pay/reg wall" in order to still get proper juice from linkbacks/PageRank crawls? b. Should I make the premium content a robots meta noindex tag, or, because you can't see the article at all without logging in, is robots.txt the proper place to include it? (Sketches of both options follow below.)

    | Yun
    0
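
    Hedged sketches of the two options weighed in question (b) above; the /premium/ folder name is a placeholder. First the robots.txt route, which blocks crawling of the premium folders entirely:

        User-agent: *
        Disallow: /premium/

    And the meta robots route, placed in the head of each premium page, which lets the URL be crawled but keeps it out of the index:

        <meta name="robots" content="noindex, follow">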

  • As per Google, the May update takes care of all content-scraping sites. Then why is this site, http://www.viduba.com, still ranking well? All of its videos are hotlinked from YouTube.

    | krishru
    0

  • I have a couple of sites using 3dcart, the ecommerce platform.  Their tech support recently told me that they do not list sub-categories in the XML sitemap, only products and top-tier categories. Am I the only one that sees a problem with this? Thanks

    | poolguy
    0

  • Hey Guys, I'm wondering which URL is preferable when targeting the keyword phrase "ski goggles" a) http://www.evo.com/shop/ski/ski-goggles.aspx  or b) http://www.evo.com/shop/ski/goggles.aspx URL a includes the keyword phrase exactly with a dash but also repeats the word "ski" and feels redundant.  Any research/ testing to support either case? Thanks a bunch. Will

    | evoNick
    0

  • Hello there, we have a lot of PDFs that seem to end up on other websites. I was wondering if there was a way to make sure that our website gets the credit/authority as the original creator. Besides linking directly from the PDF copy to our pages, is anyone aware of a strategy for letting Google know that we are the original publishers? I know search engines can index HTML versions of PDFs, so is there any way to get them to index a rel="canonical" tag as well? (A header-based sketch follows below.) Thoughts/ideas?

    | Tektronix
    0
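
    A PDF can't carry a rel="canonical" tag in its markup, but the same hint can be sent as an HTTP Link header. A hedged Apache .htaccess sketch, assuming mod_headers is available; the file name and canonical URL are placeholders:

        <Files "whitepaper.pdf">
          # Point the PDF's canonical at the HTML page it was published from.
          Header set Link "<http://www.example.com/whitepaper.html>; rel=\"canonical\""
        </Files>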

  • Hi, I'd like to do something, but first I'd like to get some opinions from SEOmoz. My question: I have a domain, brandtrends.com, and I'd like to move it to trends.brand.com, because brandtrends.com sits at position #15, on the second page of the SERPs, for my "brand" keyword, and all inbound links point at brandtrends.com. What do you think: if I 301 brandtrends.com permanently to trends.brand.com, will it rank under brand.com on the first page of the SERPs? That is what I'm after: trends.brand.com ranking alongside brand.com on the results page. I also have the backlinks of brandtrends.com to hand; should I leave the existing inbound links pointing at brandtrends.com and build the new ones to trends.brand.com, or should I change the existing inbound links from brandtrends.com to trends.brand.com? Hope you got it! Thanks

    | leadsprofi
    0

  • Hey Mozzers, I've moved a domain with a 301 redirect to a new site. After around 2-3 weeks the SEOmoz toolbar says I have PA & DA of 33, which were both 1 when it was a brand-new domain. Does Google credit the same kind of value to the new website that SEOmoz reflects in that PA & DA of 33?

    | mosaicpro
    0

  • We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. the title "Masterpet | New Zealand Products" appears on MasterPet product page 1 and product page 2. Because the list of products is spread across several pages, the crawler detects that these URLs have the same title. From 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or should we even bother fixing it? Will our SERP ranking suffer because of this? (A pagination markup sketch follows below.) Hoping someone could shed some light on this issue. Thanks.

    | Peter.Huxley59
    0
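
    A hedged markup sketch for the paginated product lists above: rel="next"/"prev" tags in the head (shown here for page 2 of a series) are one common way to signal that the pages belong together, and pairing that with page numbers in the titles avoids identical titles. URLs are placeholders:

        <link rel="prev" href="http://www.example.com/products?page=1" />
        <link rel="next" href="http://www.example.com/products?page=3" />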

  • What is the exact difference of "Page Level Keyword Usage" and "Page Level Keyword Agnostic Features"?

    | petrakraft
    0

  • I'm working on implementing structured data properties on my product detail pages (http://schema.org/Book). My site sells books, and many books have both a 13-digit ISBN and a 10-digit ISBN. Should I apply the itemprop "isbn" to both of them, or just the one with higher search volume? Some books also have multiple authors; how should I handle that? (A microdata sketch follows below.)

    | myork0724
    0
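
    A hedged http://schema.org/Book microdata sketch for the question above. Microdata allows a property to repeat, so multiple authors are simply repeated itemprops; whether listing both ISBN forms is the right call is exactly what's being asked, but this is what the markup would look like. Title, authors, and numbers are placeholders:

        <div itemscope itemtype="http://schema.org/Book">
          <span itemprop="name">Example Book Title</span>
          by <span itemprop="author">First Author</span> and <span itemprop="author">Second Author</span>
          <meta itemprop="isbn" content="978-0-000-00000-0">
          <meta itemprop="isbn" content="0-000-00000-0">
        </div>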

  • I have an established blog that we update on a daily basis. In the past, when I published a new post, it would get indexed within a minute or so, but for the past month or so it's been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:

        User-agent: *
        Disallow: /cgi-bin
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content/plugins
        Disallow: /wp-content/cache
        Disallow: /wp-content/themes
        Disallow: /wp-login.php
        Disallow: /*wp-login.php*
        Disallow: /trackback
        Disallow: /feed
        Disallow: /comments
        Disallow: /author
        Disallow: /category
        Disallow: */trackback
        Disallow: */feed
        Disallow: */comments
        Disallow: /login/
        Disallow: /wget/
        Disallow: /httpd/
        Disallow: /*.php$
        Disallow: /*?*
        Disallow: /*.js$
        Disallow: /*.inc$
        Disallow: /*.css$
        Disallow: /*.gz$
        Disallow: /*.wmv$
        Disallow: /*.cgi$
        Disallow: /*.xhtml$
        Disallow: /*?*
        Disallow: /*?
        Allow: /wp-content/uploads

        User-agent: TechnoratiBot/8.1
        Disallow:

        # ia_archiver
        User-agent: ia_archiver
        Disallow: /

        # disable duggmirror
        User-agent: duggmirror
        Disallow: /

        # allow google image bot to search all images
        User-agent: Googlebot-Image
        Disallow: /wp-includes/
        Allow: /*

        # allow adsense bot on entire site
        User-agent: Mediapartners-Google*
        Disallow:
        Allow: /*

        Sitemap: http://www.domainname.com/sitemap.xml.gz

    The site has tons of backlinks. Just wondering if something is wrong with the robots file or if it could be something else.

    | rookie123
    0

  • In the event that a person uses a service like Blogger, or a photo service like PhotoShelter, but uses a CNAME to resolve example.blogspot.com or example.photoshelter.com to example.com, how does that affect Domain Authority and PageRank in real-world results, and how does it affect the user when/if they leave the service and establish their own site? For example: a client has a blog on Blogger called johndoephotography.blogspot.com but uses a CNAME, so what is shown is johndoephotography.com. The Domain Authority is quite high, since he is really on Blogger's domain. How does that affect SERP rankings? Is it ignored, since it is merely a sub-domain, or does the parent domain actually give a benefit? The second part: if John Doe decides to host his own WordPress blog, what happens to that domain authority? Has he lost it all?

    | WilliamBay
    0

  • Something I've never been a fan of is having a blog as the home page of a site. I've always thought it's a bit like walking into someone's house through the kitchen out back.
    If it's a visitor's first time, it can be a little disconcerting or awkward, even if they are not familiar with the writer's style. But something just dawned on me, and I'd love a second opinion on it. For websites that focus on multiple keywords (in most of my clients' cases it's usually a mix of Wedding Photography, Engagement Photography, Portrait Photography, Family Photography, etc.), a lot of these clients will include the photos in a blog post along with a snippet of text that may talk about the people they're photographing and a bit about where they photographed. But they're usually optimizing for the overarching keyword (Wedding..., Portrait..., etc., as per above). Now I'm wondering if having three or five posts on the home page, where most of them focus on a specific keyword like New York Wedding Photographer, is actually diluting the keyword they are trying to rank for. My theory is that if I have them move their blog to domain.com/blog and focus the home page solely on the desired keyword, they would do substantially better in the SERPs. Can anyone substantiate this? Thanks!

    | WilliamBay
    0

  • First I had a problem with duplicate title errors: almost every page I had was counted twice, because my website linked to both www.funky-lama.com and funky-lama.com. I changed this by adding code to .htaccess to redirect everything to www.funky-lama.com, but now my website has been crawled again and the errors have actually doubled. All my pages now have duplicate title errors because of pairs like www.funky-lama.com/160-confetti-gitaar.html and funky-lama.com/160-confetti-gitaar.html, or www.funky-lama.com/1_present-time and funky-lama.com/1_present-time. (A host-canonicalization sketch follows below.)

    | funkylama
    0
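
    For reference, a hedged sketch of the usual host-canonicalization rule in Apache .htaccess, using the domain from the question; if the existing redirect looks materially different, that may be why the crawler still reaches both hosts:

        RewriteEngine On
        # Send every non-www request to the www host with a single 301.
        RewriteCond %{HTTP_HOST} ^funky-lama\.com$ [NC]
        RewriteRule ^(.*)$ http://www.funky-lama.com/$1 [R=301,L]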

  • Because of hosting problems we're trying to work out, our domain was down all weekend, and we have lost all of our rankings. Does anyone have any experience with this kind of thing, in terms of how long it takes to figure out where you stand once you have the site back up, and what the best SEO strategy is for immediately addressing the problem? Besides just plugging away at getting links as normal, is there anything specific we should do right away when the site goes back up, such as resubmitting a sitemap? Thanks!

    | OneClickVentures
    0

  • We have been asked to look at a website and have found a 301 redirect from the domain www.domain.com to www.domain.com/home.aspx. Why would someone do it this way round? We can't think of a good reason and are wondering if we have overlooked something. Thanks for your help.

    | travelinnovations
    0

  • A sector competitor has decided to link to me using the method javascript:OpenLink('http://www.example.com'). It's a contextually rich page and the link is in the body, surrounded by relevant text, although not very high up in the code. Moz metrics of the page/domain linking to me are: PA 30, mR 2.67, mT 4.61,
    21 links from 9 root domains;
    root domain DA 88, DmR 6.77, DmT 6.68, 2.6m links from 24k domains. Strictly from an SEO perspective, is this method of linking to me: A. Positive,
    B. Neutral, or
    C. Negative? (An illustration of the link markup follows below.) Thanks!

    | PaulGaileyAlburquerque
    0
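
    A hedged illustration of the linking method described above, next to a plain anchor for contrast; the target URL is the placeholder from the question:

        <a href="javascript:OpenLink('http://www.example.com')">crawlers find no real destination in this href</a>
        <a href="http://www.example.com">a normal, followable link for comparison</a>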

  • Greetings! Is there an advantage in no-following links to pages like "Terms Of Use" and "Privacy Policy"... pages one isn't trying to rank for? Of course, the idea would be to not waste link juice on unimportant pages. Your thoughts? Thanks!

    | 94501
    0

  • I am monitoring two domains, a .com and a .com.au, in quite a competitive sector (men's underwear online sales). Last month the .com site completely dropped from sight in Google's international SERPs, but the .com.au site is actually ranking (quite respectably, too) in the international/US Google SERPs. I have no idea how, when or why, but it has now been the case for two months. Help please?

    | Vovia
    0

  • Is it possible to get a subdomain, blog.site.com, that is hosted on Tumblr to count toward site.com? I hoped I could point it in Webmaster Tools like we do with www, but alas no. Any help would be greatly appreciated.

    | oznappies
    0

  • For some reason SEOmoz's crawl tool is returning duplicate-content URLs that don't exist on my website. It is returning pages like "mydomain.com/pages/pages/pages/pages/pages/pricing". Nothing like that exists as a URL on my website. Has anyone experienced something similar, and do you know what's causing it or how I can fix it? (An illustration of one common cause follows below.)

    | MyNet
    0
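
    One common cause of crawler-generated /pages/pages/... URLs, offered as a hedged guess rather than a diagnosis: a relative link that resolves one level deeper on every hop. The hrefs below are placeholders:

        <a href="pages/pricing">resolves relative to whatever folder the crawler is currently in</a>
        <a href="/pages/pricing">always resolves from the site root, so it cannot compound</a>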

  • They have a caching system where they assign multiple ip's based on location and we are curious how it affects SEO.

    | DragonSearch
    1

  • I have been trying to figure this out with different redirects but cannot seem to get it right. Some of our forums link to pages that do not exist or are very old; they have a .htm extension. I do not want to redirect .htm to .php, because the actual names of the pages have changed too. What is the best code to redirect any link that has a .htm extension to the root domain? (A sketch follows below.) Right now I have this code to redirect index.htm to the root, but that is all it works for, I think:

        RewriteCond %{THE_REQUEST} ^.*/index.htm
        RewriteRule ^(.*)index.htm$ http://www.example.com$1 [R=301,L]

    | hfranz
    0
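
    A hedged sketch of the catch-all being asked for above, assuming Apache mod_rewrite and using the same placeholder domain as the question: any request ending in .htm is sent to the root with a 301.

        RewriteEngine On
        # Match any URL path ending in .htm and send it to the home page.
        RewriteRule \.htm$ http://www.example.com/ [R=301,L]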

  • Hi, three weeks ago I wanted to release a new website (made in WordPress), so I neatly created 301 redirects for all the files and folders of my old HTML website and transferred the WordPress site into the index folder. Job well done, I thought, but after a few days my site suddenly disappeared from Google. I read in other Q&As that this can happen, so I waited a little longer, until I finally saw today that a meta robots tag had been added to every page with "noindex, nofollow". For some reason, the WordPress setting "I want to forbid search engines, but allow normal visitors to my website" was selected, although I never even opened that section, called "Privacy". So my question is: will this have a negative impact on my PageRank afterwards? Thanks, Sven

    | Zitana
    0

  • Meaning, if you have a lot of bad quality links (directories, blog comments) that are giving great rankings for some terms (on a homepage of a site), could the low quality of these links negatively affect the crawling frequency of interior pages or perhaps even give interior pages a ranking penalty?

    | qlkasdjfw
    0

  • I've looked at most of the questions in the Q&A that discuss pagination, but didn't find a clear answer to my concern, so here is my question. On the website I work for, we have lists of recipes with this info for each recipe: picture, title, type, difficulty, time and author, with 10 recipes per page and X pages for each list. Would you use rel canonical on page X with the first page as its value (I've seen this answer in one question here)?
    Or would you canonicalize page X to itself, keeping each page of the list in the index?
    And would the content be seen as duplicate if we don't use rel canonical and just add the page number to the title, or would it be unique enough with all the info? (A markup sketch follows below.) Thanks for your help on this!

    | kr0hmy
    0
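
    For clarity, a hedged sketch of the first option described above (page X carrying a canonical that points back at page 1 of the list); whether this is the right choice is exactly what's being asked, and the URL is a placeholder:

        <link rel="canonical" href="http://www.example.com/recipes/" />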

  • Hi all, I have been working in online marketing for a while, predominantly on the affiliate marketing and business development side. Over the last year I have decided to focus my energies on getting to understand SEO, and I have picked up some actionable tips and strategies to put together an SEO strategy for the business I work for, whilst at the same time broadening my skill set in online marketing. I would like to think I now have a good understanding of key areas of the SEO framework such as keyword research, on-page optimisation and link building, and I look to put my learnings into practice as well as continue to experiment through my personal blog. An area that I see as more and more important to fully understand is the technical side: getting a decent grounding in HTML and CSS and putting it into practice. Can anyone recommend a detailed tutorial or PowerPoint that provides insights on how best to understand this side of SEO? Thanks, Simon

    | simonsw
    0

  • Three tools render a site as Googlebot would see it: the SEOmoz toolbar,
    Lynxviewer (http://www.yellowpipe.com/yis/tools/lynx/lynx_viewer.php),
    and Fetch as Googlebot. I have a website where I can see the dropdown menus in a regular browser, in Lynxviewer and in Fetch as Googlebot. However, with the SEOmoz toolbar's 'render as Googlebot' tool, I am unable to see these dropdown menus when I have JavaScript disabled. Does this matter? Which of these tools is the better way to see how Googlebot views your site?

    | qlkasdjfw
    0

  • Google is indexing a secure version of a page (https) that is meant to be served only over http. I don't know of any links to the page using https, but I want to verify that. I only have one secure page on the site, and it does not link to the page in question. What is the easiest way to nail down why Google is using the https version? (A redirect sketch follows below.)

    | TheDude
    0
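
    Whatever the source of the https links turns out to be, here is a hedged Apache .htaccess sketch of one way to fold the secure copy back into the http version, assuming the secure host reads the same .htaccess; the page path and domain are placeholders:

        RewriteEngine On
        # If the request arrived over SSL for this one page, 301 it to the plain-http URL.
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} ^/the-page-in-question\.html$
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]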

  • I'd like to know what I can do to get the correct title tag and meta description that I have on the page at www.myescondidomovers.com/ to actually show up in Google's SERPs. It's currently just showing my main keyword and the domain name, nothing else. See the attached screenshot (SERPS.png). Thanks in advance for your help, much appreciated.

    | afranklin
    0

  • Hello there, my site is basically an Ajax application. We assume lots of people link into deep pages on the site, but bots won't be able to read past the hash marks, meaning all links appear to go to our home page. So we have decided to make our Ajax indexable, and many questions remain. First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude: more to manage. 1. How do others deal with the differences here? 2. If we have indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made. 3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs? So many more questions, but I'll stop there. Curious if anyone here has thoughts (or experience) on the matter. (An illustration of the crawling scheme follows below.) Erin

    | ErinTM
    2
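
    For readers unfamiliar with the scheme referred to above, a brief illustration of Google's Ajax crawling convention; the URLs are examples only:

        # A "pretty" hashbang URL that the application serves to users:
        #   http://www.example.com/#!recipes/chocolate-cake
        # The URL Googlebot requests instead, expecting an HTML snapshot back:
        #   http://www.example.com/?_escaped_fragment_=recipes/chocolate-cake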

  • Hi everyone, a few weeks ago I purchased a .nu TLD. It's a really nice domain name, but I'm wondering if it's possible to rank with this domain in Google. I've heard several times that it doesn't matter to Google which TLD a site has, but I normally never see .nu domains high in the SERPs. What are your opinions on this? Is it possible for me to get this site ranking high in Google with good links and quality content? Thanks!

    | iwebdevnl
    0

  • I have spotted that some countries in South America generate lots of traffic on my site, and I don't want to sell my service there. Can I be penalized for blocking IPs from certain countries? (A blocking sketch follows below.) Thanks!

    | Xopie
    0
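
    For concreteness, a hedged sketch of the kind of server-level block being described, using Apache 2.2-style access rules; the address range is a documentation placeholder, not a real country allocation:

        Order Allow,Deny
        Allow from all
        # Deny one example address range; a real country block would list many such ranges.
        Deny from 203.0.113.0/24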

  • Hi, just wondering if anyone can explain for me why it seems that every tag entered on a WP blog post creates a duplicate page (identified by Roger and friends in the SEOmoz crawl)? Obviously, if you can offer a solution (apart from the extremely obvious "don't use tags") I would be immensely grateful. (One common workaround is sketched below.) Thanks so much,

    | ShaMenz
    0
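
    One common workaround, offered as a hedged sketch rather than a recommendation: keep the tag archives out of the crawl via robots.txt. The path assumes WordPress's default /tag/ base:

        User-agent: *
        Disallow: /tag/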

  • We have a domain which performs well in local search and has good authority and trust, but we are now moving further afield to rank for keywords country-wide. Our current domain contains our local area; does this affect your chances of ranking for broader searches? You don't seem to see many general searches bringing up domains with location keywords in them.

    | DragonsDesign
    0

  • I have a number of 404 error pages showing in Webmaster Tools, and some of the URLs have numbers, % symbols, and some are PDFs. My usual 301 redirect in my .htaccess file does NOT redirect these pages where the URLs have special characters. What am I doing wrong? (A sketch follows below.)

    | BradBorst
    0
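
    A hedged sketch of one way to handle such URLs in Apache, assuming the %-sequences are ordinary encoded characters like %20: mod_rewrite matches against the already-decoded path, so an encoded space can be matched as a plain space inside a quoted pattern. File names are placeholders:

        RewriteEngine On
        # A request for /docs/old%20brochure.pdf arrives here with the %20 decoded to a space.
        RewriteRule "^docs/old brochure\.pdf$" /docs/new-brochure.pdf [R=301,L]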
