
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • I've seen that for some domains Google will show a nice clickable site hierarchy in place of the actual URL of a search result. See attached for an example. How do I go about achieving this type of result? (See the markup sketch below.) categorized.png

    | Carlito-256961
    0
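
    The clickable hierarchy is Google's breadcrumb rich snippet, which Google can build from breadcrumb markup on the page (it can also infer one from a clean URL structure). A minimal sketch in the current schema.org microdata form; the site, paths, and names are hypothetical:

      <ol itemscope itemtype="https://schema.org/BreadcrumbList">
        <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
          <a itemprop="item" href="http://www.example.com/books/">
            <span itemprop="name">Books</span></a>
          <meta itemprop="position" content="1" />
        </li>
        <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
          <a itemprop="item" href="http://www.example.com/books/fiction/">
            <span itemprop="name">Fiction</span></a>
          <meta itemprop="position" content="2" />
        </li>
      </ol>

    Markup makes the hierarchy explicit, but Google still decides whether to display it.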

  • It seems that .html pages do better for the long tail...

    | DavidS-282061
    0

  • Hi, just a question. On one campaign, for example Vegas Hotel (vegashotel.com.au), our Google ranking for **Clubs King Cross** is ZERO on Google AU, when really it's result 2 on Google AU. There are tons of other keywords the same 😞 Please let me know why. Thanks MOZ kiddies 😄 Ray

    | kayweb2
    0

  • Hello, does anyone have experience with Amazon S3? Can hosting images, site elements, and static files (PDFs etc.) in an Amazon S3 bucket influence link structure / juice (by having external links to the Amazon S3 bucket / account)? If you host all the site images and resources on Amazon S3, will all those "links" count as outgoing external links? Is it worse than having everything linking internally? Does anyone know if that can hurt you? Personally I was not able to test it and I am sure I can't test it in the near future. Hope someone has, so I can skip some steps with this. Thanks!

    | eyepaq
    1

  • My DMOZ description is not as keyword-rich as my site's normal description. Is there an advantage or disadvantage to either? If so, how do I prevent Google from using the DMOZ description? (See the sketch below.)

    | DavidS-282061
    0
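
    At the time, the documented way to stop Google substituting the DMOZ/ODP description was the NOODP robots meta tag in the page head:

      <meta name="robots" content="noodp">

    Whether the swap helps or hurts depends on which text earns more clicks; the tag simply returns control to your own meta description.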

  • Hello all. We have an e-commerce website with approximately 3,000 products. Many of the products are displayed in multiple categories, which in turn generates a different URL for each! 😞 Across the entire site I have noticed that the product pages are always outranked by competitors who have lower page authority, domain authority, total links, etc. I am convinced this is down to duplicate content issues. I understand there is no direct penalty, but how would this affect our rankings? Is PageRank split between all the duplicates, which in turn lowers each one's ranking potential? I have looked for a way to identify duplicate content using Google Analytics but I've been unsuccessful. If duplicate content is the issue and PageRank is divided, am I best using canonical tags or 301 redirects? (See the canonical sketch below.) Sorry if this is an obvious question, but if I'm correct we could see a huge improvement in rankings across the board. Wow! Cheers Todd

    | toddyC
    0
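
    If duplicate product URLs are the issue, a rel=canonical on every category-specific variant pointing at one preferred product URL consolidates the signals without breaking category navigation; 301s are better reserved for URLs that should stop resolving entirely. A minimal sketch with hypothetical paths:

      <!-- served identically on /category-a/widget/ and /category-b/widget/ -->
      <link rel="canonical" href="http://www.example.com/widget/" />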

  • We are using some branded terms in URLs that we have recently been told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs; should we still do 301 redirects for those pages after they are renamed? In other words, are there other serious considerations besides the loss of traffic from direct clicks on those broken URLs? This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it? (See the sketch below.)

    | PGRob
    0
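
    Beyond direct clicks, 301s preserve the link equity of any external links to the old URLs and keep them from turning into 404s in the index. If the site runs on Apache, the rules are small enough that the cost should be low; a sketch with hypothetical paths:

      # .htaccess: map each renamed URL to its replacement
      Redirect 301 /brandname-widgets/ /widgets/
      Redirect 301 /brandname-guide.html /guide.html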

  • I was wondering if there is a trick to getting Google to crawl my website daily?

    | labradoodlelocator
    0

  • Due to some historic difficulties with our URL Rewriter, we are in the position of having the root of our site 301 redirected to another page. So the root of our site: http://www.propertylive.co.uk/ has a 301 redirect to: http://www.propertylive.co.uk/home.aspx We're aware that this isn't great and we're working to fix this completely, but what impact will this have on our SEO?

    | LianWard86
    0

  • This question is for MichaelC, who was helping me with a previous question that is now closed. Please refer to my question with the subject "Double 301 Redirect". It was about redirecting /home.aspx to simply "/", because that is an old URL and we have some backlinks pointing to it. If the best I can do is redirect "/home.aspx" to something like "#hm", would that work, since everything after the hash symbol is ignored? (See the sketch below.) Thanks, Clint

    | poolguy
    0
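
    A fragment is never sent to the server, so redirecting to "/#hm" would effectively land visitors (and crawlers) on "/" anyway; the cleaner fix is to remove the old root-to-/home.aspx rule and then 301 /home.aspx straight to "/". A sketch for an ASP.NET site, assuming the IIS URL Rewrite module is available (rule name hypothetical):

      <system.webServer>
        <rewrite>
          <rules>
            <rule name="HomeToRoot" stopProcessing="true">
              <match url="^home\.aspx$" />
              <action type="Redirect" url="/" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>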

  • I bought supplies recently at barcodesinc.com.  While searching I noticed it is clearly the same site as barcodediscount.com.  How do they not get hurt by duplicate content?

    | jotham2
    0

  • The website for my London-based plumbing company has thousands of specifically tailored pages for the various services we provide to all the areas in London, approximately 6,000 pages in total. When Google has all these pages indexed, we tend to get a fair bit of traffic, as they cater pretty well for long-tail searches. However, every once in a while Google will drop the vast majority of our indexed pages from the SERPs for a few days or weeks at a time - for example, at the moment Google is only indexing 613, whereas last week it was back at the normal ~6,000. Why does this happen? We of course lose a lot of organic traffic when these pages aren't displayed. What are we doing wrong? (See the sitemap sketch below.) Website: www.pgs-plumbers.co.uk

    | guy_andrews
    0
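
    One way to help Google discover and re-crawl a large long-tail page set consistently is an XML sitemap submitted through Webmaster Tools; a minimal sketch (URL and date hypothetical):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.co.uk/area/service/</loc>
          <lastmod>2011-02-01</lastmod>
        </url>
        <!-- one <url> entry per indexable page -->
      </urlset>

    Large swings in index counts can also reflect Google treating near-identical area pages as thin, so differentiating their content matters as much as the sitemap.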

  • Looking for tools that can visualise a site's architecture (ideally automated). Also looking for tools that can visualise internal linking structures.

    | Motionlab
    0

  • I'm curious what correlations or impacting variables SEO professionals have found that have increased or decreased ranking with the most recent algorithm change. It appears that many innocent sites have fallen victim, especially larger sites. It also appears that Google is maintaining that specific sites were not targeted... Meaning there must be proven characteristics.

    | douglaskarr
    0

  • Working on a forum site that has multiple versions of the URL indexed. The www version is a top 3 to 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-www version has 3,400 pages indexed in Google, and the www version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google. The duplicate content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Get rid of the subdomain version and occasionally link between two obviously related sites, or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword on the front page of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages? Thanks.

    | Hondaspeder
    0

  • Hi, we are building a new microsite that will live on promo.domain.com. The site will run a promo for about 5 days and will be changed for a new promo after 5 days. Considering that www.domain.com has high authority (67) and is well indexed in search engines, should I try to optimise this site for keywords such as promo "keyword", rebate "keyword", cheap "keyword", even if the site will be optimized for those keywords for only 5 days? We are already running PPC campaigns on these keywords, but I am wondering if Google will have time to rank us in the top 10 results in those 5 days, or if I am wasting my time. My other option is to leave the TITLE of this site always the same, like Groupon does, and focus on very generic keywords. Which option do you think is best?

    | Adviso
    0

  • I have mysite.net and mysite.com. They are both the same age; however, we always had the mysite.com address forward to the mysite.net address. The mysite.net address was our main address forever. We recently reversed that, made mysite.com the main address, and now just have mysite.net forward to mysite.com. I'm wondering if this change will affect our rankings, since a lot of the backlinks we've acquired actually point to mysite.net and not mysite.com (our new main address)?

    | B24Group
    0

  • I don't know if these two scenarios are any different as far as SEO is concerned, but I wanted to ask to get an opinion. On my website, http://www.rainchainsdirect.com, there is a top menu with "About", "Info", "Questions", etc. Some of these links lead to further pages that are essentially indexes of multiple further links. My question is: in terms of SEO, is it better to A) have all links (that are now on the pages the menu links lead to) displayed in a drop-down menu directly from the top menu (bypassing an intermediate page), or B) have it as it is now, where you have to click through to an intermediate page (like "rain chain info") to get access to the links (and not have such a large drop-down menu)? Is there a difference in terms of SEO? In terms of usability it almost seems like a toss-up between the two, so if there were better SEO value to one or the other, then I would choose that one. By the way, I know that the way it is structured now is strange, where there is only one drop-down that leads to the same page as the top menu item, but that will be fixed, FYI. Thanks!

    | csblev
    0

  • I am thinking of adding subdomains to get better rankings for local searches, so I will develop city-specific sites in the local language. For example, qatar.wisnetsol.com would be in Arabic. Will my good standing and ranking on Google for wisnetsol.com help my subdomain rank better? If we set up wisnetsol.com/qatar instead, how can it target Qatar in Google Webmaster Tools? Will links for qatar.wisnetsol.com and wisnetsol.com be counted separately? What do you think about this strategy? Is it good or bad?

    | Khuram
    0

  • Hi Mozzers. I have a client that has made a bit of a mess rewriting the URLs of its site. The database-driven URLs are rewritten, but the dev forgot to replace spaces with "-", so now 95% of the URLs look like this one: http://www.portalesardegna.com/search/Appartamenti e Residence/ Obviously not a pretty URL. I am not sure whether this issue has SEO consequences (in fact, the site ranks pretty well even with those kinds of URLs); I am thinking more of the usability issue. Could you suggest an easy fix for this rewrite problem? (See the sketch below.)

    | gfiorelli1
    2
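
    Assuming the site runs on Apache with mod_rewrite, one stopgap while the rewriter itself is fixed is to 301 space-containing paths to hyphenated ones. This sketch swaps one whitespace run per pass and fires again until none remain, so URLs with several spaces redirect through a short chain:

      RewriteEngine On
      # Replace the last whitespace run in the path with a hyphen, then 301
      RewriteRule ^(.*)\s+(.*)$ /$1-$2 [R=301,L]

    The real fix is still to change the rewriter to emit hyphens and redirect the old URLs to the new form.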

  • Hello, you suggest http://www.tynt.com/publisher-tools/copy-and-paste-to-share-content/ in your pro tips, but I wonder why you are not using it for your own blog under seomoz.org. I noticed that when somebody copies and pastes something from my blog, a strange code is added to the link. Example: http://www.janik.cc/webdesigner-blog/2011/02/sample/#ixzz1FQvadWNV Is the #ixzz1FQvadWNV maybe not that good from an SEO point of view?

    | MichaelJanik
    0

  • I deal primarily with small businesses in the construction and maintenance industries and I'm looking for some advice. Traditionally in these categories you find absolutely awful websites with the attributes below, all ranking 1-5 in the SERPs: weak/limited content, no blog, awful title tags, minimal backlinks, poor on-site optimization, etc. In most cases I assume these sites have been indexed for the last ten years, and that is why they are retaining such high rankings. My issue is trying to compete with them! So far I have worked to have my client's website submitted to and accepted by all of their competitors' backlink sources (which didn't take as long as you might think). I've also listed my client in several of the top paid and free directories (DMOZ, JoeAnt, etc.) and still I see limited results. Most frustratingly, my client's website, according to SEOmoz, is currently leading the domain authority race in every category. Does anyone have a suggestion as to what is going on?

    | calin_daniel
    0

  • Hey guys, first of all, a big thanks to SEOmoz and the community. I've been an avid reader for about a year now and have seen some great improvements. I'm always focusing my efforts on strategies that work well for my niche, although I've come across one of my competitors that doesn't seem to have much going for him. His root domain is ****E and the URL where (seemingly) spammed links point to is ***. If you do a site: command he has 1000+ pages, although most consist of empty "events calendar" pages. Also, I ran some of his content through Copyscape and there seem to be multiple versions of it throughout the web. After all this, he ranks very well for money keywords in our niche, although his on-page is horrible, so there are many opportunities I've capitalized on. Is there something I'm missing? I'm trying to find the value in his website, but it's not very clear to me, since his backlink profile is (seemingly) junk and his on-page goes against all I was told to implement.

    | reegs
    0

  • Hey guys, I'm not too sure if I'm over-thinking this, but I've seen no-follow being used with SEOmoz and I'm looking to implement this  myself.  Most of my links point to my root domain (yes I'm working on building links to deep pages) so would it make sense to 'limit' or 'no-follow' links on my root domain so that only the most important pages are being passed link juice? Thanks

    | reegs
    0

  • A very active news website has a very active blog as a subdomain. For all good SEO reasons, they want to move the blog from the subdomain to a subdirectory. Now, the issue is this blog is very active, updated daily, and lots of people comment on it. So how should the website go about making this change while causing minimal SEO loss and ensuring that the commenting process remains active and ongoing? A detailed answer would be much appreciated! Thanks! (See the sketch below.) I came across this blog post that covers the reverse move, from a subdirectory to a subdomain: http://devilsworkshop.org/moving-a-wordpress-blog-from-a-subdirectory-to-subdomain-preserving-permalinks/. Any words on this would also be appreciated!

    | RishadShaikh59
    0
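
    In WordPress, comments are attached to their posts, so they move with the content; the migration work is re-homing the posts and then 301-redirecting every subdomain URL to its subdirectory twin so permalinks, rankings, and returning commenters carry over. An Apache sketch with hypothetical hostnames:

      # .htaccess served for the old blog subdomain
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
      RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]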

  • Hi all, I'm new to SEO and excited to see the launch of this forum. I've searched for an answer to this question but haven't been able to find one. I "attended" two webinars recently regarding SEO. The subject above was raised in each one, and the speakers gave polar opposite recommendations. So I'm completely at a loss as to what to do with some domains that are related to a domain used on a live website whose SEO I'm working to improve. The scenario: a live website at (fictitious) www.digital-slr-camera-company.com. I also have 2 related domain names which are parked with the registrar: www.dslr.com and www.digitalslr.com. The question: is there any SEO benefit to be gained by pointing the two parked domains at the live website? If so, what method of "pointing" should be used? Thanks for any and all input.

    | Technical_Contact
    0

  • I've just been asked this question and didn't have a great answer; what a great reason to try the new Q&A! We have a .com domain based in the UK, but we'd like to optimise for Australian searches. Are there any tips about useful practices to carry out on the site to highlight its relevance for Australian users?

    | eazytiger
    0

  • Hi, if the same content is placed on different URLs for the purposes of providing information on different channels (i.e. mobile), or has been translated into a different language (but is still the same content), do the search engines still count this as duplicate content, and will a canonical URL have to be tagged in these instances? (See the sketch below.) Thanks in advance for your assistance.

    | jimmyseo
    1
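
    Translated pages are generally not treated as duplicates of each other, and the relationship can be made explicit with hreflang annotations; for a separate mobile URL, one documented pattern pairs a media alternate on the desktop page with a canonical on the mobile page. A sketch with hypothetical URLs:

      <!-- on the desktop page -->
      <link rel="alternate" hreflang="de" href="http://www.example.com/de/seite/" />
      <link rel="alternate" media="only screen and (max-width: 640px)"
            href="http://m.example.com/page/" />

      <!-- on the mobile page -->
      <link rel="canonical" href="http://www.example.com/page/" />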

  • This is just a quickie: on one of my campaigns in SEOmoz I have 151 duplicate page content issues! Ouch! On analysis, the site in question has duplicated every URL with "en", e.g. http://www.domainname.com/en/Fashion/Mulberry/SpringSummer-2010/ http://www.domainname.com/Fashion/Mulberry/SpringSummer-2010/ Personally my thought is that a rel=canonical will sort this issue, but before I ask our dev team to add this (and get various excuses why they can't) I wanted to double-check that I am correct in my thinking. (See the sketch below.) Thanks in advance for your time.

    | Yozzer
    0
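
    rel=canonical on the /en/ versions would consolidate the signals, and if the /en/ URLs have no reason to resolve at all, a 301 is the more complete fix. An Apache sketch of the redirect option (domain hypothetical):

      RewriteEngine On
      # Collapse the duplicate /en/ tree onto the bare paths
      RewriteRule ^en/(.*)$ /$1 [R=301,L]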

  • In PHP, I want to store a session variable based upon a link that's clicked, and I want to avoid query strings on pages that have content. My current workaround is to have a link with query strings point to a PHP file that does nothing but grab the variables via $_GET, store them into $_SESSION, and then redirect. For example, consider this script, which I use to force the mobile version, accessed via something like a href="forcemobile.php?url=(the current filename)":

      <?php
      session_start();

      // Location of the vertstudios files on your localhost. Include the trailing slash.
      $loc = "http://localhost/web/vertstudios/";

      // If the GET variable is not defined, this page is being accessed directly,
      // so force the 404 page. Same case if the mobile session variable is not defined.
      if (!(isset($_GET["url"]) && isset($_SESSION["mobile"]))) {
          header("Location: http://www.vertstudios.com/404.php");
          exit();
      }

      // Grab the URL.
      $url = $_GET["url"];

      // Set the mobile session to true, and redirect to the specified URL.
      $_SESSION["mobile"] = true;
      header("Location: " . $loc . $url);
      ?>

    Will this circumvent the issue caused by using query strings?

    | JoeQuery
    0

  • Hi, I have an issue that may be solvable with a canonical tag, but I am not sure yet. We are developing a page full of statistics, like this: www.url.com/stats/ It's filled with hundreds of stats, so users can come and select only the stats they want to see and share with their friends, and it becomes like a new page with their selected stats: www.url.com/stats/?id=mystats The problem I see with this: all of these pages will have part of the content from the main page, and many of them will be exactly the same, so: duplicate content. My idea was to add a canonical tag pointing to "www.url.com/stats/" on all of these pages, similar to how Rand does it here: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps But I am not sure about this solution, because the content is not exactly the same: a filtered page will only have part of the content that the main page has, and in some cases just a very small part. Is the canonical tag useful in this case? Thank you!

    | andresgmontero
    0

  • I know it is not a good idea to have duplicate titles across a website, as Google does not like this. Is it OK to have duplicate titles on pages that aren't being optimised with SERPs in mind, or could this have a negative effect on the pages that are being optimised?

    | iSenseWebSolutions
    0

  • Hello, I recently launched my new site (Nov. 25, 2011) but still have the old site live, because I still need old customer data from the old admin for customer service issues and I cannot delete the old front-end without deleting the old back-end! I am seeing a lot of referrals coming from the old site's IP address, with many backlinks to the new site, but I don't know if this is actually hurting the new site due to duplicate content, etc. Any input would be greatly appreciated 😉 Thanks in advance, Byron

    | k9byron
    0

  • Most of my work now involves converting older websites to CMS-based sites (in WordPress) and I'm wondering about best practices here. If I create a "dev" or "sandbox" directory for my development work, how do I keep the pages from being indexed while I am working on the new site? Can I "noindex" a directory? And what do I do with the old HTML files when the new site goes live? I'm assuming I will do a 301 redirect from domain.com/index.html to the new domain.com/, and also on all of the inner pages that have equivalent pages in the new site. But there will be a lot of old files left that have no equal in the new site. Do I just delete these, or noindex/nofollow them? (See the sketch below.)

    | bvalentine
    0
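
    Assuming Apache, an X-Robots-Tag header on the dev directory keeps it out of the index outright (a robots.txt Disallow only blocks crawling; a blocked URL can still be indexed from links), and leftover HTML files can be 301'd to the closest new equivalent rather than deleted. A sketch with hypothetical paths:

      # /dev/.htaccess - keep the sandbox unindexed (requires mod_headers)
      Header set X-Robots-Tag "noindex, nofollow"

      # live-site .htaccess - map old files to their nearest new pages
      Redirect 301 /index.html /
      Redirect 301 /services.html /services/

    Old files with no equivalent can simply be allowed to 404; noindex/nofollow just leaves dead weight in place.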

  • Having been a member of the SEOmoz Pro tools for only a couple of months, I'm now at a point where there are certain issues with our recently overhauled site. On my latest Open Site Explorer report I am seeing a number of external links going to http://domainname.com and a number pointing at http://www.domainname.com. This only appears when I pull the report for the root domain; if I pull a report for the subdomain, all URLs are the same. Does this matter much? Would best practice be to put a rel=canonical on the non-www version? (See the sketch below.) Thanks for any help in advance. Sean

    | Yozzer
    0
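
    A rel=canonical helps, but the stronger fix is a site-wide 301 so links, crawling, and reporting all consolidate on one hostname. An Apache sketch, domain hypothetical:

      RewriteEngine On
      # Send every non-www request to the www hostname
      RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
      RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]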

  • Do I have to redirect "/" on the domain by default? My root domain is e.g. petra.at, which I 301 redirect to www.petra.at. Do I have to do that with petra.at/ and www.petra.at/ too?

    | petrakraft
    0

  • Hey guys, we have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox' sort of thing. That was fine; we dealt with that. Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10 for just over a month. On January 13th, 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company name search, although we do come up when searching for our domain name in Google, and we are being cached regularly. Before we dropped off the rankings in January, we did make some semi-major changes to our site: changing meta descriptions, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz alerted us that our disclaimer pages were duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central. We have fixed the duplicate content side of things now, we have continued to link build, and we have been adding content regularly. Do you think the duplicate content (for over 13,000 pages) could have triggered a loss in rankings, or do you think it's something else? We have since reworked page meta descriptions and some subpage titles and descriptions. We also fixed HTML errors flagged in Google Webmaster Central and SEOmoz. The only other reason I think we could have been penalized is a link exchange script on our site, where people could add our link to their site and theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!

    | bigtimeseo
    0
