
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.

  • My company has an ecommerce website that's been online for about 5 years. The URL is www.betterbraces.com. We're getting ready to launch an Australian version of the website, and the URL will be www.betterbraces.com.au. The Australian website will have the same look as the US website and will contain about 200 of the same products that are featured on the US website. The only major difference between the two websites is the price that is charged for the products. The Australian website will be hosted on the same server as the US website. To ensure Australians don't purchase from the US site, we are going to have a geo redirect in place that sends anyone with an AU IP address to the Australian website. I am concerned that the Australian website is going to have duplicate content issues. However, I'm not sure if the fact that the domains are so similar, coupled with the redirect, will help the search engines understand that these sites are related. I would appreciate any recommendations on how to handle this situation to ensure our rankings in the search engines aren't penalized. Thanks in advance for your help. Alison French

    | djo-283669
    0
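A common way to tell search engines that the two catalogues above are regional variants rather than duplicates is hreflang annotations on each product page (alongside country targeting in Search Console). A minimal sketch, assuming a hypothetical product path that exists under both domains - the path is illustrative, not taken from the post:

    <!-- On the hypothetical page www.betterbraces.com/knee-brace.html -->
    <link rel="alternate" hreflang="en-us" href="https://www.betterbraces.com/knee-brace.html" />
    <link rel="alternate" hreflang="en-au" href="https://www.betterbraces.com.au/knee-brace.html" />
    <link rel="alternate" hreflang="x-default" href="https://www.betterbraces.com/knee-brace.html" />
    <!-- The .com.au copy of the same product carries the identical set of tags,
         so each version references itself and its alternate. -->

With the variants declared this way, the usual guidance is to let visitors choose their store (or prompt them with a banner) rather than hard-redirecting every AU IP address, since an automatic redirect can prevent crawlers from ever seeing one of the versions.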

  • The question is in the title: Should HTML heading tags ALWAYS be in hierarchical order? For example, using them in order: H1, H2, H3, etc. Or is it OK to have H2 tags before the main H1 tag on a page - for example, sidebar content with H2 headings appearing before the main content's H1 tag? Your thoughts are greatly appreciated.

    | Peter264
    0
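For the heading-order question above, here is a minimal sketch of one widely used pattern: the page keeps a single H1 for the main content, and sidebar blocks that appear earlier in the source use lower-level headings. The element names and copy are illustrative only:

    <body>
      <aside>
        <!-- Sidebar comes first in the source but does not outrank the page's H1 -->
        <h2>Related articles</h2>
      </aside>
      <main>
        <h1>Main topic of the page</h1>
        <h2>First subtopic</h2>
        <h3>Detail under the first subtopic</h3>
        <h2>Second subtopic</h2>
      </main>
    </body>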

  • I need to redirect a page/URL that is purely .html to a new location. I don't know how to do this. All the redirects I can find are for server-side code pages (.php, .aspx, etc.). From my understanding, I can't put a server-side redirect in a .html file. I am hosting on a Microsoft server; however, the new page I am redirecting to is .php. I am running some WordPress (.php) files on the server. I need to make it redirect before the old page loads so visitors don't start reading something that is about to be redirected. Can someone please help me?

    | MyNet
    0
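For the plain-.html redirect above: since the post mentions a Microsoft server (presumably IIS), the cleanest option is usually a 301 rule configured at the server level rather than in the file itself. If only the .html file can be edited, the usual fallback is an instant meta refresh in the head, which fires before any of the old content is read. The URLs below are placeholders, not from the post:

    <!-- old-page.html: the redirect lives in the <head>, so it runs before the body renders -->
    <!DOCTYPE html>
    <html>
      <head>
        <meta http-equiv="refresh" content="0; url=https://www.example.com/new-page.php" />
        <link rel="canonical" href="https://www.example.com/new-page.php" />
        <title>This page has moved</title>
      </head>
      <body>
        <p>This page has moved to <a href="https://www.example.com/new-page.php">its new location</a>.</p>
      </body>
    </html>

Search engines generally treat an instant meta refresh much like a permanent redirect, but a true server-side 301 remains the preferred signal when it is available.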

  • Previously our site was using this as our URL structure: www.site.com/page.html. A few months ago we updated our URL structure to this: www.site.com/page, and we're no longer using the .html. I've read over this guide and don't see anywhere that it discusses this: http://www.seomoz.org/learn-seo/redirection. I've currently got a programmer looking into it, but I am always a bit wary of their workarounds, as I'd previously had them cause more problems than they fixed. Here is the solution he is looking to do: "The way that I am doing the redirect is fine. The problem is where to put the code. The issue is that the files are .html files that need to be redirected to the same URL without the .html on them. I can see if I can add that to the 404 redirect page, if there is one in there, and see if that does the trick. That way, if there is no page that exists without the .html, it will still be a 404 page. However, if the page is there, it will work as normal. I will see what I can find and get back." Any help would be greatly appreciated. Thanks, BJ

    | seointern
    0
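Not a substitute for the server-side 301 the programmer is working on, but while the .html URLs still resolve, a stop-gap some sites use is a canonical tag on each .html page pointing at its extensionless twin, so the two versions are not treated as duplicates in the meantime. A sketch using the post's own placeholder URL:

    <!-- Served in the <head> of www.site.com/page.html while it still resolves -->
    <link rel="canonical" href="http://www.site.com/page" />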

  • We recently acquired a company, and now we are going to redirect all of the pages on their site to their respective pages on our site. Do we need to keep the original pages on their site active? For how long? Ideally, we would like to redirect everything and remove the old site entirely so we don't have to pay to keep hosting it. Is this possible? Thanks!

    | pbhatt
    1

  • Hi, I have a site on WordPress and I want to add eCommerce to it. We want to go with Shopify, but Shopify only allows you to host their platform on a subdomain. I'd like to have it in a subdirectory, so my question is: would it make sense to redirect the whole subdomain to a subdirectory (move everything from shop.domain.com to domain.com/shop) for SEO purposes? Would Google see these pages as if they were part of the main domain? Thanks! Julien

    | julienraby
    0

  • I am targeting my website at the US, so I need high organic rankings in US web search. One of my competitors is restricting website access to specific IP addresses or geo locations. I have checked multiple categories to learn more. What's going on with this restriction, and why did they set it up? One SEO forum is also restricting website access to a specific location. I can understand that: it may help them stop thread spamming with unnecessary sign-ups or Q&A. But why has Lamps Plus set this up? Is there any specific reason? Could it improve my organic ranking? A restriction may help me maintain user statistics such as bounce rate, average page views per visit, etc.

    | CommercePundit
    1

  • Hey, I have recently come across an issue with several of a site's URLs being seen as 404s by bots such as Xenu, SEOmoz, Google Webmaster Tools, etc. The funny thing is, the pages exist and display fine. This happens on many of the pages which use the MODX CMS, but the index is fine. The WordPress blog in /blog/ all works fine. The only thing I can think of is that I have a conflict in the .htaccess, but troubleshooting this is difficult; any tools I have found online seem useless. I have tried rolling back to previous versions, but it still does not work. Has anyone had any experience of similar issues? Many thanks, K.

    | Found
    0

  • Today, I was reading about nofollow on Wikipedia. The following statement is over my head and I am not able to understand it properly: "Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page)." That passage is about indexing and ranking of the linked-to page for external links, and I am aware of that part; the page may simply not appear in relevant results for a keyword search on Google. But what about internal links? I have set the rel="nofollow" attribute on a great many internal links. I have an archived blog post from Randfish on the same subject, and read the following question there. Q: Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love? [In 2007] A: Yes - webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages. (Matt's precise words were: "The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robots.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.") Matt has also given an excellent answer to the following question. [In 2011] Q: Should internal links use rel="nofollow"? A: Matt said: "I don't know how to make it more concrete than that." I use nofollow for each internal link that points to an internal page that has the meta name="robots" content="noindex" tag. Why should I waste Googlebot's resources, and those of my server, if in the end the target must not be indexed? As far as I can tell, and for years now, this has not caused any problems at all. For internal page anchors (links with the hash mark in front, like "#top"), the answer is "no", of course. I am still using nofollow attributes on my website. So, what is the current thinking? Is it still necessary to use the nofollow attribute on internal links?

    | CommercePundit
    0
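A minimal sketch of the pattern described in the question above - an internal link marked nofollow pointing at a page that is itself noindexed. The path is hypothetical. Note the caveat in the quoted Matt Cutts material: for Google, nofollow'ed links are simply dropped from the link graph, so the PageRank they would have carried is not redistributed, which is why current guidance generally treats internal nofollow as unnecessary rather than beneficial:

    <!-- On the linking page: an internal link flagged nofollow (hypothetical path) -->
    <a href="/members/login" rel="nofollow">Member login</a>

    <!-- On /members/login itself: keep the target out of the index -->
    <meta name="robots" content="noindex, follow" />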

  • Hi, I have an old WordPress website with about 300-400 original pages of content on it, all relating to my company's industry: travel in Africa. It's a legitimate site with travel stories, photos, advice, etc. Nothing spammy about it. No adverts on it. No affiliates. The site hasn't been updated for a couple of years and we no longer have a need for it. Many of the stories on it are quite out of date. The site has built up a modest mozRank value over the last 5 years, and has a few hundred organically achieved inbound links. Recently I set up a swanky new branded website on ExpressionEngine on a new domain. My intention is to: shut down the old site; focus all attention on building up content on the new website; ask the people linking to the old site to link to my new site instead (I wonder how many will actually do so...); where possible, set up a 301 redirect from pages on the old site to their closest match on the new site; and set up a 301 redirect from the old site's home page to the new site's homepage. Sounds good, right? But there is one issue I need some advice on... The old site has about 100 pages that do not have a good match on the new site. These pages are outdated or of inferior quality, so it doesn't really make sense to rewrite them and put them on the new site. I call these my "black sheep pages". So... for these "black sheep pages", should I (A) redirect the URLs to the new site's homepage, (B) redirect the URLs to the old site's home page (which, in turn, redirects to the new site's homepage), or (C) not redirect the URLs, and let them die a lonely 404 death? OPTION A:
    oldsite.com/page1.php -> newsite.com
    oldsite.com/page2.php -> newsite.com
    oldsite.com/page3.php -> newsite.com
    oldsite.com/page4.php -> newsite.com
    oldsite.com/page5.php -> newsite.com
    oldsite.com -> newsite.com
    OPTION B:
    oldsite.com/page1.php -> oldsite.com
    oldsite.com/page2.php -> oldsite.com
    oldsite.com/page3.php -> oldsite.com
    oldsite.com/page4.php -> oldsite.com
    oldsite.com/page5.php -> oldsite.com
    oldsite.com -> newsite.com
    OPTION C:
    oldsite.com/page1.php : do not redirect, let the page 404 and disappear forever
    oldsite.com/page2.php : do not redirect, let the page 404 and disappear forever
    oldsite.com/page3.php : do not redirect, let the page 404 and disappear forever
    oldsite.com/page4.php : do not redirect, let the page 404 and disappear forever
    oldsite.com/page5.php : do not redirect, let the page 404 and disappear forever
    oldsite.com -> newsite.com
    My intuition tells me that Option A would pass the most "link juice" to my new site, but I am concerned that it could also be seen by Google as a spammy redirect technique. What would you do? Help 😐

    | AndreVanKets
    1
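If Option C is chosen for the black sheep pages, one complement sometimes suggested is a helpful custom 404 page on the old domain, so the few visitors and link-followers who still land on those URLs get pointed at the new site instead of a bare error. A sketch only - the copy and the newsite.com placeholder mirror the post, nothing here is from an actual site; the 404 status itself is set in the server configuration, this is just the body served with it:

    <!-- Custom 404 served by oldsite.com for any URL that is deliberately not redirected -->
    <!DOCTYPE html>
    <html>
      <head>
        <meta name="robots" content="noindex" />
        <title>Page not found</title>
      </head>
      <body>
        <h1>Sorry, this page has been retired</h1>
        <p>Our travel content now lives at <a href="https://newsite.com/">newsite.com</a>.</p>
      </body>
    </html>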

  • I am advising a client who wants to streamline their online customers' experience through the use of cookies. The first time someone visits mysite.com, they will see the normal index page and be asked to identify themselves as a Personal or Business customer, and will be taken through to the relevant page. This will result in a cookie being set. The next time they come back to mysite.com, the cookie will automatically direct them from the index page to mysite.com/personal/ or mysite.com/business/. My question is, what are the SEO implications of this, especially given that the index page is their primary landing page for almost all organic traffic? Bots: I realise that Googlebot etc. do not store cookies, so this should result in no change from the bots' perspective (i.e. no redirect), but is it that simple? In effect, we'll be showing the bot one thing and second-time-plus visitors something else. Is this not effectively cloaking? All advice gratefully received!

    | seomasters
    0
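A minimal client-side sketch of the mechanism described above, using a hypothetical cookie name ("segment") and the /personal/ and /business/ paths from the post. Because the check runs in the visitor's browser and crawlers do not persist cookies between requests, bots would keep seeing the index page, which is exactly the discrepancy the question raises; the usual advice is to keep the index page as the canonical entry point and treat this purely as a returning-visitor convenience:

    <!-- On the mysite.com index page: send returning visitors to the segment they chose -->
    <script>
      var match = document.cookie.match(/(?:^|;\s*)segment=(personal|business)/);
      if (match) {
        // replace() keeps the index page out of the back-button history
        window.location.replace("/" + match[1] + "/");
      }
    </script>

    <!-- On mysite.com/personal/ (and analogously /business/): remember the choice for a year -->
    <script>
      document.cookie = "segment=personal; path=/; max-age=" + 60 * 60 * 24 * 365;
    </script>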