
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Good afternoon! For SEO I put all of the cities and states my customer serves (over 40) in the footer. Will this help or hurt SEO? Also, if it does hurt, is it better to create a page of the cities we serve and write some content around the different communities? Thank you!

    | EmSt
    0

  • Hi, I have checked past Q&As and couldn't find anything on this, so I thought I would ask.
    I have recently noticed my URLs gaining the following at the end: mydomain.com/?fullweb=1. I can't seem to locate where these URLs are coming from or how they are being created, and they are causing duplicate content in Google. Has anyone had any previous experience with something like this? Any information would be a great help. Thanks, E

    | Direct_Ram
    0
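
    A minimal sketch for the ?fullweb=1 question above, assuming an Apache host and that the parameter changes nothing visitors need: either point a canonical tag at the clean URL, or 301 the parameterised variants back to it, roughly like this in .htaccess:

      RewriteEngine On
      # If the query string contains fullweb=1, redirect to the same path with no query string
      RewriteCond %{QUERY_STRING} (^|&)fullweb=1(&|$)
      RewriteRule ^(.*)$ /$1? [R=301,L]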

  • Hello friends, I am having a problem with my website. I launched it four months ago and have done some SEO work on backlinks, but when I look in Open Site Explorer it does not show any backlinks, and the site only shows 1 DA and 1 PA. Why is that?

    | docbeans
    0

  • Hi everyone! My website is not being crawled regularly by Google - there are weeks when it's regular, but for the past month or so it has gone uncrawled for seven to eight days at a time. There are some specific pages that I want to get ranked, but of late they are not being crawled AT ALL unless I use the 'Fetch as Google' tool! That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and the website as a whole), and backlinking is a regular and ongoing process as well. A sitemap is in place too, and I have resubmitted it once. This issue is detrimental to website traffic and rankings! Would really appreciate insights from you guys. Thanks a lot!

    | farhanm
    1

  • I have a big site; is there a way to know which pages are not indexed? I know you can use site:, but with a big site it's a mess to check page by page. Is there a tool or a system to check an entire site and automatically find the non-indexed pages?

    | markovald
    0

  • I've been kind of neglecting the WordPress installations on my websites and noticed many showing duplicate content between author archives and tags, tags and single posts, and categories and single posts. Should this be a concern? What's the best way of fixing it? Thanks

    | cgman
    0
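
    One common way to handle archive duplication like the above, as a sketch: keep the single posts indexable and mark the tag, author, and category archives noindex,follow (most WordPress SEO plugins expose this as a setting, or it can be emitted in the archive templates):

      <!-- Output in the <head> of tag, author, and category archive pages only -->
      <meta name="robots" content="noindex,follow" />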

  • I'm experiencing some recent significant drops in rankings across the board for a client of mine, and I suspect it's probably related to Panda. Their internet presence features completely unique, useful, well-written content by certified industry experts. Further, all content is of proper length and serves a core purpose: providing helpful information to their viewers. Where I think things potentially go wrong is that they have around 20 microsites in operation, including multiple Web 2.0 blogs. There are also multiple sites in operation that target more specific areas of the same city. Again, all of the content is unique, but the sites all feature content from the same industry and broad topic. Despite everything being 100% unique, I fear it's excessive. Does anyone know if Panda may target this type of approach even when the quality and uniqueness are appropriate?

    | BrandishJay
    0

  • Good morning. I put forward the following question in December 2014, https://moz.com/community/q/google-still-listing-old-domain, as pages from our old domain www.fhr-net.co.uk were still indexed in Google. We have submitted two change-of-address requests in WMT, the most recent over six months ago, yet the old pages are still being indexed and we can't see why that would be. Any advice would be appreciated.

    | Ham1979
    0
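
    Old-domain pages usually only drop out once every old URL 301s to its equivalent on the new domain and the old domain stays crawlable (not blocked in robots.txt). A minimal .htaccess sketch for the old site, assuming Apache; the target domain below is a placeholder, not the real new site:

      RewriteEngine On
      # Send every request on the old domain to the same path on the new one (placeholder target)
      RewriteCond %{HTTP_HOST} ^(www\.)?fhr-net\.co\.uk$ [NC]
      RewriteRule ^(.*)$ http://www.newdomain-placeholder.co.uk/$1 [R=301,L]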

  • I have an automotive rental blog with articles that explain the pros of renting a specific model, i.e. the advantages of renting versus buying a new model. These advantages are presented as a bulleted list like this:
    Rental | Buy new car
    Rental:
    Free car insurance
    Free assistance
    etc.
    Buy new car
    You have to pay insurance
    You have to pay assistance
    etc. I want to do this because I want to make all the articles work like landing pages...
    This "advantages box" is about 100 characters, while the general length of articles on my blog is 500-600 characters, so I have an average of 15-20% internally duplicated content across all my articles. Is this bad for SEO? Any alternatives?

    | markovald
    0

  • My Squarespace site www.thephysiocompany.com has seen a sudden jump in 302 redirects in the past 30 days - gone from 0 to 302 (ironically). They are not detectable using generic redirect-testing sites, and Squarespace has no explanation. Any help would be appreciated.

    | Jcoley
    0

  • Hi, I have recently noticed my site works both with a trailing / at the end of a URL and without. I wanted to know if there is any SEO impact from this. Will they be seen as two different pages? If so, which is the best option to go for: www.mydomain.com/page/ or www.mydomain.com/page? Thanks, E

    | Direct_Ram
    0
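
    They can indeed resolve as two URLs, so it's worth picking one form, 301-redirecting the other, and using the chosen form consistently in canonicals and internal links. A minimal sketch that forces the trailing-slash version, assuming Apache and extensionless URLs:

      RewriteEngine On
      # Add a trailing slash to any request that isn't an existing file and doesn't already end in /
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteRule ^(.*[^/])$ /$1/ [R=301,L]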

  • Hello Moz community! I would like to make a special 301 redirect through my htaccess file. I am a total noob when it comes to regexes and 301 redirects. I would like to 301-redirect this URL: http://www.legipermis.com/stages-points/">http://www.legipermis.com/stages-points/</a></p>; (yes, it's in Google's index, and the strange URL includes that trailing ;) to http://www.legipermis.com/stages-points/ I have already included a canonical tag as a precaution, and I would like to remove the bad URL with a 301 redirect and via the removal tool in GWT (but the removal tool can't "eat" this kind of URL). Please bear in mind that I am not an expert on 301 redirects and regexes, and no 301-redirect generator works properly for such a strange URL (which triggers content duplication, already mitigated with the canonical tag). Thanks for your help.

    | LegiPermis
    0
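
    One hedged .htaccess sketch for a junk-suffixed URL like the one above, assuming Apache: collapse anything appended after the clean path back to it. Note that the pattern below would also catch any legitimate deeper URLs under /stages-points/, so it needs testing before going live:

      RewriteEngine On
      # Redirect /stages-points/<anything extra> back to the clean URL
      RewriteRule ^stages-points/.+ http://www.legipermis.com/stages-points/ [R=301,L]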

  • A few months ago we switched all product page URLs on our e-commerce site to https. Recently, working on the site, I was appalled at how slowly our pages were loading, and on investigating further with our hosting partner they advised switching back to http for all of the site content to help page speed. We obviously still need to use https in the cart and checkout. I think Google will be pushing all commerce web pages to https, but for now I need to improve page load speed. Will the switch back from https to http impair our keywords? https://www.silverandpewtergifts.com/

    | silverpewter
    0

  • Hello, I have 15 online shops owned by one company, and the web developer put a link to the other shops in the header of each website. It looks like a link farm, but they are all for one company. In some shops we have a huge number of products, i.e. a huge number of outgoing links to the other shops, like a total of a million outgoing followed links to the other online shops. We have had this for almost two years, we didn't get penalized by Google, and it seems that Google has worked out that all the sites are owned by one company. My questions are: 1. Will we be penalized by Google in future if they decide these are spam links? 2. What will happen if I change these links to nofollow, or remove them from the header? 3. Do these outgoing links affect ranking? 4. Is PR still a metric for Google, since it is calculated with a formula based on incoming and outgoing links? Thanks in advance, Housam Smadi

    | anubis2
    0

  • Hello Mozzers! My company is about to launch a large-scale content project with over 100 pieces of newly published content, and I'm being asked what the date stamp for each article should be. Two questions:
    1. Does it hurt an article's SEO juice to have a lot of content with the same "published on" date?
    2. I have the ability to manually update each article's date stamp. Is there a recommended best practice? P.S. Google has not crawled any of these pages yet.

    | Vacatia_SEO
    1

  • Hi, so I've done a crawl of the site using Screaming Frog. There are a few old category and sub-category pages which don't exist any more, but somehow the crawler is finding them. An example is below: http://www.ebuyer.com/store/Home-Appliances/cat/Health-&-Beauty/subcat/Male-Grooming Just wondering if anybody has any ideas about how I could find these URLs and remove them from the site. Any ideas would be really appreciated. Thanks, Andy

    | Andy-Halliday
    0
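
    Screaming Frog's Inlinks tab on each 404 URL will show which pages still link to it, which is usually the quickest way to find and remove the stale links. If an old category genuinely shouldn't exist any more, a 410 tells crawlers it is gone for good; a minimal sketch, assuming Apache mod_alias:

      # Return 410 Gone for a retired sub-category URL (one directive per retired path)
      Redirect gone /store/Home-Appliances/cat/Health-&-Beauty/subcat/Male-Grooming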

  • I'm working with a client who uses a CMS which loads meta tags into their site through its backend. On-page I see this in the source:

    | medtouch
    0

  • There is an issue on one of our sites where many of the sitemap URLs are not being indexed (at least 70% are not indexed). The URLs in the sitemap are normal URLs without any strange characters attached to them, but after looking into it, it seems a lot of the URLs get "#." plus a character sequence appended once you actually visit them. We are not sure if the AddThis bookmark widget could cause this, or if it's another script doing it. For example, the URL in the sitemap is http://example.com/example-category/0246, but the URL once you actually go to that link is http://example.com/example-category/0246#.VR5a. Just for further information, the XML file does not have any style information associated with it and is in its most basic form. Has anyone had similar issues with their sitemap not being indexed properly? Could this be the cause of many of these URLs not being indexed? Thanks all for your help.

    | GreenStone
    0

  • My client has recently built a new site (we did not build this), which is a subdomain of their main site. The new site is: https://addstore.itelligencegroup.com/uk/en/. (Their main domain is: http://itelligencegroup.com/uk/) This new Addstore site has recently gone live (in the past week or so) and so far, Google appears to have indexed 56 pdf files that are on the site, but it hasn't indexed any of the actual web pages yet. I can't figure out why though. I've checked the robots.txt file for the site which appears to be fine: https://addstore.itelligencegroup.com/robots.txt. Does anyone have any ideas about this?

    | mfrgolfgti
    0

  • Hello all, I have added the suggested code to my homepage as follows, but I'm not getting the box in the SERP. Is there a syntax error, a placement error, or something else? Please help! This was added two months ago.

    | vivekrathore
    0

  • Hi, I'm dealing with an ecommerce client who sells furniture. Each category landing page has a menu on the left-hand side that allows you to filter by colour, material, brand etc. Take the www.example/double-beds page as an example: if you select 'Wood' from the 'Material' filter, the URL changes to www.example/Category/Browse?PageNumber=&ViewAs=&ObjectEntityKey=1916&PageSize=15&SortBy=&filterOptions=47&filterOptions=47 and all the wooden double beds are displayed. As this new URL contains some of the same products/content as www.example.com/double-beds, where do we stand from an SEO/duplicate content point of view? Are we at risk of a duplicate content slap? Cheers, Lewis

    | PeaSoupDigital
    0
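
    A common pattern for filtered listing URLs like the Browse?...&filterOptions=47 example above is a canonical tag on the filtered view pointing back to the clean category page, so the variants consolidate instead of competing. A minimal HTML sketch using the example URLs from the question:

      <!-- In the <head> of the filtered view (e.g. /Category/Browse?...&filterOptions=47) -->
      <link rel="canonical" href="http://www.example.com/double-beds" />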

  • If the switch is made from http to https (with 301 redirects from http to https) should the disavow file be copied over in GWT so it is also uploaded against the https as well as the http version?

    | twitime
    0

  • Hi all, our category meta title tags are a little woeful, so I'm in the process of rewriting them. Let's say you have a product for sale... some inkjet cartridges for a Canon BJ10V printer, for example. In an effort to keep things concise I was thinking that for this category I should have the meta title set simply as 'Canon BJ10V Inkjet Cartridges', perhaps with our company name after this text (and a pipe delimiter). This takes us to just under 50 characters, which is ideal, but doesn't include any real keyword variation and will result in the company name being duplicated at the tail of the title tag on 6,000-odd pages. A large number of my competitors have title tags along the lines of 'Canon BJ10V Cheap Inkjet Cartridges for Canon BJ-10V Ink Printers'. I understand the reasoning behind this, but does the variation of keywords compensate for the fact that the title looks spammy (to both humans and search engines)? What would you do: keep it clean and concise, or stuff the title full of keywords? In the event of the former, would you include the company name in each title, in the knowledge that they would be well under 50 characters without it? Thanks for your help.

    | ChrisHolgate
    1

  • My website has a .com domain. However, I have noticed that the local businesses ranking all have a .co.uk (UK business) TLD (check "plumbers southampton", for example). I have also noticed, on checking my SERP rankings, that I'm on page 1 when searching on google.com but page 2 on google.co.uk. Being UK-based, I would assume most of my customers will be directed to google.co.uk, so I'm wondering how much of an impact this actually makes. Would it be worth purchasing a .co.uk domain and transferring my website to that? Or running them both at the same time and setting up a 301 redirect from my .com to the .co.uk? Thanks

    | Marvellous
    0

  • For example, my website has mysite.com/randomunusedpage.html. No links point to that page from the website, but it is published (it came with the WP theme). Will that hurt my SEO? Should I delete the page, or is it harmless? Thanks

    | Marvellous
    0

  • Hello fellow SEOs! I have an intriguing question concerning different TLDs for the same domain, e.g. www.mainwebsite.com, www.mainwebsite.eu, www.mainwebsite.au, www.mainwebsite.co.uk, etc. Now, assuming that all these websites are similar in terms of content, will our lovely friend Google consider all these TLDs as one and the same domain, or will this cause a duplicate content problem? If the latter, how should I fix it? Thanks for your precious help, guys!

    | SEObandits
    1
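
    Separate country TLDs with near-identical content are normally connected with hreflang annotations (plus geotargeting in Search Console for generic TLDs like .eu), so Google treats them as regional alternates rather than duplicates. A minimal sketch, assuming all versions are in English and using the hypothetical domains from the question, placed in the <head> of the equivalent page on every version:

      <link rel="alternate" hreflang="en-us" href="http://www.mainwebsite.com/" />
      <link rel="alternate" hreflang="en-gb" href="http://www.mainwebsite.co.uk/" />
      <link rel="alternate" hreflang="en-au" href="http://www.mainwebsite.au/" />
      <link rel="alternate" hreflang="en" href="http://www.mainwebsite.eu/" />
      <link rel="alternate" hreflang="x-default" href="http://www.mainwebsite.com/" />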

  • My website needs to move to a new domain. Since we started this website we have had both the sales landing pages and a blog, so I need to redirect a.com to b.com and a.com/blog/ to b.com/blog/, and I plan to do it in these steps to keep the SEO benefit. For a.com: prepare the new site by duplicating the content of the old site (keeping it noindex, nofollow while staging); export all URLs to Excel with Xenu, then cross-check that the content is the same (we can check this if the titles match); set up the 301 redirects; and let Googlebot know by submitting the change in Google Webmaster Tools (Google Search Console). For a.com/blog/: since we use WordPress as our blog CMS, install WordPress at b.com/blog/, back up the SQL database and files from cPanel, re-upload the database and files to the new server, export all URLs containing /blog/ and cross-check them in Excel again, then set up the 301 redirects. Is anything in this method wrong, or does anything need to change? Please advise, as this is technical SEO work I have never done before. After we redirect the old domain to the new domain, how long will Google take to check it and pass the same benefit to us? What do we need to do in Moz to keep checking our issues and rankings? Plus, what can we do about the noindex, nofollow pages that belong to a.com? My website also creates these pages for each marketing campaign, to keep Google from indexing them.

    | ASKHANUMANTHAILAND
    0
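
    The one technical piece the plan above depends on is a one-to-one 301 from every old URL (blog paths included) to its equivalent on the new domain, left live on the old domain after launch. A minimal .htaccess sketch, assuming Apache on the old server and using the a.com/b.com placeholders from the question:

      RewriteEngine On
      # Map every path on the old domain, /blog/ included, to the same path on the new domain
      RewriteCond %{HTTP_HOST} ^(www\.)?a\.com$ [NC]
      RewriteRule ^(.*)$ http://b.com/$1 [R=301,L]

    The old domain has to stay registered and crawlable (no robots.txt block, no noindex on the redirecting host), or Google will never see the redirects.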

  • New to Moz and this forum, so be gentle. 🙂 I’m in the process of overhauling a generally neglected website and have just finished some research on long tail keywords. My question is, how do I implement these? For example, I’ve got a product “Acme Widget” which has its own page on the site (and ranking well for the product name itself). I have lots of long tail keyword sets which describe key benefits of the products – some of which appear in the product copy, others which don’t (perhaps because the thing that a user may search for is ugly/bad-English in copy). For the sake of argument, let’s say I have the following long tail keywords for my Acme Widgets. cheap red widget los angleles widget strong green widgets florida What is the best way to implement these? Do I need to simply incorporate the text into my main Acme Widgets page, or do I need to have separate pages which are highly targeted to each long tail keyword? The problem with the former is unnatural/ugly copy. The problem with the latter is that coming up with enough content to justify (and rank) a page on each keyword set would be quite a challenge. Regards,
    Warren

    | Warren_Vick
    0

  • We have a Magento store with multiple stores/domains set up. There is really only one reason we have the multiple domains: we use an automatic GeoIP store switcher to send customers to the right store, so that they pay the proper shipping, see the proper pricing etc., plus a couple of small differences in the design templates. But all the content is identical. So we have: domain.com (main website)
    domain.ca (where most other countries are directed to, based on GeoIP)
    domain.eu Since the content is the same, what is the best strategy here? I looked at several options: 1. Custom canonical URLs, making each page on the .ca and .eu use the canonical URL of the .com
    2. Completely block the .ca and .eu in robots.txt
    3. Leave it the way it is

    | maartenvr
    0
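
    A fourth option worth weighing for the setup above is hreflang between the country stores, a sketch of telling Google the .com, .ca and .eu versions are regional alternates rather than duplicates (a cross-domain canonical to the .com, by contrast, would effectively take the .ca and .eu out of the results). The path and language codes below are assumptions:

      <!-- In the <head> of the equivalent page on each store -->
      <link rel="alternate" hreflang="en-us" href="http://domain.com/some-page" />
      <link rel="alternate" hreflang="en-ca" href="http://domain.ca/some-page" />
      <link rel="alternate" hreflang="en" href="http://domain.eu/some-page" />
      <link rel="alternate" hreflang="x-default" href="http://domain.com/some-page" />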

  • Here is the JavaScript I am using to send users to the mobile version of my website: This is causing major issues in Bing and Yahoo, as the mobile website is the only thing ranking. I'd love any help dissecting this issue. Thanks in advance.

    | ShawnW
    0
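
    The JavaScript itself didn't come through in the post above, but the usual way to stop a separate mobile site outranking the desktop site is to pair the two explicitly: the desktop page declares its mobile alternate, and the mobile page canonicals back to the desktop URL. A minimal sketch with placeholder URLs:

      <!-- On the desktop page, e.g. http://www.example.com/page -->
      <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page" />

      <!-- On the corresponding mobile page -->
      <link rel="canonical" href="http://www.example.com/page" />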

  • Evening all, I've performed a Screaming Frog technical crawl of a site, and it's returning links like this as 404s: http://clientsite.co.uk/accidents-caused-by-colleagues/js/modernizr-2.0.6.min.js Now, I recognise that Modernizr is used for detecting features in the user's browser - but why would it have created an indexed page that no longer exists? Would you leave them as is? 410 them?  Or do something else entirely? Thanks for reading, I look forward to hearing your thoughts! Kind regards, John.

    | Muhammad-Isap
    0

  • For one of my sites, A-1 Scuba Diving And Snorkeling Adventures, Google is seeing way more pages than I actually have. It sees almost 550 pages, but I only have about 50 pages in my XML sitemap. I am sure this is an error on my part. Here are the search results that show all my pages. Can anyone give me some guidance on what I did wrong? Is it a canonical URL problem, a redirect problem, or something else? Built on WordPress. Thanks in advance for any help you can give. I just want to make sure I am delivering everything I can for the client.

    | InfinityTechnologySolutions
    0

  • For several search terms I get sitelinks for the page http://www.waikoloavacationrentals.com/kolea-rentals/kolea-condos/ It makes sense that that page would be a sitelink, as it is one of my most-used pages, but the problem is Google gave it the sitelink title "Kolea 10A". I am having zero luck making any sense of why that was chosen; it should be something like "Kolea Condos" or something of that nature. Does anyone have any thoughts on where Google is coming up with this?

    | RobDalton
    0

  • Recently, I discovered that only the first four reviews on our product pages are crawled and indexed. Example: http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/253432 I'm assuming it's due to the canonical tag on the product page, which points to http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/253432. When you click on page 2 of the reviews, the URL does not change, but the next batch of reviews appears on the product page; same with page 3, etc. The problem is the additional pages are not being crawled and indexed. We have to have the canonical on the product page because our platform creates multiple URLs for each product page by including each category where the product resides, related link parameters, etc. in the product URL (example: http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/patio-furniture/outdoor-furniture/253432) - trust me, it gets ugly! I've researched other Moz answers and found there appear to be a couple of ways to fix the issue; any ideas/help/guidance/examples on the options below are greatly appreciated! 1. Show only four reviews on the first page and place the remaining reviews on a new page by themselves (similar to how Amazon does it) - however, I would rather keep all of the reviews on the product page if possible. 2. Add page 2, page 3, etc. parameters to the URL to display the remaining reviews and add rel=prev/next. If we chose option 2, would each product page have a different canonical? If so, would it create a duplicate content issue, since the above-the-fold content, title tag and meta description would all be the same? Also, would you include each additional page in the sitemap? We had a similar issue with our category pages and implemented a "view all" canonical. Would that work for our reviews? Thanks in advance for your help!

    | Improvements
    0
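
    If option 2 above were chosen, one sketch (with a hypothetical reviewPage parameter, since the real one isn't known) is for each paginated review URL to canonicalise to itself and link its neighbours with rel=prev/next; the paginated pages would not normally create a duplicate-content problem because the review content itself differs, and they would not usually need to be in the sitemap:

      <!-- In the <head> of .../253432?reviewPage=2 (hypothetical parameter) -->
      <link rel="canonical" href="http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/253432?reviewPage=2" />
      <link rel="prev" href="http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/253432" />
      <link rel="next" href="http://www.improvementscatalog.com/eucalyptus-deep-seat-furniture-group/253432?reviewPage=3" />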

  • I searched in Google for the number of URLs still indexed on the seomoz.org domain since it changed to moz.com, and I am surprised that after all this time more than 15,000 URLs are still indexed: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Aseomoz.org%20inurl%3Aseomoz.org If I click on any of the results it redirects (301) to the new domain, so the redirects are working, but Google still keeps these URLs in the index.
    What could be the reason? Won't this cause a duplicate content issue for moz.com?

    | Yosef
    0

  • We're writing more out of curiosity... Clicking "Download latest links" within 'Links to your site' in Google Webmaster Tools would usually bring back links discovered recently. However, the last few times (for numerous accounts) it has brought back a lot of legacy links - some from 2011 - and includes nothing recent. We would usually expect to see at least a dozen each month. Has anyone else noticed this, or do you have any advice? Thanks in advance, Ant!

    | AbsoluteDesign
    0

  • I have made no changes to my site for a while, and on 7/14 I had a 20% drop in indexed pages from the sitemap. However, my total indexed page count has stayed the same. What would cause that?

    | EcommerceSite
    0

  • Recently SEMrush added a feature to its site audit tool called "SEO Ideas." For the specific site I'm looking at, its ideas consist mostly of suggesting words to add to a page for the page/my phrase(s) to perform better. It suggests this even when the term(s) or phrase(s) it's looking at are #1. Has anybody used this tool, or something similar, and found it to be valuable, and if so, how valuable? The reason I ask is that it would be a fair amount of work to go through these pages and find ways to add the selected words and phrases and, frankly, it feels kind of 2005 to me. Your thoughts? Thanks... Darcy

    | 94501
    0

  • I am managing my company's spammy backlinks using Open Site Explorer. Our company owns a few URLs that are related to our company or are iterations of our main URL, and all of these additional URLs have 301 redirects to our main domain. Open Site Explorer has identified one of these URLs as having a spam score of 8, indicating a 56% chance of Google penalization, which is obviously a red flag. Instead of being redirected to our main domain upon visiting the URL, I was directed to what seems to be an automatically generated, generic web page with links that appear to have been generated from keywords on our main domain. I have seen this type of page before when mistyping URLs; they tend to look the same, with a black background, the URL written in grey at the top, and a rectangular related-links bar. Is anyone familiar with my problem, and could you offer any advice? Thanks, Ben

    | SOLVISTA
    0

  • Does anyone know anything about the "teracent-feed-processing" user agent? The IPs the user agent comes from include 74.125.113.145, 74.125.113.148, 74.125.187.84, etc. In our logs, two out of three requests are made by it, and it is causing the server to crash.

    | propertyshark
    0
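
    If those requests are genuinely bringing the server down, one stopgap (assuming Apache) is to refuse that user agent at the web-server level while investigating; the 74.125.x.x addresses listed appear to sit in Google-owned ranges, so filtering by user agent is less risky than blocking the IPs outright:

      RewriteEngine On
      # Return 403 Forbidden to the teracent-feed-processing user agent
      RewriteCond %{HTTP_USER_AGENT} teracent-feed-processing [NC]
      RewriteRule .* - [F,L]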

  • Hello, one of our clients - a cleaning business - has a heck of a lot of spammy nofollow links pointing to their site. The majority of the links are from comments or 'pingbacks', most with the anchor text 'cheap nfl jerseys' or 'cyber monday ugg boots'. After researching the subject of spammy nofollow links, it seems there is a lot of uncertainty regarding the negative effect these could have on your SEO efforts. So I guess my question to the community is: if your site was suddenly hit by a plethora of spammy nofollow links, what would you do, and why? Cheers, Lewis

    | PeaSoupDigital
    0

  • Hi guys, when you have several tabs of products on your website, you can usually navigate to page 2, 3, 4, etc.
    You can add the link rel="prev" and link rel="next" tags to make sure that one page gets indexed/ranked by Google - am I correct? However, this still means that all the pages can get indexed, right? For example, a webshop makes use of the link rel="prev" and rel="next" tags, yet in the Google results all the separate tab pages are still visible/indexed:
    http://www.domain.nl/watches/?tab=1
    http://www.domain.nl/watches/?tab=24
    http://www.domain.nl/watches/?tab=19
    etc. Can we prevent this, and make sure only the main page gets indexed and ranked, by adding a canonical link on every 'tab page' pointing to the main page, www.domain.nl/watches/? I hope I explained it well, and I'm looking forward to hearing from you. Regards, Tom

    | AdenaSEO
    1
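
    Correct - rel=prev/next consolidates signals across the series but does not by itself keep the component pages out of the index. If only /watches/ should rank, a canonical from each tab URL to the main page is one sketch (appropriate only when the tab pages are just alternative views of the same listing):

      <!-- In the <head> of http://www.domain.nl/watches/?tab=24 and the other tab URLs -->
      <link rel="canonical" href="http://www.domain.nl/watches/" />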

  • We have three different domains for geotargeting (.za, .uk and .com). Each site at the moment has the same content, with only country-specific details like currency changed. What is the best way to get maximum SEO benefit when posting new content? When we post new content, should we post it to all three domains (the same content), or will Google only index the URL on the domain which is crawled first? Thanks in advance

    | aquaspressovending
    0

  • We recently launched a new site - on June 4th we submitted our sitemap to Google and almost instantly had all 25,000 URLs crawled (yay!). On June 18th, we made some updates to the title and description tags for the majority of pages on our site and added new content to our home page, so we submitted a new sitemap. So far the results have been underwhelming and Google has indexed a very low number of the updated pages. As a result, only a handful of the new titles and descriptions are showing up on the SERPs. Any ideas as to why this might be? What are the tricks to having Google re-index all of the URLs in a sitemap?

    | Emily_A
    0

  • Hello fellow digital marketers! As an in-house kind of guy, I rarely get to audit sites other than my own, but I was tasked with auditing another, so I ran it through Screaming Frog and the usual tools. A couple of URLs came back with timeout messages, so I checked them manually - they're apparently part of a blog's archive: http://www.bestpracticegroup.com/tag/training-2/ I click 'read more' and it takes you to: http://www.bestpracticegroup.com/pfi-contracts-3-myth-busters-to-help-achieve-savings/ The first URL seems entirely redundant. Has anyone else seen something like this? An explanation as to why something like that would exist, and how you'd handle it, would be grand! Much appreciated, John.

    | Muhammad-Isap
    0

  • Hi there,
    I have a strange issue where pages are redirecting to the homepage. Let me explain: my website is http://thedj.com.au. When I type in www.thedj.com.au/payments it redirects to https://thedj.com.au (even though it should be going to the page https://thedj.com.au/payments). Any idea why this is and how to fix it? My htaccess file is below:

    # BEGIN HTTPS Redirection Plugin
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule ^home.htm$ https://thedj.com.au/ [R=301,L]
    RewriteRule ^photos.htm$ http://photos.thedj.com.au/ [R=301,L]
    RewriteRule ^contacts.htm$ https://thedj.com.au/contact-us/ [R=301,L]
    RewriteRule ^booking.htm$ https://thedj.com.au/book-dj/ [R=301,L]
    RewriteRule ^downloads.htm$ https://thedj.com.au/downloads/ [R=301,L]
    RewriteRule ^payonline.htm$ https://thedj.com.au/payments/ [R=301,L]
    RewriteRule ^price.htm$ https://thedj.com.au/pricing/ [R=301,L]
    RewriteRule ^questions.htm$ https://thedj.com.au/faq/ [R=301,L]
    RewriteRule ^links.htm$ https://thedj.com.au/links/ [R=301,L]
    RewriteRule ^thankyous/index.htm$ https://thedj.com.au/testimonials/ [R=301,L]
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://thedj.com.au/ [L,R=301]
    </IfModule>
    # END HTTPS Redirection Plugin

    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    # END WordPress

    RewriteCond %{HTTP_HOST} ^mrdj.net.au$ [OR]
    RewriteCond %{HTTP_HOST} ^www.mrdj.net.au$
    RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

    RewriteCond %{HTTP_HOST} ^mrdj.com.au$ [OR]
    RewriteCond %{HTTP_HOST} ^www.mrdj.com.au$
    RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

    RewriteCond %{HTTP_HOST} ^thedjs.com.au$ [OR]
    RewriteCond %{HTTP_HOST} ^www.thedjs.com.au$
    RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

    RewriteCond %{HTTP_HOST} ^theperthweddingdjs.com$ [OR]
    RewriteCond %{HTTP_HOST} ^www.theperthweddingdjs.com$
    RewriteRule ^/?$ "https://thedj.com.au/" [R=301,L]

    RewriteCond %{HTTP_HOST} ^thedjs.net.au$ [OR]
    RewriteCond %{HTTP_HOST} ^www.thedjs.net.au$
    RewriteRule ^/?$ "https://thedj.com.au" [R=301,L]

    | HeadStud
    0
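
    A hedged reading of the file above: the catch-all in the HTTPS block rewrites every non-HTTPS request to the bare homepage because its target drops the captured path. A path-preserving sketch of that one rule (to be tested before deploying):

      RewriteCond %{HTTPS} off
      # Keep the requested path (and query string) when forcing HTTPS
      RewriteRule ^(.*)$ https://thedj.com.au/$1 [R=301,L]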

  • Hi, are there any implications of having a parallax website where the URL doesn't change as you scroll down the page, so basically the whole site sits under the same URL? The URL does change when you click on the menu, however. Cheers

    | National-Homebuyers
    0

  • I am a relatively new SEO professional. Can someone please look at this link and tell me whether this is white-hat or black-hat cloaking? http://loghomeconstructionpro.com/ It has an overlay landing page over an HTML page. A partner promoted this to me as proprietary software, when really it just looks like cloaking. I want to do my business above board and this doesn't feel right, but I would like some opinions on it before I pull the plug on my partner. Thanks all for the advice and the help. GD

    | gdavey
    0

  • Hi, what's the "correct" way of redirecting typo domains? The DNS A record points to the same IP address as the correct domain name, and then there are 301 redirects for each typo domain in the .htaccess. Should subdomains on the typo URLs still redirect to www, or should they redirect to the corresponding subdomain on the correct URL in case that subdomain exists?

    | kuchenchef
    0

  • Hello, I have a question about internal links built in the pattern below - does Google value this kind of internal-link pattern? For example, I have three pages on my website: A, B and C. Page A is the homepage, B is a category page and C is a product page. On page C, I build internal links like this: Home > Category > product page

    | tanveerayakhan
    0
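
    Breadcrumb-style internal links are a normal, crawlable pattern. A minimal HTML sketch for page C, with placeholder URLs (structured-data markup for breadcrumbs can be layered on top, but isn't required for the links to pass value):

      <!-- Breadcrumb navigation on the product page (page C); URLs are placeholders -->
      <nav>
        <a href="/">Home</a> &gt;
        <a href="/category/">Category</a> &gt;
        <span>Product page</span>
      </nav>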
