
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello, I am looking for some help here with an estate agent property website. I recently finished the Moz crawl report and noticed that Moz sees some pages as duplicates, mainly the pages that list properties as page 1, 2, 3, etc. For example: http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=2, http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=3, and so on. Now, I know that best practice says I should set a canonical URL pointing to http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=all, but here is where my problem is: http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=1 contains good written content (around 750 words) before the listed properties are displayed, while the "page=all" page does not have that content, only the properties. Also, http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=1 is very similar to the originally designed landing page http://www.xxxxxxxxx.com/property-for-rent/london/houses. I would like your advice on the best way to canonicalise this and sort the problem (a sketch follows below this question). My original thought was to canonicalise to http://www.xxxxxxxxx.com/property-for-rent/london/houses instead of the "page=all" version, but your opinion will be highly appreciated.

    | artdivision
    0
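
    One common alternative to canonicalising every paginated page to "?page=all" was Google's rel="prev"/rel="next" pagination markup combined with a self-referencing canonical on each page. A minimal sketch for the <head> of ?page=2, reusing the placeholder domain from the question; this is one option among several, not a definitive recommendation for this site:

      <link rel="canonical" href="http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=2" />
      <link rel="prev" href="http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=1" />
      <link rel="next" href="http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=3" />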

  • Hi there, we'll be redesigning our website www.example.com and as such want to 302 users from www.example.com and all other pages to a new URL, www.example.com/landingpage, while we go through the redesign. The new landing page will have copy and a sign-up form on it, and once the redesign is completed we plan on removing the 302 and sending all traffic back to the original URL, www.example.com. I'd just like to check: is a 302 the most appropriate option here (a sketch follows below)? Obviously, once the redesign is completed we'll 301 any old URLs to their new locations.

    | Hemblem
    0
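
    A 302 does signal a temporary move, which matches this situation. A minimal .htaccess sketch of a site-wide temporary redirect, assuming Apache with mod_rewrite and the placeholder path from the question; the extra conditions keep the landing page and its own assets from redirecting to themselves:

      RewriteEngine On
      RewriteCond %{REQUEST_URI} !^/landingpage
      RewriteCond %{REQUEST_URI} !\.(css|js|png|jpe?g|gif)$
      RewriteRule ^ /landingpage [R=302,L]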

  • Hey Moz Community, the Shopify ecommerce platform auto-generates the XML sitemap and robots.txt for you, and frustratingly there is no way to augment either of them. If I noindex a page it will still show up in the sitemap, causing an inconsistency with the sitemap submitted to GWT. In theory, if I put my own version of the sitemap on the site and point GWT to that version, would this solve the inconsistency? Or would Googlebot still go in and crawl the default /sitemap.xml anyway? Any suggestions and insight are greatly appreciated!

    | paul-bold
    0

  • Hi, I hope someone can help me. I have launched a new website and am trying hard to make everything perfect. I have been using Google Webmaster Tools (GWT) to ensure everything is as it should be, but the crawl errors being reported do not match my site. I mark them as fixed and then check again the next day, and it reports the same or similar errors again. Example: http://www.mydomain.com/category/article/ (this would be a correct structure for the site). GWT reports: http://www.mydomain.com/category/article/category/article/ 404 (it does not exist, never has and never will). I have been to the pages listed as linking to this page and they do not contain links in this form. I have checked the page source code and all links from the given pages use the correct structure, so it is impossible to replicate this type of crawl. This happens across most of the site; I have a few hundred pages all ending in a trailing slash, and most pages of the site are reported in this manner, making it look like I have close to 1,000 404 errors when I am not able to replicate this crawl using many different methods. The site is using an .htaccess file with redirects and a rewrite condition. Rewrite condition (needed to redirect when there is no trailing slash):
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !\.(html|shtml)$
    RewriteCond %{REQUEST_URI} !(.*)/$
    RewriteRule ^(.*)$ /$1/ [L,R=301]
    The above condition forces the trailing slash on folders. Then we are using redirects in this manner: Redirect 301 /article.html http://www.domain.com/article/ In addition to the above, we had a development site while I was building the new site, http://dev.slimandsave.co.uk, which had been spidered without my knowledge until it was too late. So when I put the site live I left the development domain in place (http://dev.domain.com) and redirected it like so:
      <IfModule mod_rewrite.c>
      RewriteEngine on
      RewriteRule ^ - [E=protossl]
      RewriteCond %{HTTPS} on
      RewriteRule ^ - [E=protossl:s]
      RewriteRule ^ http%{ENV:protossl}://www.domain.com%{REQUEST_URI} [L,R=301]
      </IfModule>
    Is there anything that I have done that would cause this type of redirect 'loop'? Any help greatly appreciated.

    | baldnut
    0

  • Almost everything we talk about here involves making our content, or ourselves, more findable. However, a few years ago someone asked me this question: "How do I disappear from the internet?" She had a blog, a Facebook account, and some miscellaneous community engagement. Perhaps she had witnessed a murder by a mafia hit man. Maybe she had an ex-boyfriend who was stalking her. She could have been going overseas as a spy. Or maybe she was in trouble with the law. Or she could have just been looking for a job but fearful of the content from her college years. Whatever it was, she wanted to disappear from the web ... to go off the grid. Here was my advice: Export the blog content and delete the blog. If you don't delete the blog, hide it with robots.txt, put it behind a login, and put a meta noindex, nofollow, noarchive, noodp tag on all the pages. Export your Facebook account and delete it. Find all the forums and sites you posted to. See if you can delete your posts and profile, or at least change your name and other profile content. Then maybe delete the email accounts you used for those profiles. Set up Google Alerts with your name or specific phrases from your profile. I saw a news article today that reminded me of her question and made me wonder what I missed, if anything. In my college art classes, we would sometimes draw the area surrounding the subject - the negative space - rather than draw the actual subject. This question is like that. What does un-findability look like? Maybe that will bring out some insights on optimization we haven't thought of. So if it were you, how would you go about disappearing from the internet given your current level of engagement online? Maybe it would be helpful to imagine yourself in a witness protection scenario. 🙂

    | justin-brock
    0

  • I'm getting ready to do a redesign for a client and one thing that annoys me about the directory structure of the website is that he has files buried deep in the directories. For example, the images are buried like four folders deep in some cases and I would like to move all of those images into an images folder directly below the root. All of those images, however, have already been indexed by google and show up in google images. If I start moving those images around, could it hurt his rankings?

    | ScottMcPherson
    0

  • Hi mozzers, I wanted to know if this type of navigation (shown in the attached screenshot) is SEO friendly. Is it better than a regular drop-down menu navigation? Thanks!

    | Ideas-Money-Art
    0

  • Hi, I received my weekly web crawl and it is saying this:
    | 4  | Duplicate Page Content |
    | 22 | Missing Meta Description Tag |
    | 9  | Duplicate Page Title |
    | 1  | Title Element Too Long (> 70 Characters) |
    | 1  | Title Element Too Short |
    | 1  | 301 (Permanent Redirect) |
    I'm new to SEO and don't know how to fix this; I don't really see how I have Duplicate Page Content or Duplicate Page Title. This is my website: afrohairsolutions.co.uk Thank you in advance.

    | afrohairsolutions
    0

  • My website is https but the default property that was configured on Google WMT was http and wasn't showing me any information because of that. I added an https property for that, but my question is: do I need to delete the original HTTP or can I leave both websites?

    | Onboard.com
    0

  • Hi there, I have a question for you. I am working on a website where typing the URL with any letters in lower or upper case returns a 200 code. Examples: www.examples.com/page1/product www.examples.com/paGe1/Product www.examples.com/PagE1/prOdUcT www.examples.com/pAge1/proODUCt and so on… Although I cannot find evidence of backlinks pointing to my pages with mixed-case URLs, shall I redirect or rel=canonical all the possible case combinations to the lowercase version in order to prevent duplicate content? And if so, do you have any advice on how to complete such a massive job (see the sketch below)? Thanks a lot

    | Midleton
    0
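
    If the site runs on Apache, one common way to force a single lower-case version without listing every combination is mod_rewrite's built-in "tolower" map. A sketch assuming access to the server or virtual-host config (RewriteMap cannot be declared in .htaccess); the map name "lc" is arbitrary:

      RewriteEngine On
      RewriteMap lc int:tolower
      # redirect any path containing an upper-case letter to its lower-case form
      RewriteCond $1 [A-Z]
      RewriteRule ^(.*)$ ${lc:$1} [R=301,L]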

  • I've been tasked with figuring out how to recover our rankings, as we are likely being hurt by an algorithmic penalty. I have no idea if this was the work of a previously hired SEO or the result of negative SEO. How does Google differentiate a site with bad/spammy link building practices from the victim of a negative SEO attack?

    | Syed_Raza
    0

  • Our site is likely suffering an algorithmic penalty from a high concentration of non-branded anchor text, which I am painstakingly cleaning up at the moment. Incremental clean-ups don't seem to be doing much. Google recommends I 'take a machete to them' and basically remove or disavow as much as possible, which I am now seriously considering as an option. What do you guys recommend: should we torch the earth (disavow all links with that anchor text, as in the sketch below) or keep it on life support (slowly and manually identify each bad link)?

    | Syed_Raza
    0
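
    For reference, the disavow file Google accepts is a plain text file with one entry per line; whole domains can be knocked out with a domain: line, which is what a "torch the earth" approach would lean on. The hosts below are made-up examples:

      # entire domains can be disavowed in one line
      domain:spammy-directory-example.com
      # or individual URLs, one per line
      http://blog-network-example.net/some-post.html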

  • Hi, we have around 10 supplementary sites that link to our site; they are now closing down but are out of our control. We could get access to their domains, so how could we maintain the link juice from these old sites that flows to our new site? There will be no websites left on these old supplementary domains, just the domain names.

    | ocelot
    0

  • Hi guys, I've sent a disavow file via webmaster tools. After that, should the backlinks from domains listed in that file disappear from the list of links to my website in webmaster tools? Or does webmaster tools show all the links, whether I've sent disavow file or not?

    | superseopl
    0

  • Hi,
    I've been trying to set up a 301 redirect from http://domestiquecap.com to www.domestiquecap.com, but one was already set up by my client the other way around (from www to non-www), so it's creating a redirect loop. However, we don't know where that original redirect was set up. The .htaccess file doesn't appear to have the redirect and neither does the control panel of our hosting company. We need to turn off that original redirect so that I can instead redirect to the www subdomain (see the sketch below). Where else could this 301 redirect have been set up? Is there a tool to diagnose where the 301 redirect was created so that I can turn it off? I am thinking maybe it was created via the domain registrar (GoDaddy), since the client has the login there and hasn't shared it with me.

    | bshanahan
    0
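
    Once the old www-to-non-www rule is found and removed, a minimal .htaccess sketch for the replacement redirect, assuming Apache with mod_rewrite (if the original rule is still active somewhere, this will simply recreate the loop):

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^domestiquecap\.com$ [NC]
      RewriteRule ^(.*)$ http://www.domestiquecap.com/$1 [R=301,L]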

  • What do you think could be causing my main homepage not to show in the search results for our main search terms, while only sub-pages such as the reservations page show up? P.S. I did not get a manual penalty action email. Thanks,

    | EVERWORLD.ENTERTAIMENT
    0

  • Hi, I recently found a large number of duplicate pages on our site that we didn't know existed (our third-party review provider was creating a separate page for each product whether it was reviewed or not; the pages without reviews are almost identical, so they have been noindexed). Question: how long do you typically have to wait for Google to pick this up on our site? Is it a normal crawl, or do we need to wait for the next Panda refresh (if there is such a thing)? Thanks much.

    | trophycentraltrophiesandawards
    0

  • Hello everyone, we are in the process of going from a .net to a .com, and we have also done a complete site redesign as well as refreshing all of our content. I know it is generally not ideal to do all of this at once, but I have no control over that part. I have a few questions and would like any input on avoiding a loss of rankings and traffic. One of my first concerns is that we have done away with some of our higher-ranking pages and combined them into one parallax-scrolling page. Basically, instead of having a product page for each product, they are now all on one page. This has of course created some difficulty, because search terms we were using for the individual pages no longer apply. My next concern is that we are adding keywords to the ends of our URLs in an attempt to raise rankings. So, an example: website.com/product/product-name/keywords-for-product. If a customer deletes keywords-for-product they end up being redirected back to the page again. Since the keywords cannot be removed, is a redirect the best way to handle this? Would a canonical tag be better? I'm trying to avoid duplicate content, since my request to remove the keywords in URLs was denied. Also, when a customer deletes everything but website.com/product/ it goes to the home page and the URL turns into website.com/product/#. Will those pages with # at the end be indexed separately, or does Google ignore that? Lastly, how can I determine what kind of loss in traffic we are looking at upon launch? I know some is to be expected, but I want to avoid it as much as I can, so any advice for this migration would be greatly appreciated.

    | Sika22
    0

  • Hi - the content on our corporate website is pretty technical, and we include chemical element codes in the text that users would search on (like SO2, CO2, etc.). A lot of the time our engineers request that we list the codes correctly, with a <sub> on the last number. Question - does adding this markup into the keyword affect SEO? The code would look like SO<sub>2</sub>. Thanks.

    | Jenny1
    0

  • Hi Mozzers, I have read all the relevant blogs from media indexing experts like Phil Nottingham and have followed Google's best practice, as well as advice from similar discussions on here. We have submitted video and image sitemaps to Webmaster Tools; the image sitemap has 33 indexed out of 720 submitted images, and the video sitemap 170 indexed out of 738 submitted. With the image sitemap the indexed number (33) has remained steady while the submitted count has grown by over 100 in the last month. The video sitemap has shown signs of indexing new videos, but still not the amount that were submitted. Thus far I have followed Google's sitemap structure guidelines (an example entry is sketched below). We are using CloudFront, so I have added and verified our CloudFront server in the same Webmaster Tools account. If anyone has any advice, it would be most appreciated. There is no duplicate content and the robots.txt is not blocking anything within the sitemap. Image sitemap: view-source:http://www.clowdy.com/sitemap.images.xml

    | Morrreau
    0
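
    For reference, the shape Google documents for an image sitemap entry; the page path and CloudFront hostname below are placeholders, not the asker's real URLs:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
        <url>
          <loc>http://www.clowdy.com/example-page</loc>
          <image:image>
            <image:loc>http://example.cloudfront.net/example-image.jpg</image:loc>
            <image:title>Example image title</image:title>
          </image:image>
        </url>
      </urlset>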

  • I know it is now possible to verify google webmaster tools accounts using google tag manager, but is it also possible to create a link from webmaster tools to tag manager? I would like to connect the two.  I have analytics already linked to tag manager, but can't find a clear answer regarding webmaster tools. Anybody? tx!

    | susancompass
    0

  • The number of indexed pages for my site was 1,100 yesterday and today it is 344. Does anybody have any idea what could cause this? Thank you, Sina

    | SinaKashani
    0

  • Hi, when we search for a phrase (the most searched-for phrase for our company), the meta description which is displayed isn't the one we set, and it hasn't been picked up from any text on the page either. The description is incorrect - it says we have an office in a city that we don't - and it just isn't a very good description generally. What has been suggested to us by our website developers is that the description is being picked up by Google from a website which lists companies' details. The description displayed on that website is the same as the description shown for our company in the search results. But is it possible for Google to ignore the meta description set on our homepage and the other text on the home page, and pick up text from another website to use as our description (see the note below)? Many thanks

    | danieldunn10
    0
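
    If the third-party source really is the Open Directory Project (DMOZ), which Google sometimes used for descriptions at the time, the usual opt-out was the noodp robots directive; it will not help if the text is coming from some other directory. A minimal sketch for the homepage <head>:

      <meta name="robots" content="noodp">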

  • I am looking for an online tool that will display the contents of the .htaccess file. I came across one a month ago, but I cannot recall its name. Thanks

    | irvingw
    0

  • I've begun writing an annual review of local business directories. Post from 2012 is here: http://web.servicecrowd.com.au/blog/top-10-australian-business-directories-in-2012/ New 2014 post is here: http://web.servicecrowd.com.au/blog/top-10-australian-business-directories-2014/ Is this appropriate use? Next year the post will be similar, but different metrics reported and slightly different review. Side note: For some reason the post hasn't been indexed by Google yet. Usually new posts are indexed as soon as they are shared on social media.

    | ServiceCrowd_AU
    0

  • If you're an ecommerce site owner, will you be changing how you deal with unavailable products as a result of the recent video from Matt Cutts? Will you be moving over to a 404 instead of leaving the pages live? For us, as more products were becoming unavailable, I had started to worry about the impact of this on the website (bad user experience, Panda issues from bounce rates, etc.). But, having spoken to other website owners, some say it's better to leave the unavailable product pages there as this offers more value (they rank well so attract traffic and links, and they allow you to get the product back up quickly if it unexpectedly becomes available, etc.). I guess there are many solutions - for example, using ItemAvailability schema (sketched below) - that might be better than a 404 (custom or not). But then, if it's showing as unavailable in the SERPs, will anyone bother clicking on it anyway...? Would be interested in your thoughts.

    | Coraltoes77
    0
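
    For reference, a minimal microdata sketch of the ItemAvailability markup mentioned in the question, using schema.org's Offer vocabulary; the product name, price and currency are placeholders:

      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Example discontinued product</span>
        <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
          <meta itemprop="price" content="19.99">
          <meta itemprop="priceCurrency" content="GBP">
          <link itemprop="availability" href="http://schema.org/OutOfStock">
        </div>
      </div>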

  • For years the newsroom, which is on the subdomain news.davidlerner.com, has ranked #2 for their brand name search. On March 10 it fell out of the SERPs - it is completely gone. What happened? How can I fix this?

    | MeritusMedia
    0

  • Hi Moz Community, we have a client who is legitimately repurposing, or scraping, content from site A to site B. I looked into it, and Google recommends the cross-domain rel=canonical tag: http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html The issue is that it is not a one-to-one situation. In fact, site B will have several pages of content from site A all on one URL. Below is an example of what they are trying to accomplish. EX - www.siteB.com/apples-and-oranges is made up of content from www.siteA.com/apples & www.siteB.com/oranges. So with that said, are we still at risk of getting hit for duplicate content? Should we add multiple rel=canonical tags to reflect both source pages (see the sketch below)? What should our course of action be?

    | SWKurt
    0
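
    Worth noting: a page can only declare one rel=canonical target, and Google tends to ignore conflicting canonical tags, so a page stitched together from two sources cannot point at both. If one source page dominates, the tag in the <head> of the site B page would be a single line like this sketch (URL taken from the question's example):

      <link rel="canonical" href="http://www.siteA.com/apples" />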

  • My report used to crawl my entire site, which is around 90 pages, but it no longer covers all of them. Any idea why this would happen? www.treelifedesigns.com

    | nathan.marcarelli
    0

  • Greetings all, I have a question regarding "paid links." My company creates custom websites for other small businesses across the country. We always have backlinks to our primary website from our "dealer sites." Would Google and other search engines consider links from our "dealer sites" to be "paid links"? Example:
    http://www.atlanticautoinc.com/ is the "dealer site." Would Google consider the links from Atlantic Auto to be "paid links," and therefore have less of an impact on page rankings because they are not organic (see the note below)? Any insight on this matter would be greatly appreciated. Thank you!!!

    | CFSSEO
    0
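
    If the dealer-site credit links are effectively part of a paid service, Google's paid-links guidance is to make them non-PageRank-passing with rel="nofollow". A sketch with a hypothetical target URL and anchor text:

      <a href="http://www.example-primary-site.com/" rel="nofollow">Website by Example Co.</a>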

  • My site has a keyword domain, but it doesn't move up or down from the sixth page of the search results, and my main page doesn't show there either, only my alternate pages. So what can I do about this penalty? Thanks for your help

    | iddaasonuclari
    0

  • I currently have the Spanish version of my site under myurl.com/es/. When I was at Pubcon in Vegas last year, a panel reviewed my site and said the Spanish version should be in /mx/ rather than /es/, since es is for Spain only and my site is for Mexico only. Today, while trying to find information on the web, I found /es-mx/ as a possibility. I am changing my site and was planning to change to /mx/, but I want confirmation on the correct way to do this (see the hreflang sketch below). Does anyone have a link to Google documentation that will tell me for sure what to use here? The documentation I read led me to the /es/ but I cannot find that now.

    | RoxBrock
    0
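
    For what it's worth, hreflang annotations care about the language-region value rather than the folder name, so /es/, /mx/ or /es-mx/ can all work as long as the annotation itself says es-mx (Spanish for users in Mexico). A minimal sketch with placeholder URLs:

      <link rel="alternate" hreflang="es-mx" href="http://www.example.com/mx/" />
      <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />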

  • Hi Team Moz, I am working through a site cutover with an entirely new URL structure and have a bunch of pages that could not, would not, or just plain don't redirect to new pages. Steps I have taken: submitted multiple new sitemaps with the new URLs (the indexing looks solid); used Webmaster Tools to remove old URLs in the natural results that did not redirect; completely built out new PPC campaigns with the new URL structure; contacted a few major link partners. Now here is my question: I have pages that produce 404s, are linked to in forums, Slickdeals and the like, and will not be redirected. Is disavowing these links the correct thing to do?

    | mm916157
    0

  • Hi Mozzers, I was wondering what the greatest benefits of an SEO audit are, and how do you explain the need for one to a customer? Thanks for your answers, Jonathan

    | JonathanLeplang
    0

  • I've noticed that there is a domain in WMT that Google says is linking to our domain from 173 different pages, but it actually isn't linking to us at all on ANY of those pages. The site is a business directory that seems to be automatically scraping business listings and adding them to hundreds of different categories. Low quality crap that I've disavowed just in case. I have hand-checked a bunch of the pages that WMT is reporting with links to us by viewing source, but there are no links to us. I've also used crawlers to check for links, but they turn up nothing. The pages do, however, mention our brand name. I find it very odd that Google would report links to our site when there aren't actually any links to our site. Has anyone else ever noticed something like this?

    | Philip-DiPatrizio
    0

  • So, we currently have rich snippets showing for reviews on our site. We've made some new product pages that have reviews on them, but the reviews are hidden behind a tab. Because of this our rich snippets haven't been showing in the SERPs, so we've been looking for a way to get them showing for these new pages. What we've found is that we can change the rich snippets from reviews to votes, which will show an aggregate score on the page, and this will get the snippets appearing in the SERPs again as votes. What we're concerned about is that if we make this change to these new pages, it will automatically change everything on our review pages and all snippets on our sites will change from reviews to votes (not just the new pages). What we want to know is: if we make this change, do you think we may see a negative SEO impact (aside from maybe a lower CTR; see the note below)? Thanks!

    | davo23
    0
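
    For context, the "votes" option presumably corresponds to an aggregate score, i.e. schema.org's AggregateRating, and Google's rich snippet guidelines of the time asked for marked-up content to be visible on the page, which is often why reviews hidden behind a tab stop producing snippets. A minimal microdata sketch with placeholder values:

      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Example product</span>
        <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
          <span itemprop="ratingValue">4.4</span> out of
          <span itemprop="bestRating">5</span> based on
          <span itemprop="ratingCount">89</span> votes
        </div>
      </div>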

  • Hi, I have a site which has city-wise pages, and within a given city we have categories. The listed products can appear in different categories, each of which has a separate URL. The site has a different URL, meta description and title for each category. We want to rank these pages based on category as well... What is the best way to avoid duplicate content and canonical issues? Thanks,
    Darshan

    | dsingh1079
    0

  • How can I stop the duplicate page content in my Magento store from being read as duplicate? I added the Magento robots file that I have used on many stores, and it keeps giving us errors. We have also enabled canonical links in the Magento admin. I am getting 3,616 errors and can't seem to get around it... any suggestions?

    | adamxj2
    0

  • I've recently started to work on a website that has previously been targeting subdomain pages for its SEO and has some OK rankings. To better explain, let me give an example: a site is called domainname.com and has subdomains that are targeted for SEO (i.e. pageone.domainname.com, pagetwo.domainname.com, pagethree.domainname.com). The site is going through a redevelopment and can reorganise its pages onto other URLs. What would be the best way to approach this situation for SEO? Ideally, I'm tempted to recommend that new targeted pages be created - domainname.com/pageone, domainname.com/pagetwo, domainname.com/pagethree, etc. - and to perform a 301 redirect from the old pages (see the sketch below). Does a subdomain page structure (e.g. pageone.domainname.com) have any negative effects on SEO? Also, is there a good way to track rankings? I find that a lot of rank checkers don't pick up subdomains. Any tips on the best approach to take here would be appreciated. Hope I've made sense!

    | Gavo
    0
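
    A minimal sketch of the per-subdomain 301, assuming Apache and that each subdomain reads its own .htaccess; the hostnames mirror the made-up example in the question:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^pageone\.domainname\.com$ [NC]
      RewriteRule ^(.*)$ http://domainname.com/pageone/$1 [R=301,L]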

  • We're a bit confused as to why Google shows a secure https:// URL in the results for some of our pages, including our homepage, but when you click through it doesn't take you to the https:// page, just the normal unsecured page. This isn't happening for all of our results; most of our deeper content results are not showing as https://. I thought this might have something to do with Google conducting searches behind secure pages now, but this problem doesn't seem to affect other sites or our competitors. Any ideas as to why this is happening and how we can get around it?

    | amiraicaew
    0

  • The title (top link) in my organic result in Google search shows my page title, then a dash and another keyword. This keyword is not related to the search result and may discourage clicks. Where does Google get that, and how can I change it? For example: Page Title - unrelated keyword
    url
    link description

    | bhsiao
    0

  • Hi, this is a very strange problem and I'm not sure how it has happened. I am adding packages to my website, and a duplicate page with an almost identical URL is being picked up by Google. E.g. the page I make is http://www.ukgirlthing.co.uk/hen-party/bristol-spa-rty-lunch-pampering-h... but then http://www.ukgirlthing.co.uk/hen-party/bristol-spa-rty-pampering-hen-party is also appearing. The nodes are exactly the same, and if I edit one of them, the other also updates. You will notice that the URLs are almost exactly the same, except the words are re-organised slightly. Shall I just delete the URL alias of the duplicate entry, or is there something else which is making this happen? These URLs are being picked up as duplicate content, although it's the same node! Hope you can help, thank you!

    | Party_Experts
    0

  • We recently launched a new site and I felt the need to redirect all of our old site URLs to the new site URLs. Our developers told me they would only be able to do about 1,000 before it starts to bog down the site. Has anyone else come across this before (see the sketch below)? On top of that, with our new site structure, whenever our content team changes a title (which is more often than I had hoped), the URL changes. This means I'm finding I have many other redirects I need to put in place, but can't at the moment. Advice please??

    | CHECOM
    0
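
    If the server is Apache, one way to keep thousands of one-to-one redirects from bogging down the config is a RewriteMap lookup table: the rules stay at a few lines and the mappings live in a single text file. A sketch assuming access to the virtual-host config (RewriteMap cannot go in .htaccess); the map name and file paths are hypothetical:

      # in the vhost
      RewriteEngine On
      RewriteMap oldnew txt:/etc/apache2/redirect-map.txt
      RewriteCond ${oldnew:$1} !=""
      RewriteRule ^(.+)$ ${oldnew:$1} [R=301,L]

      # /etc/apache2/redirect-map.txt (old path, whitespace, new path; keys keep the leading slash)
      /old-page        /new-section/new-page
      /another-old     /new-section/another-new-page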

  • Hi, I have the following setup:
    www.example.com/nl
    www.example.com/de
    www.example.com/uk
    etc.
    www.example.com is 301'ed to www.example.com/nl, but now www.example.com is ranking instead of www.example.com/nl.
    Should I block www.example.com in robots.txt so only the subfolders are ranked?
    Or will I lose my rankings by doing this?

    | mikehenze
    0

  • Does anyone else have experience with the current update Adobe Business Catalyst has announced for their blog features? Florin at BC offered code along these lines: <meta property="og:image" content="http://www.graeagle.com/images/fb_blog_og_img.jpg" /> However, neither I nor another commenter can figure out how to make it work. I added the meta data to my template, but it seems the tags are not correct. For example, the tag {tag_blogpostmetatitle} does not automatically include the SEO title that I've called out in my individual blog post, so it appears the browser is ignoring the tag and just including it as is; when I view the source for my live blog article, the tag appears literally in the lines where I added the code. Also, I cannot get schema metadata to work on the BC blog. For example, I have used it on this page: http://www.homedestination.com/_blog/Real_Estate_Blog/post/things_to_know_before_building_a_new_home/, which yields the following in Google's Rich Snippet Tool:
    Extracted structured data rdfa-node property:
    title: {tag_blogpostmetatitle}
    description: {tag_blogpostmetadescription}

    | jessential
    0

  • We were ranking well for most of our keywords; they were on the first page, mostly in 1st and 2nd position in Google. But we mistakenly added a noindex tag to our pages, and after that the rankings fluctuated and fell in the Google search results as our pages dropped out of Google's index. We removed the noindex tag after discovering the problem, and I can now see our pages are cached and indexed in Google after submitting them to the index in Google Webmaster Tools. Can I regain my rankings for these keywords, and within how many days would I regain my previous rankings?

    | vivekrathore
    0

  • On our website recyclingbins.co.uk, the meta description of the homepage under view source is: Recycling bins offers the largest range of recycling bins for schools, homes, offices and other venues. With free delivery on everything and lowest prices guaranteed.
    But if you search for our website in Google, the meta description it shows is: Offers recycling bins for offices, schools and the home. Someone has already suggested it must be cached. I do not think this could be possible, as we are crawled fairly regularly and it has been like this for weeks and weeks. No one seems to have much idea - could you possibly shed any light? I am not concerned from an SEO perspective, but more from a click-through perspective. Thank you, Jon

    | imrubbish
    0

  • My website, www.nile-cruises-4u.co.uk, has fallen dramatically for the top industry search terms (nile cruise, nile cruises) over the last 12 months, from previous page-one rankings to page three, which has affected us very badly financially. I found, using Linkdetox, that we had thousands of backlinks with unrelated anchor text, mainly porn terms, viagra, etc. I submitted a disavow file and request about a week ago, and wondered firstly whether the enormous number of these links could have helped cause the drop to page three, and secondly whether the disavow request will eventually help the website return to better rankings? Thanks, Colin

    | GratefulFred
    0

  • Hi all, what is the best and easiest way to 301 redirect URLs on an IIS server? I have access to FTP and the WordPress back office, but no access to the server admin. Is there an easy way to create 301 redirects without having to keep bothering the tech in charge of the server (see the sketch below)? Thanks!

    | 2MSens
    0
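
    On IIS 7+ with the URL Rewrite module installed, redirects can live in a web.config file in the site root, which can be edited over plain FTP; a WordPress redirect plugin is the other common route given back-office access. A minimal sketch with a placeholder rule name and paths:

      <configuration>
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Old page to new" stopProcessing="true">
                <match url="^old-page/?$" />
                <action type="Redirect" url="/new-page/" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>
      </configuration>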

  • Hi, we have a subdomain that is appearing in the search results, and I want to hide it as it looks really bad. If I were to add the noindex tag to the subdomain's URLs, would this affect the whole domain or just that subdomain (see the sketch below)? The main domain is vitally important - it is just that subdomain I need to hide. Many thanks

    | Creditsafe
    0
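
    As long as the directive lives only in the subdomain's own configuration, it will not touch the main domain. A minimal sketch assuming Apache with mod_headers, placed in the subdomain's .htaccess or vhost so every response from that subdomain carries a noindex header:

      Header set X-Robots-Tag "noindex"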
