
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello, We have a website that was penalized by Google roughly two years ago for "Unnatural Links"... We are experiencing a lot of problems with this site, completely unrelated to the penalty or the SERPs, and we're debating doing a 301 redirect to another site we own that is totally clean and has no "Unnatural Links". If we do a 301 from the penalized site to our alternative website, will there be any cross-contamination? Will the penalty carry over to our other site? Please let me know what you guys think. Thanks

    | Prime85
    0

  • Can anyone recommend a reliable proxy service (paid or otherwise) to tunnel scraping requests through? I've been using free proxies, which are sometimes slow, time out, or just refuse connections.

    | AlexThomas
    0

  • Hi, This is sort of a duplicate content issue, but not quite. I'm concerned with the way our code is written and whether or not it can cause problems in the future. On many of our pages (thousands), our users will have the option to post comments. We have a link which opens a JavaScript pop-up with our comments guidelines. It's a 480 word document of original text, but it's preloaded in the source code of every page it appears on. The content on these pages will be relatively thin immediately, and many will have thin content throughout. I'm afraid so many of our pages look the same in both code and on-site content that we'll have issues down the line. Admittedly, I've never dealt with this issue before, so I'm curious. Is having a 480 word piece of text in the source code on so many pages an issue, or will Google consider it part of the template, similar to footer/sidebar/headers? If it's an issue, we can easily make it an actual pop-up hosted on a SINGLE page, but I'm curious if it's a problem. Thanks!

    | kirmeliux
    0

  • Hey everyone! This is actually the first time I've ever posted a question here on Moz! Guess I was (and still am) embarrassed about being an SEO noob! That being said, I really have to get some input on this matter, and I was wondering if you guys might be able to help. I'm optimizing a page for a wedding venue in Portugal. Currently, according to Google Trends, the plural - "venues for weddings" - scores considerably better than the singular, "venue for weddings" (this was researched in Portuguese terms, of course). Despite this, I'm leaning towards optimizing for the singular term, because the plural seems too unnatural to fit in the content or title. I managed to fit the plural into the description, but I've read that it hasn't influenced rank directly for a while. Currently my title tag reads: Venue for Weddings | Name of the Venue. I really can't find any way that the plural makes sense to me... and I feel like if I were a user, I would rather click on the singular term because it just makes a lot more sense. But my opinion is most probably biased by the fact that I understand that using the plural term would be solely an SEO effort to rank higher for a term that has more average searches per month. My question is: in the current state of search algorithms, will an optimization for the singular term still get me some rank on the plural key phrase? Let me know what you think about this, please, and thank you in advance for your time. Most Respectfully, Martim Coutinho dos Santos

    | martim_santos
    0

  • Hello Moz community, I have a question: what is the difference between a local version of Google vs. the default Google in regards to search results? I have a Mexican site that I'm trying to rank in www.google.com.mx, but my rankings are actually better if I check my keywords on www.google.com The domain is a .mx site, so wouldn't it make more sense that this page would rank higher on google.com.mx instead of the default Google site, which in theory would mean a "broader" scope? Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one? Thanks for your valuable input!

    | EduardoRuiz
    0

  • I am not getting accurate views in OSE. It's still saying I have links from domains which I do not. I want an accurate anchor text distribution, but OSE is not providing me with one. Does anyone know where I can get this kind of info? I waited for an OSE update, but I personally find it inaccurate, as it is still telling me links are there when they are not.

    | pauledwards
    0

  • Hello guys, Do you think it is worth disavowing scrapers' links on an otherwise good link profile? Webmaster Tools shows that sites like mrwhatis.net, askives.com, prlog.ru, bigbozz.com, answerparty.com, and wordexplorer.com scrape our content and have generated from 10 to 90 links to our pages. Is removing these links a waste of my time? Thanks

    | SirMax
    0

  • Hi Moz Community, If I have two different sub-category pages: http://www.example.com/rings/anniversary-rings/
    http://www.example.com/wedding/anniversary-rings/ And the first one is ranking for all KWs, should I add a rel=canonical to the second URL or leave it since it's slightly different? Or should I try and create different unique content for the second URL? Everything in terms of content is the same on both these pages except for the URLs, which aren't that different to begin with. Thanks for your help! -Reed

    | IceIcebaby
    0

  • Hi On our blog, we have a section called 'Tags'. I have just noticed that these links are all "nofollow" links. The tags section does appear on every single page of the blog - is it recommended to have them as 'nofollow' links, or should I get our developer to change them? Thanks

    | Andy-Halliday
    0

  • I'm building a resource section that will hopefully attract a lot of external links, but the problem is that the main index page will carry a large number of links (around 150 internal links - 120 pointing to resource sub-pages and 30 being the site's navigational links), so it will dilute the passed link juice and possibly waste some of it. Those 120 sub-pages will each contain about 50-100 external links and 30 internal navigational links. To better visualise the matter, think of this resource as a collection of hundreds of blogs categorised by domain on the index page. The question is how to build the primary page (the one with 150 links) so it will pass the most link juice to the site - or do you think this is OK and I shouldn't be worried about it (I know there used to be a roughly 100-links-per-page limit)? Any ideas? Many thanks

    | flo2
    0

  • Hi We have an ecommerce site. We just added a quickview tab to the product list view - when you roll the mouse over it, a popup window with the product image and a short description shows up. Is this a duplicate content problem (it's the same content that's on the product pages, except there we also have a long detailed description)? It is done with JavaScript. Thanks!

    | henya
    0

  • Howdy Mozers, We have our main website on a .com domain and local websites for each language, like .es, .fr, .in, etc. We decided to move all the local sites under the main .com domain, using subdirectories on the gTLD. One of the local sites has a manual penalty. Right now we are redirecting the local site which has the penalty using a 302 redirect. So my question is: will the 302 redirect hurt our main site? Is there any other way to redirect visitors from the local site without passing the penalty? We have a few thousand monthly users who still use local domain links to get to our site, so we can't remove the redirect at all. Best Regards,
    Juris

    | juris_l
    0

  • Hey I've got a question about linking within a div box. Say you have a title, an image and a short paragraph within a div box all referring to an internal page on your site. Should you link the whole div box to the internal page, or link the items within it individually e.g. the title, image and text individually? Thanks in advance

    | Kerry_Jones
    0

  • Good afternoon, smart dudes : ) I am here to ask for your help. I posted this question on the Google help forum and Stack Overflow, but it looks like people do not know the correct answer... QUESTION: We used to have a secured site, but we recently purchased separate reservation software that provides SSL (it takes clients to a separate secured website where they can fill out the reservation form). We cancelled our SSL (we just think it's a waste to pay $100 for securing plain text). Now I have so many links pointing to our secured site and I have no idea how to fix it! How do I redirect https://www.mysite.com to http://www.mysite.com? I would also like to mention that I already have a redirect from the non-www to the www domain (not sure if that matters):
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
    RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
    As I already mentioned... we do not have SSL!!!! None of the 301 redirect codes I found online work (you have to have SSL for the site to be redirected from https to http | currently I get an error - "can't establish a secured connection to the server"). Is there anything I can do???? Or do I have to purchase SSL again?

    | JennaD14
    0
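For reference on the question above, the rule the poster is looking for is a standard mod_rewrite sketch (mysite.com is the poster's placeholder). The catch is exactly what the poster has already hit: the 301 is delivered over the TLS connection itself, so the server must still present a valid certificate to complete the handshake before any https request - including a redirect - can be answered.

```apache
# Sketch only: this works ONLY while a valid certificate is installed,
# because the 301 response travels over the TLS connection itself.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

A cheap or free certificate kept in place solely to serve this redirect is the usual way out of this situation; without one, browsers show a connection error before the redirect is ever seen.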

  • Hi guys, maybe you can help me to understand better: on 17.04 I had 7600 pages indexed in Google (WMT showing 6113). I added Disallow: /account/ to the robots.txt file - that section contains the registration page, wishlist, etc., and I'm not interested in ranking with a registration form. On 23.04 I had 6980 pages indexed in Google (WMT showing 5985). I understand that this way I'm telling Google I don't want that section indexed, but why so many pages? Because of the faceted navigation? Cheers

    | catalinmoraru
    0

  • Hi, I have a website that has an old version and a new version. The content is not duplicated across the different versions.
    The point is that the old version uses www. and non-www before the domain, and the new one uses www2. My question is: is that a problem, and what should be done? Thank you in advance!

    | TihomirPetrov
    0

  • Hey MOZ fans, I have an exciting question for you today. http://i.imgur.com/dl0r9s1.png
    I'll try to visualize it, but let me explain too. In the PageRank algorithm, as you know, PageRank flows to the links whether they are internal or external. And the link juice that passes can be found by taking the PageRank of the page, times 85%, divided by the total outlinks (whether they have the nofollow attribute or not).
    Everything is okay so far. But what would happen in the situation shown in the image above? Does it become a loop, with each of the pages raising its PageRank to 10? Thanks for your help.

    | atakala
    0
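A note on the loop scenario in the question above: under the standard damped PageRank model, two pages linking to each other cannot pump each other's score up without bound, because every pass through the loop is multiplied by the damping factor d, so the recycled rank forms a convergent geometric series. The usual formulation (B_p is the set of pages linking to p, L(q) the number of outlinks on page q, N the total number of pages):

```latex
PR(p) = \frac{1-d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}, \qquad d \approx 0.85
```

For two pages that link only to each other, iterating this converges to a finite fixed point (each page holds an equal share of rank in the normalized model), not to a runaway PageRank of 10.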

  • Google mysteriously decided to index the broken, https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https and the CSS doesn't load). The page already has many incoming links to the http version, and it has a canonical URL with http.  I resubmitted it on http with webmaster tools.  Is there anything else I could do?

    | BostonWright
    0

  • One of my client sites lists millions of products, and hundreds or thousands are de-listed from inventory each month and removed from the site (no longer for sale). What is the best way to handle these pages/URLs from an SEO perspective? There is no place to use a 301. 1. Should we implement 404s for each one and put up with the growing number of 'pages not found' shown in Webmaster Tools? 2. Should we add them to the robots.txt file? 3. Should we add 'nofollow' to all these pages? Or is there a better solution? Would love some help with this!

    | CuriousCatDigital
    0
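On the question above: a 410 Gone is often suggested for permanently de-listed product URLs, since it signals removal more explicitly than a 404 (robots.txt only blocks crawling, and nofollow does not deindex anything). A minimal sketch, assuming an Apache host and using hypothetical paths:

```apache
# Hypothetical paths - serve 410 Gone for permanently removed products.
# mod_alias form, one line per delisted URL:
Redirect gone /products/acme-widget-123

# Or the mod_rewrite form, e.g. for a whole retired section:
RewriteEngine On
RewriteRule ^products/discontinued/ - [G,L]
```

The 'pages not found' report in Webmaster Tools is informational; a growing count for genuinely removed pages does not in itself harm the rest of the site.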

  • Hey y'all, I know this question has been asked many times before but I wanted to see what your stance was on this particular case. The organisation I work for is a group of 12 companies - each with its own website. On some of the sites we have a link to the other sites within the group on every single page of that site. Our organic search traffic has dropped a bit but not significantly and we haven't received any manual penalties from Google. It's also worth mentioning that the referral traffic for these sites from the other sites I control is quite good and the bounce rate is extremely low. If you were in my shoes would you remove the links, put a nofollow tag on the links or leave the links as they are? Thanks guys 🙂

    | AAttias
    0

  • We are planning on redesigning our homepage and are thinking of moving the location of our blog. Currently it is part of the main menu, with a "Blog and News" tab. Links to the top news articles are also displayed below the fold. I checked with Google In-Page Analytics: the news articles' main link gets 0.1% of the clicks, and the Blog & News tab doesn't get any clicks. The Marketing VP wants to move Blog & News to a link below the fold... which seems like it will send a message to Google that we don't care about it, and give it even less traction than it currently has in terms of visitors. Any suggestions of what we should do with it?

    | theLotter
    0

  • Hi I have an ecommerce client who has all their images cloud-hosted (Amazon CDN) to speed up the site. Somehow - maybe because they pinned the images on Pinterest - the CDN got indexed, and there now seems to be about 50% of the site duplicated (about 2500 pages, e.g.: http://d2rf6flfy1l.cloudfront.net..). Is this a duplicate content problem? How come Moz doesn't show it up as crawl errors? Why is this not a problem that loads of people have? I only found a couple of mentions of such a problem when I googled it... any suggestion will be gratefully received!

    | henya
    0

  • Our website supports a unique business industry where our users will come to us to look for something very specific (a very specific product name) to find out where they can get it.  The problem that we're facing is that the products are constantly changing due to the industry. So, for example, one month, one product might be found on our website, and the next, it might be removed completely... and then might come back again a couple months later.  All things that are completely out of our control - and we have no way of receiving any sort of warning when these things might happen. Because of this, we're seeing a lot of duplicate content issues arise... For Example... Product A is not active today... so   www.mysite.com/search/productA  will return no results... Product B is also not active today... so  www.mysite.com/search/productB will also return no results. As per Moz Analytics, these are showing up as duplicate content because both pages indicate "No results were found for {your searched term}." Unfortunately, it's a bit difficult to return a 204 in these situations (which I don't know if a 204 would help anyway) or a 404, because, for a faster user experience, we simultaneously render different sections of the page... so in the very beginning of the page load - we start rendering the faster content (template type of content) that says "returning 200 code, we got the query successfully & we're loading the page".. the unique content results finish loading last since they take the longest. I'm still very new to the SEO world, so would greatly appreciate any ideas or suggestions that might help with this... I'm stuck. 😛 Thanks in advance!

    | SFMoz
    0

  • Hey Moz Community! On the CMS in question, the sitemap and robots file are locked down - they can't be edited or modified whatsoever. If I noindex a page via the meta robots tag, but it is still on the XML sitemap... will it get indexed? Thoughts, comments and experience greatly appreciated and welcome.

    | paul-bold
    0

  • Hi All, I found one other discussion about the subject of PDFs and the passing of PageRank here: http://moz.com/community/q/will-a-pdf-pass-pagerank - but that thread didn't answer my question, so I am posting it here. This PDF: http://www.ccisolutions.com/jsp/pdf/YAM-EMX_SERIES.PDF is reported by GWT to have 38 links coming from 8 unique domains. I checked the domains and some of them are high-quality relevant sites. Here's the list: Domains and Number of Links
    prodiscjockeyequipment.com 9
    decaturilmetalbuildings.com 9
    timberlinesteelbuildings.com 6
    jaymixer.com 4
    panelsteelbuilding.com 4
    steelbuildingsguide.net 3
    freedocumentsearch.com 2
    freedocument.net 1
    However, when I plug the URL for this PDF into OSE, it reports no links and a Page Authority of only "1". This is not a new page - it's a really old page. On top of that, when I check the PageRank of this URL, the PageRank is "nil" - not even "0". I'm currently working on adding links back to our main site from within our PDFs, but I'm not sure how worthwhile this is if the PDFs aren't being allocated any authority from the pages already linking to them. Thoughts? Comments? Suggestions? Thanks all!

    | danatanseo
    0

  • I understand that when migrating to a new site, even if done perfectly (page-level 301s etc.), rankings will drop in the short term, and each site will be impacted differently. I picked up the following comment and wanted to get a few experts' thoughts on whether I can quote this to my client: "In our experience, even when 301s are correctly executed, we see a short-term fallback (7-30 days), then about a 90% carry-through for about 90 days, and then back to full strength."

    | steermoz8
    0

  • Hi All This follows on from a previous question (http://moz.com/community/q/how-to-fix-google-index-after-fixing-site-infected-with-malware) that on further investigation has become a much broader problem. I think this is an issue that may plague many sites following upgrades from CMS systems. First, a little history. A new customer wanted to improve their site ranking and SEO. We discovered the site was running an old version of Joomla and had been hacked. URLs such as http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate redirected users to other sites, and the site was ranking for "buy adobe" and "buy microsoft". There was no notification in Webmaster Tools that the site had been hacked. So an upgrade to a later version of Joomla was required, and we implemented SEF URLs at the same time. This fixed the hacking problem; we now had SEF URLs, fixed a lot of duplicate content, and added new titles and descriptions. The problem is that after a couple of months things aren't really improving. The site is still ranking for adobe and microsoft and a lot of other rubbish, and URLs like http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate are still sending visitors - but to the home page, as are a lot of the old redundant URLs with parameters in them. I think it is default behavior for a lot of CMS systems to ignore parameters they don't recognise, so http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate displays the home page and gives a 200 response code. My theory is that Google isn't removing these pages from the index because it's getting a 200 response code from the old URLs, and is possibly penalizing the site for duplicate content (which doesn't show up in Moz because there aren't any links on the site to these URLs). The index in Webmaster Tools is showing over 1000 URLs indexed when there are only around 300 actual URLs. It also shows thousands of URLs for each parameter type, most of which aren't used.
So my question is how to fix this. I don't think 404s or similar are the answer, because there are so many, and trying to find each combination of parameters would be impossible. Webmaster Tools advises not to make changes to parameters, but even so I don't think resetting or editing them individually is going to remove them - it will only change how Google indexes them (if anyone knows different, please let me know). I'd appreciate any assistance and also any comments or discussion on this matter. Regards, Ian

    | iragless
    0

  • Hi All I upgraded a Joomla site for a customer a couple of months ago that was infected with malware (it wasn't flagged as infected by Google). The site is fine now, but I'm still noticing search queries for "cheap adobe" etc. with links to http://domain.com/index.php?vc=201&Cheap_Adobe_Acrobat_xi in Webmaster Tools (about 50 in total). These URLs redirect back to the home page and seem to be remaining in the index (I think Joomla is doing this automatically). Firstly, what sort of effect would these be having on their rankings? Would they be seen by Google as duplicate content for the homepage (Moz doesn't report them as such, as there are no internal links)? Secondly, what's my best plan of attack to fix them? Should I set up 404s for them and then submit them to Google? Will resubmitting the site to the index fix things? I would appreciate any advice or suggestions on the ramifications of this and how I should fix it. Regards, Ian

    | iragless
    0

  • Page A has written content, pictures, and videos. The written content from Page A is being moved to Page B. When Google crawls the pages next time around, will Page B receive the content credit? Will there be any issues because this content originally belonged to Page A? Page A is not a page I want to rank for (it just has great pictures and videos for users). Can I 301 redirect from Page A to B since the written content from A has been deleted, or is there no need? Again, I intend to keep Page A live because it has good value for users who want to see the pictures and videos.

    | khi5
    0

  • My URL is: www.ebuzznet.com. Today when I checked Webmaster Tools under the manual spam section, I got a manual action, and the reason was "Thin content with little or no added value". Then I checked the other sites in the same Webmaster Tools account - there are 11 sites, and all of them received the same manual action. I never received any mail, and there was no notification in the site messages section regarding this manual action. I just need confirmation whether this is an error in Webmaster Tools or whether all of the sites really received manual spam actions. Most of the articles on the sites are above 500 words of quality content (not spun or copied). Looking for suggestions and answers.

    | ndroidgalaxy
    0

  • Hello All, I am the webmaster of http://www.bannerbuzz.com and I have a keyword cannibalization problem in my store. I have lots of categories related to banners; in the banner section my main keyword is "vinyl banners", and all my category URL structures are connected with vinyl banners. I am definitely sure there is a keyword cannibalization issue on my website and I want to resolve it as soon as possible, so can anyone please help me resolve this issue effectively, without affecting my keyword rankings? My keywords: Vinyl Banners: http://www.bannerbuzz.com/full-color-vinyl-banners.html
    Custom Banners: http://www.bannerbuzz.com/custom-vinyl-banners.html
    Outdoor Banners: http://www.bannerbuzz.com/outdoor-vinyl-banners.html My main keyword "vinyl banners" is affected, so can anyone please look at these pages and let me know how I can resolve the keyword cannibalization on my website? Thanks
    BannerBuzz.com

    | CommercePundit
    0

  • Hey guys,
    I'm working with a client that has a page which has many internal links to the same page.
    Let me illustrate it.
    So as you can see, I have a page which is called "page" in the image :D.
    As you can see, the page has many links to solutions.html's anchor links, which means they are basically the same page (solutions.html).
    Is it going to be a problem for us to do that?
    And is there any way to handle this problem?
    Thank you for your patience. And sorry for my bad English 😄

    | atakala
    0

  • Hey Moz'erz, I'm removing some URLs from the index and want to confirm the use of the "remove directory" request. If my structure is this: /blogs/customer-success-stories/tagged/ - all pages that are /tagged/abc, /tagged/dce etc. will be removed, correct? This is my first time trying a directory removal, as there are 100-plus of these tagged pages. Comments, suggestions and past experiences welcome!

    | paul-bold
    0

  • Hello, I have taken over the management of a site which has a big problem with duplicate content. The duplicate content is caused by two things: upper- and lower-case URLs, e.g. www.mysite.com/blog and www.mysite.com/Blog, and the use of product filters/pagination, which means you can get to the same 'page' via different filters. The filters generate separate URLs: http://www.mysite.com/casestudy
    http://www.mysite.com/casestudy/filter?page=1
    http://www.mysite.com/casestudy/filter?solution=0&page=1
    http://www.mysite.com/casestudy?page=1
    http://www.cpio.co.uk/casestudy/filter?solution=0" Am I right to assume that for the case-sensitive URLs I should use a 301 redirect, because I only want the lower-case page to be shown? For the issue with dynamic URLs, should we implement a mod_rewrite and 301 to one page? Any advice would be greatly appreciated.
    Mat

    | Barques-Design
    0
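On the case-sensitivity half of the question above: yes, a 301 to a single canonical (lower-case) form is the usual fix. If the site runs on Apache, one common sketch uses RewriteMap, which is only permitted in the server or virtual-host config, not in .htaccess:

```apache
# Server/vhost config only: RewriteMap cannot be declared in .htaccess.
RewriteMap lc int:tolower
RewriteEngine On
# 301 any path containing an upper-case letter to its lower-case form.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ /${lc:$1} [R=301,L]
```

For the filter/pagination URLs, a rel=canonical on each filtered variant pointing at the base page is usually the lower-risk option, since a blanket 301 on query strings can break filters users actually need.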

  • A friend I'm working with at RedChairMarket.com is having duplicate page issues. Among them, both www and non-www URLs are being generated automatically by his software framework, ASP.net mvc 3. How should we go about finding and tackling these duplicates? Thanks!

    | BrittanyHighland
    0

  • I am running a site through Screaming Frog, and many of the pages under "Content" read text/html; charset=UTF-8. Does this harm one's SEO, and what does it really mean? I'm running this site alongside his competitor's, and the competitor's seems very clean, with content pages reading text/html. What does one do to change this if it is a negative thing? Thank you

    | seoessentials
    0

  • Any thoughts on how schema markup should be properly implemented for service-based businesses? Let's say a plumber in Washington has a few locations. Obviously you would use schema markup on their physical location information, but what about service areas? Are there ramifications to using the markup for service areas? It seems like you could potentially confuse the search engines. We are noticing a competitor use this on a newly developed website. We haven't seen any improvement in their rankings per se, but it may be a bit early to tell.

    | AaronHenry
    0

  • We are working with a client who is changing the direction of the company's marketing efforts. The current site includes many (approx 100) pages for each partner they work with (each partner has its own page). The new site will be losing many of these and we want to be sure we don't destroy organic traffic/rankings in the process. These landing pages don't directly garner the most traffic but it will definitely be a big change in the size of the site. Any advice for how to best handle the redesign is appreciated, thanks!

    | KMofOutlier
    0

  • Hi all, easy question: I have a client URL...example.com/giftbags that has been indexed for a while. Should I change the URL to example.com/gift-bags to separate these words for better KW ranking, or would the change be useless at this point? Thanks, -Reed

    | IceIcebaby
    0

  • If you view source you will see my actual title tag: http://www.sqlsentry.com/products/performance-advisor/sql-server-performance. The Google results are showing a completely different title tag that we do not want used. This was working properly, and I'm not understanding how it got changed. Thanks!

    | Sika22
    0

  • Inside my website, I use rel="canonical", but not in the <head> - I use it in a hyperlink instead. Now it is not clear to me whether that works; I've seen different stories about it on the internet. My example link below: Bruiloft

    | NECAnGeL
    0

  • Hi Mozzers, We have a website that has both http as well as https indexed. I proposed the solution of implementing a canonical link tag on all pages (including the login/secure ones). Any disadvantages I could expect? Thanks!

    | DeptAgency
    0

  • I redesigned my website, moving from Wix to HTML. The URLs changed from http://www.spinteedubai.com/#!how-it-works/c46c to http://www.spinteedubai.com/how-it-works.html, and the same for all other pages. How can I fix this issue? Both pages were also indexed in Google.

    | AlexanderWhite
    0

  • Hi Has anyone found WordPress Backup Buddy causing a problem with SEO. I understand why it does it, but wondered if anyone experienced issues with this? Only sometimes it adds /?doing_wp_cron=****** on to the end of a URL Thanks Tom

    | TomPryor83
    1

  • I'm currently trying to disavow toxic links that I have found pointing to my site, which our previous SEO company created. Google requires that we reach out to the individual websites and try to have the links removed. Does anyone know of software that makes this process automated or easier? I'm currently doing it manually, uhg! Also, is there software that can help you find toxic links? I'm currently doing that manually too, uhg! Thanks.

    | milehigh5280
    0

  • Hello, I have a website that is 1 year old, and when I started and created the pages, they had keyword stuffing (20-30-40 of the same word in the meta descriptions and text - sometimes 15, sometimes 20, and sometimes 40). Since I noticed this (about 4 months ago), I have been changing it, making new pages with 5-10 of the same keyword. Some pages with many keywords (20-30-40) work very well and I would not want to lose their positions in Google, but I don't want to be penalized either. So my question is: should I change the old pages with keyword stuffing, or leave them? Thanks so much.

    | pompero99
    0

  • I have multiple colors of the same product, but as a result I'm getting duplicate content warnings. I want to keep these all different products with their own pages, so that the color can be easily identified by browsing the category page. Any suggestions?

    | bobjohn1
    0

  • Hi All, Firstly thank you for taking the time to look. My dilemma is as follows: I have a site on WordPress to which I have added an SSL certificate, and the entire domain is secure. The site has a mix of content, including a blog area and product pages. My question is: what does Google prefer, http or https, or does it not matter? As I see it, my options are to keep the entire site as https and enforce this site-wide, so all non-secure content redirects to the https version, or to enforce https just on the cart and/or product pages, with all other content - homepage, blog, about us, contact us, etc. - served over http. From an SEO perspective, i.e. Google search, is there a best way to proceed? Finally, as I currently have both http and https displaying (i.e. duplicates), what would be the way to fix this? I have the Yoast plugin, so I can set the canonical there, and I can also edit my robots.txt. I have come across this resource (http://www.creare.co.uk/http-vs-https-duplicate-content) and am wondering if this guideline is still correct, or is there another, more current way? If so I would be grateful if you could point me in the right direction. Thanks in advance.

    | Renford_Nelson
    0
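On the http-vs-https half of the question above: since the certificate already covers the whole domain, the usual approach is to pick one scheme site-wide and 301 everything to it, which resolves the duplicate-content issue in the same step. A minimal .htaccess sketch for forcing https (hedged: the exact rules depend on the host and on any existing WordPress rewrites):

```apache
# Sketch: 301 every http request to its https equivalent, site-wide.
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With the redirect in place, the canonical set via Yoast should point at the https URLs so the two signals agree.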

  • I have completely rewritten my web site, adding structure to the file directories. I subsequently added redirect information in the .htaccess file. For example, this rule...
    Redirect 301 /armaflex.html http://www.just-insulation.com/002-brands/armaflex.html
    returns this response in the URL bar:
    http://www.just-insulation.com/002-brands/armaflex.html?file=armaflex
    I am at a loss to understand why the suffix "?file=armaflex" is added. The following code is inserted at the top of the file...
    RewriteEngine On
    # Redirect html pages to the root domain
    RewriteRule ^index.html$ / [NC,R,L]
    # Force www. prefix in URLs and redirect non-www to www
    RewriteCond %{http_host} ^just-insulation.com [NC]
    RewriteRule ^(.*)$ http://www.just-insulation.com/ [R=301,NC]
    Any advice would be most welcome.

    | Cyberace
    0
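Two things stand out in the snippet above, offered as a hedged sketch rather than a definitive diagnosis. First, mixing mod_alias (Redirect) with mod_rewrite rules on the same URLs is a classic source of stray query strings like "?file=armaflex", because both modules process the request independently; keeping everything in mod_rewrite avoids the interaction. Second, the www rule drops the requested path ($1 is missing from the target). A corrected version might look like:

```apache
RewriteEngine On

# Map old flat pages into the new directory structure - mod_rewrite
# instead of mod_alias "Redirect", so only one module rewrites the URL.
RewriteRule ^armaflex\.html$ /002-brands/armaflex.html [R=301,L]

# Send index.html to the root.
RewriteRule ^index\.html$ / [NC,R=301,L]

# Force the www prefix, preserving the requested path.
RewriteCond %{HTTP_HOST} ^just-insulation\.com$ [NC]
RewriteRule ^(.*)$ http://www.just-insulation.com/$1 [R=301,L]
```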

  • I just put my site into Moz Analytics. The crawl results say I have duplicate content. When I look at the pages, it is because one page is www.xyz.com and the duplicate is xyz.com. What causes this, and how can it be fixed? I'm not a developer, so be kind and speak a language I can understand. Thanks for your help 🙂

    | Britewave
    0
