
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • My first sports site is ready: www.sbat.com/tips and www.sbat.com. Content is going in slowly, and I've signed up for a pro membership on here. Looking to use this site to build up my SEO knowledge. We go live tomorrow, but any early thoughts would be great 🙂 Will be in and around here a lot trying to learn from the gurus. Gary

    | gillies8888
    0

  • I am analyzing a site with several thousand pages, checking the headers, meta tags, and other on-page factors. I noticed that the spider tool on SEO Book (http://tools.seobook.com/general/spider-test) does not seem to recognize the meta tags on various pages. However, other tools, including Moz, do recognize them. Normally I wouldn't be too concerned about why one tool isn't picking up the tags, but the site suffered a large traffic loss and we're still trying to figure out what remaining issues need to be addressed. Also, many of those pages once ranked in Google and now cannot be found unless you do a site: search. Is it possible that something is blocking the tags so that some tools or crawlers can read them easily while others cannot? That would seem very strange to me, but it's what I've witnessed recently. Your suggestions and feedback are appreciated, especially as this site continues to battle Panda.

    | ABK717
    0

  • Hi Mozzers, We experienced a huge dip in traffic on Thursday, 8/14, across our entire site. It was not a specific set of pages; it was sitewide. Google Webmaster Tools shows our impressions are down as well. The traffic has not recovered. Our pages appear to still be indexed in Google, just not ranking well. Here are some questions to help isolate the cause: We completed a major redesign of our entire website on 7/26. We did not notice any dip in traffic after the redesign launch - in fact, traffic actually increased a bit. Is it possible that Google is only now seeing our new site design and this is the reason for our dip? Is there a way to see Google's past cache dates? Did anyone else experience a similar dip in traffic since Thursday? Was there a recent Google update? It would be much appreciated if someone took a look at our site - www.consumerbase.com - for any glaring SEO errors (missing necessary meta tags, etc.). What steps do you suggest I take to isolate the cause of this dip in traffic? Thanks!

    | Travis-W
    0

  • On a .NET site, a URL rewrite was done about 2 years ago. From a visitor's perspective it seems fine, as the URLs look clean. But Webmaster Tools reports 500 errors from time to time showing /modules/categories... and /modules/products..., which are templates and how the original URLs were structured. While the developer made the URLs look clean, I am concerned that he could have set it up incorrectly. He acknowledged that IIS 7 on a Windows server allows URL rewrites to be set up, but the site was done another way that forces the URLs to change to their product name, so he has believed it to be okay. However, the site dropped significantly in ranking in July 2013, which appears to be a Panda penalty. In trying to figure out if this could be a factor in why the site has suffered, I would like other webmasters' opinions. We have already killed many pages, removed 2/3 of the index that Google had, and are trying to understand what else it could be. Also, in doing a header check, I see that the /modules/products... page returns a 301 status. I assume that this is okay, but wanted to see what others had to say about it. When I look at the source code of a product page, I see a reference to /modules/products... I'm not sure if any of this pertains, but wanted to mention it in case you have insight. I hope to get good feedback and direction from SEOs and technical folks.

    | ABK717
    0

  • I have an ecommerce site that is not ranking well currently. It has about 1,000 pages indexed in Google but very few appear to be ranking. I normally find issues in Webmaster Tools HTML Improvements, but for some reason it does not see a problem with the site. There are problems, trust me. Moz shows many issues; Google, nothing! There is a problem somewhere but I am not seeing it. Why is HTML Improvements blank and the site not ranking? Am I in the dreaded sandbox? Any ideas? Sean. This is what Google reports: "We didn't detect any content issues with your site. As we crawl your site, we check it to detect any potential issues with content on your pages, including duplicate, missing, or problematic title tags or meta descriptions. These issues won't prevent your site from appearing in Google search results, but paying attention to them can provide Google with more information and even help drive traffic to your site. For example, title and meta description text can appear in search results, and useful, descriptive text is more likely to be clicked on by users."

    | optin
    0

  • I am always trying to optimize our website and have come across adding images to the sitemap. Has anyone done this? Did it make a big difference?

    | EcommerceSite
    0
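
Image sitemaps extend the standard sitemap protocol with an image namespace. A minimal sketch of generating one with Python's standard library (the URLs are placeholders, not from the question):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: iterable of (page_url, [image_urls]) pairs."""
    ET.register_namespace("", SITEMAP_NS)       # default namespace -> bare <url>/<loc>
    ET.register_namespace("image", IMAGE_NS)    # prefixed -> <image:image>/<image:loc>
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page_url, image_urls in pages:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = page_url
        for img in image_urls:
            image = ET.SubElement(url, "{%s}image" % IMAGE_NS)
            ET.SubElement(image, "{%s}loc" % IMAGE_NS).text = img
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_image_sitemap([
    ("https://www.example.com/product", ["https://www.example.com/img/product.jpg"]),
])
```

The result is submitted through Webmaster Tools like any other sitemap; Google documents a limit of 1,000 images per page entry.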

  • Hi Mozzers I have a client that had a lot of soft 404s that we wanted to tidy up. Basically everything was going to the homepage. I recommended they implement proper 404s with a custom 404 page, and 301 any that really should be redirected to another page. What they have actually done is implemented a 404 (without the custom 404 page) and then after a short delay 301 redirected to the homepage. I understand why they want to do this as they don't want to lose the traffic, but is this a problem with SEO and the index? Or will Google treat as a hard 404 anyway? Many thanks

    | Chammy
    0
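
The recommended behaviour in the question (honest 404s for gone pages, 301s only where a genuine replacement exists, no blanket redirect to the homepage) can be sketched as a routing rule. The paths and mappings here are hypothetical:

```python
def respond(path, moved, live_paths):
    """Return the (status, location) pair the server should send.

    moved: dict of retired paths that have a genuine replacement (301).
    live_paths: paths that still serve content (200).
    Everything else gets a plain 404 (with the custom 404 page) --
    no delayed redirect to the homepage.
    """
    if path in live_paths:
        return (200, None)
    if path in moved:
        return (301, moved[path])
    return (404, None)

# hypothetical site structure
moved = {"/old-widgets": "/widgets"}
live = {"/", "/widgets"}
```

Google generally treats mass redirects to the homepage as soft 404s anyway, so the delayed-redirect pattern buys little.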

  • Howdy Mozzers, Trying to be proactive for a keyword that is going to be popular in the future. The keyword is "FIFA 15 tips" and the page is http://orangeoctop.us/fifa15-tips/. I have tried to model the page after Moz's category pages. Since the game has not yet been released, it is difficult to have tips for it. When the FIFA 15 game does come out, we will have tips coming in from the best gamers in the world (thought leaders); however, there will be so much other media attention that we will get pushed even further down. Any help would be greatly appreciated. Looking for some outside-the-box ideas for FIFA 15 tips.

    | orangeoctop.us
    0

  • My XML sitemap works properly in GWT, but when I run a search in Google for "site:example.com/sitemap.xml" it does not show. However, my XML image sitemap shows when I run the same search in Google. Is this potentially an issue on my end, and is there a solution?

    | khi5
    0

  • I'm currently doing some work on a website: http://www.abetterdriveway.com.au. Upon starting, I detected a lot of spammy links going to this website and sought to remove them before submitting a disavow report. A few months later, this site completely disappeared in the rankings, with all keywords suddenly not ranked. I realised that the test website (which was put up to view before the new site went live) was still up on another URL and Google was suddenly ranking that site instead. Hence, I ensured that test site was completely removed. 3 weeks later, however, the site (www.abetterdriveway.com.au) still remains unranked for its keywords. Upon checking Webmaster Tools, I cannot see anything that stands out. There is no manual action or crawling issue that I can detect. Would anyone know the reason for this persistent disappearance? Is it something I will just have to wait out until ranking results come back, or is there something I am missing? Help here would be much appreciated.

    | Gavo
    0

  • My company is redoing our homepage, and there will be 4 links to our main play pages (5 games): 2 in the menu and 2 within the content. I was thinking I should nofollow one of the links on the homepage + 1 in the menu so that we don't have link dilution from having multiple internal links to the same destination within 1 page. Does this make sense? Any downside to this, or suggestions for a solution that may be more effective? Thanks!

    | theLotter
    0

  • Hi, We're running a Magento store which doesn't have too much stock rotation. We've implemented a plugin that allows us to give products custom canonical URLs (basically including the category slug, which is not possible through vanilla Magento). The sitemap feature doesn't pick up on these URLs, so we're submitting URLs to Google that are available and will serve content, but actually point to a longer URL via a canonical meta tag. The content is available at each URL and is near identical (apart from the breadcrumbs). All instances of the page point to the same canonical URL. We are using the longer URL in our internal architecture/link building to show this preference. My questions are: Will this harm our visibility? Aside from editing the sitemap, are there any other signals we could give Google? Thanks

    | tomcraig86
    0

  • Hello, We are in the midst of a major replatforming of our current website; the process will take roughly six to nine more months to complete. We are completely revamping our site - the new site will be on the same domain, but almost everything is changing: the category structure, hierarchy, architecture, the separate URLs for different regions (which will be replaced by a single site with a currency converter), the URLs themselves - you name it, we're changing it. There have been internal discussions for some time on whether we should hire an outside firm to help us with our SEO. I have a lot of experience in SEO, but my role has changed recently and we have had trouble filling my previous role. We are not looking for help with the replatforming project - we have a great plan in place to preserve link equity, tags, etc. We are looking for general SEO help as if replatforming weren't on the table. My question is: is this smart to do before replatforming? In my opinion, it's not. Our new site will have completely different URLs and will be so dramatically different. We could have someone do some keyword research, but we have already done the bulk of it. We have thought about and researched keywords for every new page we are creating. But from a technical SEO perspective, I don't see the point in getting someone. In addition, we had a major SEO audit done last year and completed the tasks from that audit on the current site; however, most of the changes were technical, not content based. Thoughts?

    | Colbys
    0

  • Hi, I hope someone can help me with this issue. Our French domain experienced a huge drop in indexed URLs in 2012. More than 50k URLs were indexed; after the drop, fewer than 10k were counted. I would like to check what happened here and which URLs were thrown out of the index. So I was thinking about a comparison between today's data and the data from 2012. Unfortunately, we don't have any data on the indexed pages in 2012 besides the number of indexed pages. Is there any way to check which URLs were indexed 2 years ago?

    | Sandra_h
    0

  • Hi, Over the last few years I have built many sites and own a lot of domain names. Some have high PageRank, some have high Domain Authority, and some have many backlinks. I'm finding it very difficult to keep up with all the links and to provide quality content for everything. Should I just redirect everything to the one site that makes the most money? All the sites are for the same industry, but in different categories of that industry, so I could 301 redirect each site to the relevant page on my money site. Would it be a problem if thousands, if not tens of thousands, of links all of a sudden pointed to one site?

    | cibble03
    0

  • I have 130,000 profiles on my site. When not connected to them, the pages have very few differences, so a bot (not logged in, etc.) will see a login form and "Connect to Profilename". Moz and Google treat the pages as the same, even though they're unique, such as example.com/id/328/name-of-this-group and example.com/id/87323/name-of-a-different-group. So how do I differentiate them? Can I use schema or something to help identify that these are profile pages, or that the content on them should be ignored as it's help text, etc.? Take Facebook - each Facebook profile for a name renders simple results: https://www.facebook.com/public/John-Smith https://www.facebook.com/family/Smith/ Would that be duplicate content if Facebook had a "Why to join" article on all of those pages?

    | inmn
    0
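
Schema markup can at least label these URLs as distinct profile pages. A sketch emitting schema.org ProfilePage JSON-LD; the mainEntity type (Organization) and all values are assumptions to adapt to the actual profiles:

```python
import json

def profile_page_jsonld(profile_url, name):
    """Minimal JSON-LD marking a URL up as a distinct ProfilePage."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "ProfilePage",
        "mainEntity": {
            "@type": "Organization",  # assumption: each profile is a group
            "name": name,
            "url": profile_url,
        },
    }, indent=2)

snippet = profile_page_jsonld(
    "https://example.com/id/328/name-of-this-group", "Name of This Group"
)
```

Embedded in a `<script type="application/ld+json">` block, this tells crawlers what differs between otherwise near-identical pages, though it does not by itself resolve duplicate-content scoring.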

  • A website adds subfolders to a category URL for each filter that's selected. In a crawl of the website, some of these URLs reach over 400 characters. For example, if I select shoe sizes 5, 5.5 and 6, white and blue colour, price $70-$100, and heel and platform styles, the URL will be as follows: www.example.com/shoes/womens/filters/shoe-size--5--5.5--6/color--white--blue/price--70-100/style--heel--platform There is a canonical that points to www.example.com/shoes/womens/, so it isn't a duplicate content issue, but these URLs still get crawled. How would you handle this? It's not a great system, so I'm tempted to tell them to start over with best-practice recommendations, but maybe I should just tell them to block the "/filters/" folder from crawlers? For some products, however, filtered content would be worth having in search indexes (e.g. colour).

    | Alex-Harford
    0
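
If blocking the /filters/ subfolder is the interim fix, the rule can be verified with Python's urllib.robotparser before deploying (paths follow the example above). Note that CPython's parser does not support wildcards, so the directory path is spelled out:

```python
from urllib.robotparser import RobotFileParser

# hypothetical robots.txt rule blocking the faceted /filters/ paths
robots_txt = """\
User-agent: *
Disallow: /shoes/womens/filters/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

filtered = "https://www.example.com/shoes/womens/filters/shoe-size--5/color--white"
category = "https://www.example.com/shoes/womens/"
```

Here `rp.can_fetch("*", filtered)` is False while the bare category page stays crawlable; filter values worth indexing (like colour) would need URLs outside the blocked folder.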

  • The theme I am using now means each page on my site is currently set up like this: <h1>keyword phrase</h1> <h2>keyword phrase</h2> The new theme I want to use is different: <h1>keyword phrase</h1> Will changing the theme have a negative effect on rankings due to losing the h2 on each page?

    | brianflannery
    0

  • Hi there, I'd love to get the Moz community's take on this. We are working on setting up dynamic serving for mobile versions of our pages. During the process of planning the mobile version of a page, we identified a type of navigational links that, while useful enough for desktop visitors, we feel would not be as useful to mobile visitors. We would like to remove these from our mobile version of the page as part of offering a more streamlined mobile page. So we feel that we're making a fine decision with user experience in mind. On any single page, the number of links removed in the mobile version would be relatively few. The question is: is there any danger in “orphaning” the mobile versions of certain pages because links don’t exist pointing to those pages on our mobile pages?  Is this a legitimate concern, or is it enough that none of the desktop versions of pages are orphaned?  We were not sure whether it’s even possible, in Googlebot’s eyes, to orphan a mobile version of a page if we use dynamic serving and if there are no orphaned desktop versions of our pages. (We also plan to link to "full site" in the footer.) Thank you in advance for your help,
    Eric

    | Eric_R
    0

  • Hello, I have an abbreviation inside my domain name. Now, for a page URL, do you recommend using the actual word (the word whose shortened form is in the domain name) in the page name? Or, when you have an abbreviation in the domain name, is using its full word in a page name not good? It all comes down to how well Google recognizes the abbreviation as the actual word and gives it the same value. Do I risk anything by not using the actual word? Hope I made myself clear :) Thanks.

    | mdmoz
    0

  • Hey, just a quick question I'm having trouble finding a definitive answer to: is the markup that is transferred from Word docs bad for SEO? We are managing to paste it and it looks fine, but the developers are worried that the extra code will be bad for SEO. Does anyone have a solution besides pasting into a text editor and reformatting in the CMS? Is this necessary, or can we just leave the extra code? Thank you!

    | keL.A.xT.o
    0

  • I am running keyword reports and I am seeing several sites that have taken big hits in Bing, but still are OK in Google.  Anyone else seeing this?

    | netviper
    0

  • Hey Everyone! I'm currently working on a project where we have a lot of product pages and thousands of URLs that need to be 301'd over. I know this can be a major issue and could lead to tons of errors. What is everyone's thought on doing such a huge migration? Should I do it in phases, or all at once so the new URLs can all be indexed together? What would you suggest as the best way to go about such a massive migration?

    | rpaiva
    0
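
Whichever phasing you choose, generating the redirect rules from a single old-to-new mapping keeps a migration this size auditable. A sketch that emits Apache-style rules from such a mapping (paths are hypothetical; the same mapping could drive nginx or IIS rules instead):

```python
def redirect_rules(url_map):
    """Emit one permanent-redirect rule per retired URL (Apache syntax)."""
    return "\n".join(
        'Redirect 301 "%s" "%s"' % (old, new)
        for old, new in sorted(url_map.items())
    )

rules = redirect_rules({
    "/old-category/": "/new-category/",
    "/old-category/old-product": "/new-category/new-product",
})
```

Keeping the mapping in one file also makes it easy to re-crawl every old URL after launch and confirm each one answers with a single 301 hop.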

  • Our site was handed a manual penalty in November 2013, where exact-match anchor text and low-quality directory submissions seemed to be the problem. We began the process of link removal, reconfiguration and disavowing. We had already planned to change our domain in early 2014 to coincide with our SSL certificate renewal, and although we were hesitant to do this with the manual penalty still in place, we proceeded and 301'd most of the site, but left the pages that were the landing pages for most of the exact-match links as 302s to the new domain. We continued to work on removing the manual penalty for the old domain, as we didn't want it to pass over to the new one, and eventually it was removed in March 2014. Now the penalty is gone, are we safe to change those 302 redirects to 301s so everything redirects? The problem we have is that six months on, a lot of the pages for the old domain are still indexed, and even though the new domain is indexed, our rankings haven't recovered. Is it just a case of needing to build up a new quality link profile to replace the links that were disregarded or removed when recovering from the penalty, or are we missing something else?

    | Ham1979
    0

  • Does anyone have any testing evidence on which is better to use for pages with thin content that are nonetheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages, which are pages that would not generate relevant search traffic. The question is: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focuses on just the important pages that are indexed, and that may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc., but the above is the important question here.

    | khi5
    0

  • I am showing 2 items with errors. These products have both been removed from the site and will trigger a 404 Page Not Found. I am still seeing the page URLs in Webmaster Central > Search Appearance > Structured Data. They are shown as items with errors, the errors being that they are missing price too. Should I 301 redirect these in an .htaccess file, or should I remove the page URLs in some other way from Google? Also, I have a site with over 50,000 products and 2,000 category-level pages. In Structured Data, there are only 2,848 items. Does it seem like Google is collecting very little data compared to how many URLs I have on my site?

    | djlittman
    0

  • Hi all, I have a few pages that - despite having a robots meta tag with noindex, nofollow - are showing up in Google SERPs. In troubleshooting this with my team, it was suggested that another page could be linking to these pages and causing this. Is that plausible? How could I confirm it? Thanks,
    Sarah

    | SSFCU
    0
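
One thing worth ruling out: a noindex meta tag only works if crawlers can fetch the page and see it. If robots.txt blocks the URL, Google never reads the tag, and inbound links can still get the URL indexed. A sketch of programmatically checking which robots directives a page's HTML actually serves (the HTML here is a stand-in for a fetched page):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of every <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

page_html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = RobotsMetaFinder()
finder.feed(page_html)
```

Run this against the live HTML for each affected URL, and also confirm the URLs are not disallowed in robots.txt; a blocked-but-linked URL showing in SERPs is exactly the symptom described.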

  • Hey guys, One of my most popular pages for "Rust Hacks" used to be http://www.ilikecheats.com/01/rust-cheats-hacks-aimbot/ Now, when searching Google for site:ilikecheats.com rust hacks, this page shows as the highest ranking: http://forum.ilikecheats.com/forums/221-Rust-Hacks-Rust-Cheats-Public-Forum What's weird is that the entire front end (the WordPress site) isn't ranking well anymore on page #1 of Google, and the forums are currently ranking better. I did have a huge penalty from backlinks last year but cleared it. I got Yoast to do a site review and I'm cleaning up everything now. I also cleared most of the bad links via the disavow tool. Another example: when I search for "warz hacks", the forums show up in 4th place but the main website isn't showing at all until page 10. If I search site:ilikecheats.com warz hacks, the links directly to the main site don't show until page #2. So is this still a penalty that carried over, or is something else going on? Can't seem to figure it out; thanks in advance for looking. 😃 Any ideas what's going on and why the main pages no longer rank? http://www.ilikecheats.com

    | Draden67
    0

  • Hello... We recently re-launched our website on a new CMS (Magento). We kept the same domain name; however, most of the structure changed. We were diligent about inputting the 301 redirects. The domain is over 15 years old and has tons of link equity and history. Today marks 27 days since launch... and Google Webmaster Tools showed me a recently detected (dated two days ago) URL from the old structure. Our natural search traffic has taken a slow dive since launch... Any thoughts? Some background info: the old site did not have a sitemap.xml; the relaunched site does. Thanks!

    | 19prince
    0

  • How do you redirect pages in ColdFusion? If using ColdFusion and mod_rewrite, the URL is never redirected by mod_rewrite.

    | alexkatalkin
    0

  • I have a lot of categories (like e-commerce sites) and many have pages 1-50 for each category (view-all not possible). Lots of the content on these pages is present across the web on other websites (duplicate stuff). I have added quality unique content to page 1, added "noindex, follow" to pages 2-50, and added rel=next/prev tags to the pages. Questions: By including the "follow" part, Google will read the content and links on pages 2-50, and they may think: "we have seen this stuff across the web... low-quality content, and though we see a noindex tag, we will consider even page 1 thin content, because we are able to read pages 2-50 and see the thin content." So even though I have "noindex, follow", the "follow" part causes the issue (in that Google feels it is a lot of low-quality content). Is this possible, and if I had added "nofollow" instead, might that solve the issue and increase the chance that page 1 looks more unique? Why not add "noindex, nofollow" to pages 2-50? That way I ensure Google does not read the content on pages 2-50 and my site may come across as more unique than with the "follow" tag. I do understand that in such a case (with a nofollow tag on pages 2-50) there is no link juice flowing from pages 2-50 to the main pages (assuming there are breadcrumbs or other links to the indexed pages), but I consider this of minimal value from an SEO perspective. I have heard that using "follow" is generally lower risk than "nofollow". Does this mean a website with a lot of "noindex, nofollow" tags may hurt the indexed pages because it comes across as a site Google can't trust, since 95% of pages have such a "noindex, nofollow" tag? I would like to understand what "risk" factors there may be. Thank you very much.

    | khi5
    0
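
For reference, the head-tag pattern the question describes, with "noindex, follow" on pages 2 and beyond plus rel prev/next, can be sketched as a small template helper (the ?page= URL scheme is a hypothetical stand-in for the site's real pagination URLs):

```python
def pagination_head_tags(base_url, page, last_page):
    """Head tags for page N of a paginated category (sketch)."""
    tags = []
    if page > 1:
        # only the first page of the series stays indexable
        tags.append('<meta name="robots" content="noindex, follow">')
        tags.append('<link rel="prev" href="%s?page=%d">' % (base_url, page - 1))
    if page < last_page:
        tags.append('<link rel="next" href="%s?page=%d">' % (base_url, page + 1))
    return "\n".join(tags)

head = pagination_head_tags("https://www.example.com/category", 2, 50)
```

Swapping "follow" for "nofollow" in the meta line is the one-word change the question weighs; the helper makes it easy to A/B the two variants per section of the site.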

  • Hello, One of my websites has quite a lot (~1,000) of nofollow blog comment links. Is it worth getting them removed if they are nofollow? Could they be dragging the metrics of my website down? Does anyone have any experience of this? The site only has about 5 followed links; something seems to be dragging the domain metrics down. Thanks Rob

    | tomfifteen
    0

  • Hello, A client has a large ecommerce site (www.mydomain.com) for technical products that require a number of technical documents. Most of these are PDFs, some 3D PDF drawings and renderings - all good for indexing. We are considering 2 possibilities for these: 1 - a separate site (www.mydomain2.com or docs.mydomain.com), catalog style (probably WordPress), to store the files, with links from product pages at www.mydomain.com to the relevant PDFs. This will be much easier to maintain than the second possibility. 2 - storing the files at www.mydomain.com (in a /docs/ folder, for example) with links from the product pages to the relevant PDFs. Is there an advantage one way or the other? Thank you

    | tlw
    0

  • Hi all, Just got a new site up about weekend travel for VisitSweden, the official tourism office of Sweden. Everything went just fine except some issues with indexing. The site can be found here: weekend.visitsweden.com/no/ For some weird reason the "frontpage" of the site does not get indexed. What I have done myself to find the issue: Added sitemaps.xml Configured and added the site to Webmaster Tools Checked that the 301s are not faulty By doing a simple site:weekend.visitsweden.com/no/ you can see that the frontpage is simply not in the index. Also, by doing a cache:weekend.visitsweden.com/no/ I see that Google tries to index the page without the trailing /no/ for some reason. http://webcache.googleusercontent.com/search?q=cache:http://weekend.visitsweden.com/no/ Any smart ideas on how to get this fixed or where to start looking? All help greatly appreciated. Kind regards Fredrik

    | Resultify
    0

  • Hi, I've been dealing with this problem for a few days. In fact, I didn't realize it was this serious until today, when I saw most of my site de-indexed and losing most of its rankings. [URL Errors: 1st photo] On 8/21/14 there were only 42 errors, but on 8/22/14 this number went to 272 and it just keeps going up. The site I'm talking about is gazetaexpress.com (media news, custom CMS) with lots of pages. After some research I came to the conclusion that the problem is the firewall, which might have blocked Google bots from accessing the site. But the server administrator says this isn't true and no Google bots have been blocked. Also, when I go to WMT and try to Fetch as Google, this is what I get: [Fetch as Google: 2nd photo] Out of more than 60 tries, only 2-3 showed Complete (and this only for the homepage, never for articles). What can the problem be? Can I get Google to crawl my site properly, and is there a chance I will lose my previous rankings? Thanks a lot
    Granit

    | granitgash
    0

  • Hello, I need to change the URL names for a few pages on my site. The site was launched just recently, so it has no obvious rankings or traffic. My question is: what is the best practice for changing/deleting a page name? After deleting the page, should I go to Google Webmaster Tools and use URL Removal to remove the old page? I know that I also have to create a new XML sitemap file, but I'm not sure about the old pages in Google search results. Thanks!

    | mdmoz
    0

  • Hello, What are the top 5 tips or resources you would give to an ecommerce site that is starting a blog? If EGOL could share, too, that would be great. He's the best. So far we are doing: 1. Around 1000 words per blog post, but varying depending on the topic 2. New product and best product reviews for some of the posts. 3. I'm doing my best to have the writer make them best-of-the-web 4. After we've got a track record, I'll analyze the statistics to see what's working. 5. There's very little blogging in our industry Thanks!

    | BobGW
    0

  • I would like to know if it is acceptable (or even possible from Google's standpoint) to canonical your homepage to a different URL on the same domain. For example, say my homepage is www.grasscare.com (it's not) and I've built links to that page for years for terms like "grass seed" and "buy grass seed" because all I sold in the past was grass seed. If I then decide I want to sell both grass seed and sod, can I canonical my homepage (grasscare.com) to a new URL, www.grasscare.com/grass-seed.html, to preserve the link value I've built up for "grass seed"? The new homepage would turn into a doorway page of sorts, forcing users to select either grass seed or sod before going further. Whatever content there is on the new homepage about grass seed would also be present on grasscare.com/grass-seed.html, though it would only be a small amount of content. Can a canonical be used to point the homepage to this new page, and will this canonical pass all of the link value and ranking signals it earned in the past to the new URL? Thank you in advance for any help or insight.

    | andrewv
    0

  • I'm working on a site and am running some basic audits, including a campaign within Moz. When I put the domain into any of these tools, including response header checkers, the response is a 302 that says there is a redirect to an error page. However, the page itself doesn't redirect and resolves fine in the browser, yet none of the audit tools can seem to get any information from any of the pages. What is the best way to troubleshoot what is going on here? Thanks.

    | jim_shook
    0

  • I have a page with a Google Map taking up 80% of the space above the fold (the rest is content which is not unique to my site), and all unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering making the Google Map 1/4 the size so I can get my unique content up higher. Questions: Do we have any evidence or sound reasoning why I should / should not make this move? Is the content really considered below the fold, or will Google see that it is simply a large map I have on the site and therefore consider the content to be above the fold? Thank you

    | khi5
    0

  • Hello there, If we place an animated GIF banner on a website, will this pass link juice in the same way as a plain text link? Obviously the link would need to be followed. Thanks Robert

    | roberthseo
    0

  • Currently my website has an H1 tag on the site logo and an H2 tag on the post title. I think for SEO it is better to use H1 for the post title and H2 for the website logo?

    | MasonBaker
    0

  • We're working on a complete rebuild of a client's site. The existing version of the site is in WordPress and I've noticed that the site is accessible via http and https. The new version of the site will have mostly or entirely different URLs. It seems that both http and https versions of a page will resolve, but all of the rel-canonical tags I've seen point to the https version. Sometimes image tags and stylesheets are https, sometimes they aren't. There are both http and https pages in Google's index. Having looked at other community posts about http/https, I've gathered the following: http/https is like two different domains. http and https versions need to be verified in Google Webmaster Tools separately. Set up the preferred domain properly. Rel-canonicals and internal links should have matching protocols. My thought is that we will do a .htaccess that redirects old URLs regardless of the protocol to new pages at one protocol. I would probably let the .css and image files from the current site 404. When we develop and launch the new site, does it make sense for everything to be forced to https? Are there any particular SEO issues that I should be aware of for a scenario like this? Thanks!

    | GOODSIR
    0
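
If everything is forced to https at launch, the canonicalization itself is a simple scheme rewrite. A sketch of the URL transformation (the server-side redirect, e.g. in .htaccess, should implement this same mapping with a 301, and rel-canonicals and internal links should emit the https form directly):

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    """Rewrite any http:// URL to its https:// equivalent, leaving
    path, query, and fragment untouched."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)
```

Running old-site URLs through a function like this when generating the redirect map guarantees the http and https variants of every old URL land on one https destination, so only one protocol accumulates signals going forward.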

  • I've had a few questions around the blog content on our site. Some of our vendors and partners have expressed interest in posting some of that content on their domains. What are the implications if we were to post copies of our blog posts on other domains? Should this be avoided or are there circumstances that this type of program would make sense?

    | Visier
    1

  • I have a folder "testing" within my domain, which is blocked in robots.txt. My web developers use that "testing" folder when we are creating new content before uploading it to an indexed folder. So the content is uploaded to the "testing" folder first (which is blocked by robots.txt) and later uploaded to an indexed folder, while permanently keeping the content in the "testing" folder as well. In fact, my entire website's content is located within "testing" - the same URL structure as the indexed pages, except it starts with the "testing/" folder. Question: even though the "testing" folder will not be indexed by search engines, is there a chance search engines notice that the content is first uploaded to the "testing" folder, and that the indexed folder is therefore not guaranteed to get the content credit, despite the "testing" folder being blocked by robots.txt? Would it be better to password protect this "testing" folder? Thx

    | khi5
    0

  • We are having an issue with our client's blog creating excessive duplicate content via blog tags. The duplicate webpages from tags offer absolutely no value (we can't even see the tags). Should we just 301 redirect the tagged pages or use a rel canonical?

    | VanguardCommunications
    0

  • The company I work for owns and operates hundreds of websites throughout the United States. Each of these is tied to a legitimate local business many times with specific regional branding and mostly unique content. All of our domains are in pretty good shape and have not ever participated in any shady link building/SEO. These sites currently are often linking together between the other sites within their market. It makes perfect sense from a user standpoint since they would have an interest in each of the sites if they were interested in the specific offering that business had. My question is whether or not we should nofollow the links to our other sites. Nothing has happened from Google in terms of penalties and they don't seem to be hurting our sites now as they are all currently followed, but I also don't want to be on the false positive side of any future algorithm updates surrounding link quality. What do you think? Keep them followed or introduce nofollow?

    | MJTrevens
    0

  • Hi there, My client just launched a new site and the CMS requires that the home page goes to a subfolder - clientsite.com/store. Currently there is a redirect in place such that clientsite.com -> clientsite.com/store. However, I want clientsite.com to be the canonical version of the URL. What should I do in this case, given that there is now a loop between the redirected page and the canonical page?

    | FPD_NYC
    0

  • I created a PDF - will it pass PageRank?

    | alhallinan
    1

  • My company has 2 distinct divisions: a B2B division and a B2C division.  Right now we are restructuring our B2C service offerings and pulling back on marketing spend for B2C so that we can relaunch new B2C offerings in about 6-8 months.  Our current company website will stay in place but will be revamped right before our B2C relaunch (will maintain the same domain). Currently, our company has little to no SEO presence for both B2C and B2B.  I need to know if it is smart for us to increase SEO activity right now for the B2C division, knowing that it can take up to 6 months to get SEO traction so that when we do relaunch we have a strong established SEO presence. Thanks for your sharing your thoughts!

    | VISANOW
    0
