
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I started a new music site that has a database of 8,000,000 songs and 500,000+ artists that we are cross-referencing with free and legal content sources. Each song essentially has its own page. We are about to start adding links to a sitemap and wanted to find the best practices. Should we add all 8,000,000+ links at once? Should we add a maximum amount per day, maybe 5,000 max? What are the pros and cons of slowly adding the pages versus adding them all at once? Any risks? At the rate Google is crawling our pages, it will take 8 years to have all of our songs indexed (it would be very hard to crawl all of our songs as our system is more of an app). I want to play it safe and not do anything that will come off as spammy. I have been trying to find some actual evidence on what the best course of action is. Thanks in advance!

    | mikecrib1
    0
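
    Editor's note on the question above: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so 8,000,000 pages would be split across roughly 160 files tied together by a sitemap index, all of which can be submitted at once. A minimal generation sketch in Python, assuming an iterable of page URLs and a hypothetical base URL for where the files will be hosted:

        # Minimal sketch: split a large URL set into 50,000-URL sitemap
        # files plus one sitemap index, per sitemaps.org protocol limits.
        SITEMAP_LIMIT = 50000  # protocol maximum URLs per sitemap file

        def write_urlset(urls, filename):
            # Write one <urlset> sitemap file (URLs assumed XML-safe;
            # escape &, <, > in real use).
            with open(filename, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for url in urls:
                    f.write(f"  <url><loc>{url}</loc></url>\n")
                f.write("</urlset>\n")

        def write_sitemaps(all_urls, base_url="http://www.example.com"):
            # Batch URLs into numbered sitemap files, then write an index
            # listing every file so the whole set can be submitted in one go.
            locs, batch, n = [], [], 0
            for url in all_urls:
                batch.append(url)
                if len(batch) == SITEMAP_LIMIT:
                    n += 1
                    write_urlset(batch, f"sitemap-{n}.xml")
                    locs.append(f"{base_url}/sitemap-{n}.xml")
                    batch = []
            if batch:
                n += 1
                write_urlset(batch, f"sitemap-{n}.xml")
                locs.append(f"{base_url}/sitemap-{n}.xml")
            with open("sitemap-index.xml", "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for loc in locs:
                    f.write(f"  <sitemap><loc>{loc}</loc></sitemap>\n")
                f.write("</sitemapindex>\n")

    Whether the files go live gradually or all at once, submitting the index rather than loose files makes the batching question an operational choice rather than an indexing risk.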

  • According to Webmaster Tools, the number of pages indexed by Google on my site doubled yesterday (it's gone from 150K to 450K). Usually I would be jumping for joy, but now I have more indexed pages than actual pages on my site. I have checked for duplicate URLs pointing to the same product page but can't see any; pagination in category pages doesn't seem to be indexed, nor does parameterisation in URLs from advanced filtration. Using the site: operator we get a different result on google.com (450K) to google.co.uk (150K). Anyone got any ideas?

    | DavidLenehan
    0

  • We changed our domain 6 weeks ago as we had a penalty we couldn't shake off... My question is: how long will it take to rank for our keywords? I appreciate this is a difficult question as there are a lot of factors that will affect our ranking. Does Google wait a period of time before allowing a new site to rank well?

    | jj3434
    0

  • Hi guys, I read the SEOmoz article about sitemap.xml dated 2008. Just wanted to check views on: Is it worthwhile using the 'priority' field? What if everything is set to 100%? Any tips for using priority? Many thanks in advance! Richard

    | Richard555
    0
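
    Editor's note on the question above: priority takes values from 0.0 to 1.0 (so "100%" would be 1.0), it is only a relative hint, and if every URL carries the same value the field conveys no information, so engines will ignore it. For reference, how it sits in a sitemap entry (illustrative URLs):

        <url>
          <loc>http://www.example.com/key-landing-page/</loc>
          <priority>0.9</priority>
        </url>
        <url>
          <loc>http://www.example.com/minor-page/</loc>
          <priority>0.3</priority>
        </url>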

  • Hi, When pushing our new site live, most of the 301 redirections got done too late for several reasons. Understandably, our site rankings in Google have taken a hit now. So far we have just tried to perfectly optimize the pages that used to rank well (they weren't even optimized before and were still ranking) to get our positions back. But does anyone have an idea about what else we could do? Is there a recommended "action plan" for when someone is late with their 301 redirections?

    | JohanMattisson
    0
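
    Editor's note on the question above: the usual recovery step is to pull the old site's top landing pages from analytics or Webmaster Tools and make sure each one has a direct 301 to its new equivalent. A minimal Python sketch, assuming a hypothetical redirect-map.csv with old_path,new_url columns:

        # Turn a CSV of old->new URL mappings into Apache mod_alias rules.
        import csv

        with open("redirect-map.csv", newline="") as src, \
             open("redirects.conf", "w") as out:
            for old_path, new_url in csv.reader(src):
                out.write(f"Redirect 301 {old_path} {new_url}\n")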

  • Hi Mozzers, What are your thoughts on using www.folkd.com for video SEO? We have a few company videos and would like to possibly get a backlink by either embedding one of our YouTube videos on our site or self-hosting the video. Are bookmarking sites like this spammy?

    | Travis-W
    0

  • Does anyone have any insight into how Google determines 'top references' from medical websites?
    For example, if you search 'skin disorders,' you'll see 'Sources include nih.gov, medicinenet.com and dmoz.org' -- how is that determined?

    | nicole.healthline
    0

  • Hello here, this question is from a merchant standpoint, and here is a typical scenario: the merchant has thousands of incoming affiliate links. Affiliates link to specific product pages with their affiliate ID passed as a parameter, as in: http://www.merchantsite.com/products/product_page/?affid=[affiliate_id] and users get 301 redirected to a clean URL like: http://www.merchantsite.com/products/product_page/ after which a cookie is stored in the user's browser for tracking purposes. Now, my question is the following: is it more convenient for the merchant to have its affiliates linking with follow or nofollow links? Is that actually relevant? What are the pros and cons? Thank you in advance for any insights!

    | fablau
    0
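
    Editor's note: whichever way affiliates mark their links, the 301 to the clean URL is what consolidates signals onto one version of the page. A minimal sketch of the redirect-plus-cookie step, written as a Flask app purely for illustration (not the poster's actual stack):

        # Strip the affiliate ID into a tracking cookie, then 301 to the
        # clean product URL so only one URL version accumulates links.
        from flask import Flask, request, redirect

        app = Flask(__name__)

        @app.route("/products/<path:page>/")
        def product(page):
            affid = request.args.get("affid")
            if affid:
                resp = redirect(f"/products/{page}/", code=301)
                resp.set_cookie("affid", affid, max_age=30 * 24 * 3600)  # 30 days
                return resp
            return f"Product page: {page}"  # placeholder for the real template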

  • I have a new SEO client that has a business model similar to Craigslist, Angie's List or TaskRabbit, where they offer locally based services nationwide. My first thought was local link building, citation building, etc. But the issue is they are a purely online service company and they don't have a physical address in every city/state they will be offering their services in. What is the best course of action for providing SEO services for this type of business model? I am pretty much at a standstill on how to rank them locally for the areas they provide services in. It's a business model that involves local businesses and customers looking for services from those local businesses.

    | VITALBGS
    0

  • Hi, At my company we are making a new website because the days of the old one are numbered. We have already decided that the folder structure will be changed so we have more "clean" URLs. Now we would also like to change from .net/nl to .nl. Since we are already redirecting all URLs (>10,000), we think this is the moment to switch the TLD. What do you guys think? Is there anyone who has some kind of experience/tip they would like to share?

    | SEO_ACSI
    0
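
    Editor's note: since every URL is being redirected anyway, the TLD switch itself can ride on a single host-level rule on the old domain that carries the path across. A sketch assuming Apache with mod_rewrite and illustrative domain names:

        # On the old .net site: send /nl/whatever to the same path on the .nl domain
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
        RewriteRule ^nl/(.*)$ http://www.example.nl/$1 [R=301,L]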

  • Hello! We are managing the SEO campaign of a video website. We have an issue with our sitemap folders. I have sitemaps like /xml/sitemap-name.xml. But Google is indexing my /xml/ folder and also the sitemaps, and they appear in search results. If I add Disallow: /xml/ to my robots.txt and remove the /xml/ folder from Webmaster Tools, will Google still be able to see my sitemaps, or will it ignore them? Will my site be affected negatively after removing the /xml/ folder completely from search results? What should I do?

    | roipublic
    0
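
    Editor's note on the question above: a robots.txt Disallow on /xml/ risks blocking Google from fetching the sitemap files themselves, not just from showing them. One commonly suggested alternative is to leave the folder crawlable and instead send a noindex header on the XML responses, which keeps them out of the search results while Google can still read them. A sketch, assuming Apache with mod_headers (e.g. in an .htaccess inside /xml/):

        # Keep the sitemap files fetchable but out of the index.
        <FilesMatch "\.xml$">
            Header set X-Robots-Tag "noindex"
        </FilesMatch>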

  • Hello, I'm in the middle of a link reclamation project wherein we're identifying broken links, links pointing to dupe content, etc. I found a forgotten co-brand which is effectively dupe content across 8 subdomains, some of which have a significant number of links (200+ linking domains | 2k+ inbound links). The question for the group is: what's the optimal redirect option?
    Option 1: set 301s that maintain 1:1 URL mapping. This will pass all equity to the applicable PLPs and theoretically improve rank for related keyword(s); it requires a bit more configuration time and will likely have a small effect on rank given that links are widely distributed across URLs.
    Option 2: set 301s to redirect all requests to the associated subdomain, e.g. foo.mybrand.cobrand.com/page1.html and foo.mybrand.cobrand.com/page2 both redirect to foo.mybrand.com/. This will accumulate all equity at the subdomain level, which theoretically will be roughly distributed throughout the underlying pages, and will limit the risk of penalty to that subdomain.
    Option 3: set 301s to redirect all requests to our homepage. Easiest to configure and maintain, and it will accumulate the maximum equity on a priority page, which should positively affect domain authority; but we run the risk of being penalized for accumulating links en masse, risk a penalty for spammy links on our primary subdomain www, and won't pass keyword-specific equity to the applicable pages.
    To be clear, I've done an initial scrub of anchor text and there were no signs of spam. I'm leaning towards #3, but interested in others' perspectives. Cheers,
    Stefan

    | PCampolo
    0

  • Hi all, I have a question regarding permanently redirecting many small websites into one large new one. During the past 9 years I have created many small websites, all focusing on hotel reservations in one specific city. This has served me beautifully in the past, but I have come to the conclusion that it is no longer a sustainable model, and therefore I am in the process of creating one large, worldwide hotel reservations website. To avoid losing any benefit of my hard work over the past 9 years, I want to permanently redirect the smaller websites to the correct section of my new website. I know that if it were only a few websites, this strategy would be perfectly acceptable, but since I am talking about 50 to 100 websites, I am not so sure and would like to have your input. Here is what I would like to do (the domain names are not mine, just an example): Old website: londonhotels.com 301 to newdomain.com/london/ Old website: berlinhotels.com 301 to newdomain.com/berlin/ Old website: amsterdamhotels.com 301 to newdomain.com/amsterdam/ Etc., etc. My plan is to do this for 50 to 100 websites and I would like to have your thoughts on whether this is an acceptable strategy or not. Just to be clear, I am talking about redirecting only my websites that are in good standing, i.e. none of the websites I am thinking about 301'ing have been penalized. Thanks for your thoughts on this.

    | tfbpa
    0
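
    Editor's note: structurally this is one blanket rule per old domain. A sketch using the poster's example names and assuming Apache with mod_rewrite; keep the $1 only if old page paths have real equivalents under the city folder, otherwise redirect everything to the section root:

        # In londonhotels.com's .htaccess: every request 301s to the London section
        RewriteEngine On
        RewriteRule ^(.*)$ http://www.newdomain.com/london/$1 [R=301,L]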

  • When you add rel=canonical to the page, will Google still crawl your page for content and discover new links in that page?

    | ReferralCandy
    0

  • I've been asked to do an audit of http://www.equipment4garages.com/. The first thing I did was check the code, and saw that the whole thing has a clone of the original site in an iframe. I can't for the life of me think why anybody would do that, so I was wondering if someone here could shed some light on it?

    | neooptic
    0

  • Hi 
    I've been working with a law firm's website for a couple of years and we've encountered a problem. The pages were divided to target employers and employees separately. For the very targeted keywords mentioning either employees or employers everything was good, but for broader, less targeted keywords (e.g. unfair dismissal) Google chooses either one page or the other, which is a problem. So I created "bridge" pages where all the topics are explained and from which users are directed onwards to choose where to go. The problem is that a lot of off-page work was created during these years targeting one page or the other. What I plan to do is: - Create a new sitemap and change the priority, so the new pages will have a priority of 1 and the others less. - Bookmarks, articles, etc. will now target the new pages. I placed the new pages linked from the home page so that they get the link juice of the home page; they are also now more like category pages in the map, so a level up compared to the previous ones. Questions: 1. Is it worthwhile adding a rel canonical tag to the new pages and rel alternate to the previous pages, or, if it's not a question of duplicate content, should it have no impact? 2. What other things should I take into consideration? Thanks a lot. nico

    | niclaus78
    0

  • Hello All, I wonder if you'd be kind enough to help with your insights... The site is www.dataclinic.co.uk. It's been up and running since 2002 and is a data recovery related site that for years ranked on page 1 for lots of data recovery search terms. Now it's hardly ranking at all, despite decent content and updated blogs. Google (via Webmaster Tools) says the site hasn't been penalized for bad links... but we are searching for an explanation as to why it's so difficult to get our once good rankings back. (For example, the main search term we'd like to rank for is "data recovery"... we are currently on page 8 [google.co.uk search]... why???) Any ideas greatly appreciated... and I'll reply to each of you privately if you wish. Thanks for your time. Chris

    | 3Amigos
    0

  • For a WordPress site that has a trailing / in the URL with a ? after it... how can you do a 301 redirect to strip off anything after the /? For example, how do you take this URL domain.com/article-name/?utm_source=feedburner and 301 it to this URL domain.com/article-name/ Thank you for the help

    | COEDMediaGroup
    0
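
    Editor's note on the question above: on Apache 2.4+ the QSD (query string discard) flag does exactly this. The condition below scopes the 301 to query strings starting with utm_, so legitimate parameters elsewhere survive; this is a sketch for .htaccess, assuming Apache (on 2.2, the older trick is rewriting to "/$1?"):

        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^utm_ [NC]
        RewriteRule ^(.*)$ /$1 [QSD,R=301,L]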

  • Is embedded content "really" on my page? There are many add-ons nowadays that are embedded via code and bring in their text after the page has loaded -- for example, embedded surveys. Are these read by the Google bot, or do they in fact act like iframes and are not physically on my page? Thanks

    | BeytzNet
    0

  • Hello, I know this probably depends on market sector, etc., but are there broad guidelines on the optimal ratio of different types of links - blogs, directory links, citations, etc. I'm working for a tourism business and see nearly all of their links are from directories - both generic and tourism specific. I'm thinking about re-balancing that with guest blogs, PR work, etc.

    | McTaggart
    0

  • I have recently launched an e-commerce website which has a whopping domain authority of 1! I was thinking about adding a blog to it (it's in OpenCart), but that would mean creating it in WordPress while using the same domain name. Would this be beneficial from an SEO standpoint (i.e. sending traffic to a blog that isn't actually on the e-commerce website itself), or am I better off creating content as blogs/articles on other people's sites?

    | lindsayjhopkins
    0

  • For the first several months this website, WEBSITE, ranked well in Google for several local search terms like "Columbia MO spinal decompression" and "Columbia, MO car accident therapy." Recently the website has completely disappeared from Google's SERPs. It does not even show up when I copy and paste full paragraphs into Google's search bar. The website still ranks fine in Bing and Yahoo, but something happened that caused it to be removed from Google. Besides optimizing the metadata, adding headers, alt tags, and all of the typical on-page SEO stuff, we did create a guest post for a relevant, local blog. Here is the post: Guest Post. The post's content is 100% unique. I realize the post has way too many internal/external links, which we definitely did not recommend, but can anyone find a reason why this website was removed from Google's SERPs? And possibly how we should go about getting it back into Google's SERPs? Thanks in advance for any help.

    | VentaMarketing
    0

  • Okay, I am new to SEO. I have read a few SEO beginner guides and have been practicing SEO for a while now. I am trying to do SEO for a new client's site that is a completely new site with no MR and MT, and here is my current link building strategy. Can you please review my link building plan and help me out with suggestions and corrections?
    1. Directory submissions - From what I understand, since the Google Penguin updates this isn't as effective a method, but I am trying to get a high-PR directory list; a lot of them require paid standard submission reviews, otherwise it takes 2-3 months.
    2. Local directory submissions - Such as Yelp, Angie's List, WhitePages, and other local directories.
    3. Social bookmarking - Submit links to social bookmarking sites with target keyword(s) as anchor text.
    4. Article writing & submission - Create articles and submit them to high-PR article directories with different article titles; I also wanted to see how many different submissions I can make with each article.
    5. Press releases - Submit to high-PageRank press release directories; I also wanted to see how many submissions is generally the rule of thumb for press releases.
    6. Blog outreach for product reviews - Submit products to blogs with PR 2+ to get reviews and backlinks.
    7. Forum profile creation - Create forum profiles and engage in topics with a signature link; I understand that since the Penguin update this isn't something I should emphasize.
    8. Blog commenting - Comment on relevant blogs that have dofollow links, plus nofollow links for link diversity.
    9. Guest blogging - Write unique content and reach out to related blogs for guest posting opportunities.
    10. .edu & .gov links - How do I gain .edu & .gov links? I have read several articles and I am having a hard time understanding this concept; would commenting on .edu & .gov blogs and profiles be an effective or correct method for gaining these types of links?

    | azokaei
    0

  • Should I be worried about anchor text diversity for my internal links? It seems like I should be OK, but I just wanted to double check... you know how Google can be.

    | KenyonManu3-SEOSEM
    0

  • Our rankings in Bing and Yahoo are working, but Google rankings are just not happening. Please help!

    | Djdealeyo
    0

  • Our site was placed under a manual penalty last year in June 2012 after Penguin rolled out. We were advised by Google that we had unnatural links pointing to our site. We fought for months, running backlink checks and contacting webmasters where Google's WMT was showing the sites which had links. We have submitted numerous reconsideration requests with proof of our efforts in the form of huge, well labeled spreadsheets, emails, and screenshots of online forms requesting link removal. When the disavow tool came out we thought it was a godsend, and added all the sites who had either ignored us or refused to take down the links to the disavow.txt with the domain: tag. Then we submitted another reconsideration request, but to no avail. We have since had email correspondence with a member of the Google Quality Search Team who, after reviewing the evidence of all our previous reconsideration requests and our disavow.txt, still advised us to make a genuine effort, and listed sites which had inorganic links pointing to our site which were already included in the disavow.txt. Google has stated "In order for your site to have a successful reconsideration request, we will need to see a substantial, good-faith effort to remove the links, and this effort should result in a significant decrease in the number of bad links that we see." We have truly done everything we can and proven it too! Especially with all the sites in the disavow.txt, there must be a decrease in links pointing to our site. What more can we do? Please help!

    | Benbug
    0
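
    For reference on the disavow discussion above, the file format Google expects is one entry per line, either a full URL or a domain: directive, with # for comments; it is worth re-checking that the submitted file parses cleanly:

        # Sites that ignored or refused our removal requests (illustrative names)
        domain:spammy-directory.example
        domain:paid-links.example
        # A single page rather than a whole site
        http://example.net/some-page.html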

  • Each of our login pages shows up under a different subdomain of our website. Currently these are accessible by Google, which is a huge competitive advantage for our competitors looking for our client list. We've done a few things to try to rectify the problem:
    - Added noindex/noarchive to each login page
    - Added robots.txt to all subdomains to block search engines
    - Gone into Webmaster Tools, added the subdomain of one of our bigger clients, then requested its removal from Google (this would be great to do for every subdomain, but we have a LOT of clients and it would require tons of backend work to make this happen)
    Other than the last option, is there something we can do that will remove subdomains from being viewed in search engines? We know the robots.txt is working since the message on search results says: "A description for this result is not available because of this site's robots.txt – learn more." But we'd like the whole link to disappear. Any suggestions?

    | desmond.liang
    1
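
    Editor's note: there is a conflict in the setup described above. A robots.txt block stops Google from recrawling the login pages, so it can never see the noindex tag on them, which is why the bare URLs linger in the results. The commonly suggested fix is to lift the robots.txt block and serve a noindex header instead; a sketch assuming Apache with mod_headers on the login subdomains:

        # Google must be able to crawl the page to see this header,
        # so the robots.txt Disallow has to be removed first.
        Header set X-Robots-Tag "noindex, nofollow"

    Once Google recrawls and drops the pages, the URLs should disappear entirely rather than showing the "no description" stub.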

  • Hi, We wish to start ranking for a specific keyword ("log house prices" in Italian). We have two options for which page we should optimize for this keyword: 1. A long content page (1000+ words with images). 2. The log houses category page, optimized for the keyword (we have 50+ houses on this page, together with a short price summary). I would think that we have better chances of ranking with option no. 2, but then we can't use that page for ranking for a more short-tail keyword (like "log houses"). What would you suggest? Is there maybe a third option for this?

    | JohanMattisson
    0

  • Hey guys, I need a little help with setting up a big 301. Background: it's a bit of a challenge, as the old site is a total mess after being online for 10 years plus. It has HTML and PHP pages, and a mod_rewrite to redirect old HTML links to the newer PHP versions of those pages. It's now moving to a new site, and as the domain name and URL structure have changed, we can't use any fancy regex and have to do a page-to-page redirect. There are 1500 pages to redirect. However, the old site has thousands of linking root domains, and some of these link to the old HTML pages (which currently redirect to the PHP pages) and some to the newer PHP pages. Question: my initial plan was to leave the mod_rewrite in place and only redirect the PHP pages. That means 1500 individual redirects instead of the 3000 needed to individually redirect both the PHP and HTML pages. I'm not sure what's best, to be honest. We don't really want multiple hops in the redirect (html > php > new site), but surely 1500 redirects is better than 3000! Does anyone have any advice on which option may be best, or even a better option? Thanks 🙂

    | HarveyP
    0
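
    Editor's note: a third option is to regenerate both variants from one mapping, so the .html and .php versions of each page each 301 straight to the new URL with no intermediate hop. A Python sketch, reusing the hypothetical redirect-map.csv idea from earlier (columns: old_path_without_extension,new_url):

        # Emit direct 301s for both legacy extensions of each page, so
        # neither has to hop through the other on its way to the new site.
        import csv

        with open("redirect-map.csv", newline="") as src, \
             open("redirects.conf", "w") as out:
            for old_path, new_url in csv.reader(src):
                out.write(f"Redirect 301 {old_path}.html {new_url}\n")
                out.write(f"Redirect 301 {old_path}.php {new_url}\n")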

  • Hi all, I am learning SEO for mobile apps on Google Play and iTunes, and I'm looking for tips or experience with app store SEO. Please give me some advice. Thanks, all.

    | Anhlebksp
    0

  • Hi all, I have a site that is ranking #1 in Google Places for its main '<city> <keyword>' search... but it does not rank for any of its basic keyword variations, which I find very confusing. For example: Chicago Caterer (ranked #1 in google places)
    Chicago Caterers (not ranked in google places)
    Chicago Catering (not ranked in google places)
    Chicago Catering Company (not ranked in google places)
    Chicago Catering Companies (etc.) How can I secure a Google Places ranking for these simple keyword variations? Do I build links to the Google+ page using that anchor text? Do I get citations that contain that keyword somewhere on the page? Do I optimize for these keyword variations on the actual website itself (not the Places listing)? Obviously I don't stuff these keywords into the Google Places listing. Any help would be much appreciated!

    | x2264983x
    0

  • For our company website faithology.com we are attempting to block any URLs that contain a ? mark, to keep Google from seeing some pages as duplicates. Our robots.txt is as follows:
        User-Agent: *
        Disallow: /*?
        User-agent: rogerbot
        Disallow: /community/
    Is the above correct? We want them not to crawl any URL with a "?" inside; however, we don't want to harm ourselves in SEO. Thanks for your help!

    | BMPIRE
    0

  • Hi We are setting up a new domain to focus on a specific product and want to use some of the content from the original domain on the new site and remove it from the original. The content is appropriate for the new domain and will be irrelevant for the original domain and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice for this to avoid duplicate content and a potential Panda penalty?

    | Citybase
    0

  • I know that Google does not want to index "search result" pages for a lot of reasons (dup content, dynamic URLs, blah blah). I recently optimized the entire IA of my sites to have search-friendly URLs, which includes search result pages. So, my search result pages changed from: /search?12345&productblue=true&id789 to /product/search/blue_widgets/womens/large As a result, Google started indexing these pages thinking they were static (no opposition from me :)), but I started getting WMT messages saying they are finding a "high number of URLs being indexed" on these sites. Should I just block them altogether, or let it work itself out?

    | rhutchings
    0

  • I am building out a brand new site. It's built on Wordpress so I've been tinkering with the themes and plug-ins on the production server. To my surprise, less than a week after installing Wordpress, I have pages in the index. I've seen advice in this forum about blocking search bots from dev servers to prevent duplicate content, but this is my production server so it seems like a bad idea. Any advice on the best way to proceed? Block or no block? Or something else? (I know how to block, so I'm not looking for instructions). We're around 3 months from officially launching (possibly less). We'll start to have real content on the site some time in June, even though we aren't planning to launch. We should have a development environment ready in the next couple of weeks. Thanks!

    | DoItHappy
    0

  • Hello, SEO Gurus! First off, my many thanks to this community for all of your past help and perspective. This is by far the most valuable SEO community on the web, and it is precisely because of all of you being here. Thanks! I've recently kicked off a robust niche biotech news publishing site for a client, and in the first 6 weeks we've generated 15K+ views and 9,300 visits. The site is built on the WordPress platform. I'm well aware that a best practice is to noindex tag and category pages, as I've heard SEOs say that they potentially lead to duplicate content issues. We're using tags and categories heavily, and to date we've had just 282 visits from tag & category pages. So that's 2.89% of our traffic; the vast majority of traffic has landed on the homepage or article pages (we are using author markup). Here's my question, though, and it's more philosophical: do these pages really cause a duplicate content issue? Isn't Google able to determine that said page is a tag page, and thus not worthy of duplicate content penalties? If not, then why not? To me, tag/category pages are sometimes better pages to have ranked than article pages, since, for news especially, they potentially give searchers a better search result (particularly for short-tail keywords). For example, if I write articles all the time about the "Mayo Clinic," I'd rather have my evergreen "Mayo Clinic" tag page rank on page one for the keyword "mayo clinic" than just one specific article that very quickly drops out of the news cycle. Know what I mean? So, to summarize: 1. Are indexed tag/category pages really a duplicate content problem, and if so, why the heck? 2. Is there a strategy for ranking tag/category pages for news publishing sites ahead of article pages? Thanks as always for your time and attention. Kind Regards, Mike

    | RCNOnlineMarketing
    0

  • I had a client ask me why his website wasn't popping up in the local Places results when he typed "plumbing". There was a group of Google local Places results, and my question is: how would I optimize for these with the Venice update? Any strategy? I didn't have an answer. Also, "sewer repair" brings in great results. Do I optimize for the keyword sewer repair? Any suggestions would help out greatly. Thanks,
    Greg

    | GregMontoya
    0

  • I have done and am doing everything I can think of to bring back traffic lost after the late 2012 updates from Google hit us. It just is not working. We had some issues with our out-of-house web developers which screwed up our site in 2012, and after taking it in-house we have been doing damage control for months now. We think we have fixed pretty much everything:
    - URL structure
    - filling up with good unique content (under way, lots still to do)
    - making better category descriptions
    - redesigned homepage
    - updated product pages (the CMS is holding things back on that part, otherwise they would be better; new CMS under construction)
    - started more link building (it's a real weak spot in our SEO as far as I can see)
    - audited bad links from dodgy irrelevant sites
    - hired writers to create content and link bait articles
    - begun making high-quality videos for both YouTube (brand awareness and viral) and on-site hosting (link building and conversions) (in the pipeline, not online yet)
    - flattened out site architecture
    - optimised internal link flow (got this wrong by using nofollows; in the process of thinking of a better way by reducing unwanted nav links on the page)
    I realise it's not all done, but I have been working ever since the drop in traffic and I'm just seeing no increase at all. I have been asking a few questions on here for the past few days but still can't put my finger on the issue. Am I just impatient and need to wait on the traffic, as I am doing all the correct things? Or have I missed something and need to fix it? If anyone would like to have a quick look at my site and see if there is an obvious issue I have missed, that would be great, as I have been tearing my hair out trying to find the issues with my site. It's www.centralsaddlery.co.uk. Criticism would be much appreciated.

    | mark_baird
    0

  • We have been using nofollow links to create a silo architecture. Is this a good idea, or should we stay away from using this on our site? It's an eCommerce site with about 3000+ pages, so we're not sure of the best architecture. Ideas and suggestions on best practice welcome!

    | mark_baird
    0

  • We just found out our password protected development site has been crawled.  We are worried about duplicate content - what are the best steps to take to correct this beyond adding to robots.txt?

    | EileenCleary
    0

  • Hello, Say you sell barbecue grills. When would it be appropriate to write a "Complete guide to Barbecuing" that offers everything: background, recipes, statistics, products, etc? The barbecue niche actually is better suited for a complete guide than my client's niche. My client doesn't sell barbecue grills but he is in a niche that doesn't have hardly any traffic to specific information. He's thinking of writing an article about the most general keyword directly in his niche, "A Complete Guide to X". Right now his Ecommerce home page is on page 17 for this keyword, it's competitive, and last year G Analytics showed 700 page views while on page 17. It would cover a lot more information than the barbecue example with heavy authority as competition. The article would be about 4000 words. There's nothing that complete out there but, again, there's a few very authoritative sources to compete against that we'd never outrank. We'd try to make it best-of-the-web. We're looking to get natural backlinks. We might do outreach but a more specific guide might work better for outreach, which we're also writing. Should he do this as one of his 5 articles for the site?

    | BobGW
    0

  • We have a longstanding site that sells media files with a large number of digital products. We offer a referral program with rewards for those that tell others about us. In the last year, we have seen some sites popping up, primarily in China, that appear to be spammy, advertisement-based copies of our own site, with product pages that link back to our actual products using a referral link/code (nofollow links). These sites stood out more when we noticed that some of their pages were outranking our own actual product pages. Any thoughts on this? Our affiliate policy states that the affiliate program is meant to help and not harm our site. In one sense this is traffic to our site, which is supposed to be a good thing, but if these pages are ranking above our own, that is not what we want. I would bet these pages get clicked on and, due to the spammy nature of these sites, the user bounces and never actually gets to our website. How would you handle something like this? Thanks!! Craig

    | TheCraig
    0

  • Hi All, I have a basic understanding of SEO and the various factors that contribute to higher search rankings. This question is specifically related to MozRank, which I understand to be defined as: pages earn MozRank by the number and quality of other pages that link to them; the higher the quality of the incoming links, the higher the MozRank. In my case, I am wondering if somebody could explain to me why I have a lower MozRank score than my competitor when I have both:
    1. A larger number of followable inbound links to my site, AND
    2. Among that larger number of followable inbound links, page authority (and domain authority) greater than that of the smaller number of links to my competitor's site.
    I have attached 3 images to help explain my points.
    Comparison Image: My site is on the left.
    Competitor Inbound: Shows a snippet of the volume and quality of inbound links to my competitor's site (filtered by highest page authority).
    My Inbound: Shows a snippet of the volume and quality of inbound links to my site (filtered by highest page authority). Any feedback or help is much appreciated!

    | Rogs.SEO
    0

  • Hi All, My site, http://domain.com, currently has a 302 redirect from http://domain.com to https://domain.com. The reason we have an SSL cert is that we have a registration and login form on the homepage (as well as login options in the top bar of every page on the site - email/password). My question is: when taking SEO best practices into account, would we be better off having a 301 redirect from http://domain.com to https://domain.com, as most of our inbound links are directed to http://domain.com? Thanks for any help!

    | Rogs.SEO
    0
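
    Editor's note: once the HTTPS version is the canonical one, the standard pattern is a sitewide 301 from HTTP, precisely so inbound links to http:// pass cleanly. A sketch assuming Apache with mod_rewrite:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]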

  • I have recently taken on a client that has been manually penalised for spammy link building by two previous SEOs. Having just read this excellent discussion, http://www.seomoz.org/blog/lifting-a-manual-penalty-given-by-google-personal-experience I am weighing up the odds of whether it's better to cut losses and recommend moving domains. I had thought under these circumstances it was important not to 301 the old domain to the new domain but the author (Lewis Sellers) comments on 3/4/13 that he is aware of forwards having been implemented without transferring the penalty to the new domain. http://www.seomoz.org/blog/lifting-a-manual-penalty-given-by-google-personal-experience#jtc216689 Is it safe to 301? What's the latest thinking?

    | Ewan.Kennedy
    0

  • Basically, we use a number of parameters in our URLs for event tracking. Google could be crawling an infinite number of these URLs. I'm already using the canonical tag to point at the non-tracking versions of those URLs... but that doesn't stop the crawling, though. I want to know if I can do conditional 301s or just detect the user agent as a way to know when NOT to append those parameters. I'm just trying to follow their guidelines about allowing bots to crawl without things like session IDs... but they don't tell you HOW to do this. Thanks!

    | KenShafer
    0

  • On top of clear on-page sponsored-content copy, would you add meta robots to noindex native advertising? Google recently came out against this type of content in Google News; is there a similar directive for the main index?

    | hankazarian
    0

  • I have a regional site that does fairly well for most towns in the area (top 10-20). However, one place that has always done OK and has great content is not anywhere within the first 200. Everything looks OK: the canonical link is correct, I can find the page if I search for exact text, and there aren't any higher ranking duplicate pages. Any ideas what may have happened, and how I can confirm a penalty, for example? TIA,
    Chris

    | Cornwall
    0

  • Within the context of a physical product launch, what are some ideas around creating a /coming-soon page that "teases" the launch? Ideally I'd like to optimize a page around the product, but the client wants to try to build consumer anticipation without giving too many details away. Any thoughts?

    | GSI
    0

  • What is the best way to use rel="canonical" on our website www.ofertasdeemail.com.br so that we can say goodbye to duplicated pages? I appreciate any help, and I hope to contribute to the SEOmoz community as well. Sincerely,
    Amador Goncalves

    | ZZNINTERNETMEDIAGROUP
    0
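
    Editor's note on the question above: rel="canonical" goes in the <head> of each duplicate variant and points at the single URL that should be indexed. An illustrative tag (the path is hypothetical):

        <link rel="canonical" href="http://www.ofertasdeemail.com.br/alguma-pagina/" />

    Duplicates that carry the tag consolidate their signals onto the canonical URL instead of competing with it.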
