
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi, We are just a small company trying to understand this SEO business. :O)  I hope you will give us your input on how we look and how we can improve. www.bakerbay.com Each week I go through the SEOmoz crawl and change the duplicate page titles, but that seems to increase the duplicate page content. Then the next week it finds other titles that were already fixed but are now flagged again. Please help. We are a bead company and a lot of our products have a similar description but different colors and item numbers. Could you please advise me on how to fix these errors and increase our ranking? All the Best, Beth

    Technical SEO | | BakerBayBeadCo
    0

  • Dumb question: why is the SEOmoz Toolbar reporting vastly different data than Open Site Explorer? I had assumed they pulled from the same data set. False assumption? Am I misinterpreting the metrics? The discrepancies I am most confused by are the differences in the number of linking root domains between OSE and the Toolbar. Please enlighten me.

    Moz Pro | | Gyi
    0

  • We own a small web design company that both creates the sites and then does the SEO marketing for them once they are created.  The business has grown to over 100 customers.   However, as we sell more sites we find our SEO team is somewhat short staffed.  So far, the area that suffers the most is the link building aspect of SEO. The sites we build are for dentists and usually only contain 10 to 20 pages.  Enough to list the services they offer and where they are located.  Here is my question: We need to get the most effective links possible for over 100 sites with just a couple of employees.  What would be the most effective links to try for?  We can't chase down every stray link for our doctors.  And, we don't have time to do in-depth research for each client each month, as we are already pushing ourselves to the limit.  In time, we will have more employees to help share the SEO load.  But for now, where should we be spending our time most effectively?  We can usually only budget one or two hours per client each month for linking. Thanks for any advice you can offer.

    Link Building | | Alex_Ratynski
    0

  • Both SEOMOZ and Google webmaster tools report lots of 404 errors throughout my wordpress blog. I have the url structure set to category/title Most of the 404 errors seem to be that the crawler is looking for a /home.html page. Each time I add a new post I get more 404 errors. I could, of course, add 301 redirects but I presume there is an easy way to do this within the WP setup. Any ideas? Thanks

    Technical SEO | | bjalc2011
    0

  • This is a strange one, and I hope a few local experts are out there. My client basically has one major competitor in the market. The competitor is closer to downtown and he is about 27 miles out. A couple of months ago, if you searched on "biplane rides in atlanta" the places map in the SERPS would show two - my client and his competitor. Now, the initial local in-line SERP just shows his competitor, zoomed in. If you go to Google Maps and type in the same search, he is listed, but you first have to click show more results. Then, he's listed twice - once with his airport address (which is the real one) and once with his registered business address (his house). How would I go about straightening this out? My client is #1 in the natural SERPS, it's just this local thing drives us crazy. If anyone can figure this out, you may walk away with a biplane ride next time you're in Atlanta! Thanks, Charles

    Competitive Research | | Chas-295721
    0

  • Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000.  So, I was thinking I should submit one sitemap for each news category. My question is how many page levels should each sitemap instruct the spiders to go?  Would it not be enough to just submit the top level URL for each category and then let the spiders follow the rest of the links organically? So, if I have 12 categories the total number of URLs will be 12??? If this is true, how do you suggest handling our home page, where the latest articles are displayed regardless of their category... so, i.e., the spiders will find links to a given article both on the home page and in the category it belongs to.  We are using canonical tags. Thanks, Jarrett

    Intermediate & Advanced SEO | | jarrett.mackay
    0
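One clarification worth sketching: sitemaps don't instruct spiders to follow links to any depth — only URLs explicitly listed are submitted, so a per-category sitemap would list every article URL in that category, and a sitemap index file ties the category sitemaps together. A minimal sketch of the index file (filenames are hypothetical), per the sitemaps.org protocol:

```python
from xml.sax.saxutils import escape

def sitemap_index(sitemap_urls):
    """Build a sitemap index pointing at one child sitemap per news
    category. Each child sitemap lists its category's article URLs
    (up to 50,000 per file); the index just enumerates the children."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )

xml = sitemap_index(
    [f"https://example.com/sitemap-category-{i}.xml" for i in range(1, 13)]
)
```

An article appearing on both the home page and its category page is normal internal linking, not a sitemap problem; the canonical tag already names the preferred URL.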

  • As my mobile numbers are going up in Google Analytics I'm trying to figure out what phone runs what browser.  I see that 93% of mobile users have Safari which I believe is iPhone, iPad and iPods.  There is a 6% usage of "Mozilla Compatible Agent".  What cell phone would this be?

    Reporting & Analytics | | LabadieAuto
    0

  • On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we are being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys. We've been looking for things for two days, and have not been able to find what we did wrong, at least not until tonight. I just logged back in to webmaster tools to do some more digging, and I saw that I had a new message. "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/" It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could even see one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way to try not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty. Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down? We are slowly canonical-izing urls and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. 
We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects. Since the webmaster tools notifications are different (ie: too many urls is a notice level message and doorway pages is a separate alert level message), and the too many urls has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with url structure. According to the help files, doorway pages is a content problem with a specific page.  The architecture suggestions are helpful and they reassure us that we should be working on them, but they don't help me solve my immediate problem. I would really be thankful for any help we could get identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong, I just don't know what it is! Thanks for any help identifying the problem! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem and we got nailed almost immediately when we instituted the 301 redirects.

    White Hat / Black Hat SEO | | CoreyTisdale
    0

  • I am trying to optimise an online shop. At the beginning I was focused on a small number of keywords, but now I am considering using the 'long tail' technique. Is there any software supporting/helping with 'long tail' SEO, as it seems impossible to optimise for all of those phrases? How should I link to subpages to get the best results? Any other information regarding 'long tail' SEO would be much appreciated. Best Regards

    Moz Pro | | g_stepinski
    0

  • Thought I'd see what the asking side of Q&A feels like 😉 We've been hearing for forever that the internet is running out of IP addresses, but I finally encountered the reality of it. I just realized that one of my sites is on a shared IP (hosted by Hosting.com, formerly HostMySite.com). My other sites with them included a unique IP, so I was surprised to discover this. They claim it's due to limitations on their IP allocations. Hosting.com doesn't have the option to buy a unique IP, but some other hosts do. I noticed, though, that many of them are using IPv6 for the new accounts. Has anyone had practical experience with having an IPv6 address and is there any impact on SEO?

    Technical SEO | | Dr-Pete
    0

  • I run a local business, and I'm working on ranking for keyword + city.  I currently rank on the first page for just about every keyword I'm working on, but only the top 3 for a little less than half.  Because the search volume is so low for each keyword (for most cities Google doesn't have an estimated monthly search volume) the grand total of a few searches a month for each keyword + city combination is where I get my traffic. Although I seem to be getting consistently higher in the rankings, I am curious as to how much more traffic I can expect.  I read somewhere that sites that are ranked number one are clicked 50% of the time, number two 20% of the time, number three 15% and from there on it goes down fast.  Rank 7 and on is below 1%.  Probably around 30% of my keywords are ranked between 7-10 and probably about 20% are ranked 4-6. Are the CTR numbers fairly accurate?  I understand that there are a lot of influences on CTR, such as title/description, but generally is that somewhat accurate? If it is, I am missing out on A LOT of traffic. I am pulling about 800 unique visitors a month from Google.  If I get in the top 3 for most of my keywords, can I expect significantly more traffic?  I ask the question because there are many other things I could be doing with my time to help the business aside from SEO.  I don't want to be working constantly on SEO if traffic is only going to increase very little.

    Algorithm Updates | | bjenkins24
    0
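The CTR curve quoted in the question can be used to rough out the upside; note that published CTR-by-position curves vary a lot by query type and SERP layout, so treat this as illustrative. The keyword mix and per-keyword search volumes below are hypothetical stand-ins:

```python
# CTR-by-position model using the figures from the question
# (real curves vary widely; this is only a rough estimate).
CTR = {1: 0.50, 2: 0.20, 3: 0.15, 4: 0.06, 5: 0.05, 6: 0.04}
CTR.update({rank: 0.01 for rank in range(7, 11)})  # positions 7-10: ~1%

def expected_visits(keywords):
    """keywords: list of (monthly_searches, current_rank) tuples."""
    return sum(vol * CTR.get(rank, 0.0) for vol, rank in keywords)

# Hypothetical mix: 50 keyword+city combos at ~10 searches/month each,
# half currently ranked 8, half ranked 5.
now = expected_visits([(10, 8)] * 25 + [(10, 5)] * 25)
top3 = expected_visits([(10, 2)] * 50)  # everything moved to position 2
```

Under these assumptions the jump is roughly 15 to 100 visits/month from that keyword set — a big relative gain, but whether it justifies the hours depends on what those visits are worth to the business.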

  • Most of the time, when we claim a Google Place Page, they give 2 choices to verify ownership:  1) phone verification and 2) postcard verification.  But right now (and for several weeks), for our listing, they are only giving the phone verification choice, which unfortunately won't work with our automated phone system.  How can we get our Place Page listing verified through a postcard sent to our address, when Google isn't presenting that as an option?

    On-Page Optimization | | DenisL
    0

  • To maximize SEO, would it be better to use a hyphen between two words in the name of a website? For instance, www.londonparis.com or www.london-paris.com. Would it be OK to use www.LondonParis.com Many thanks in advance, Ricardo

    On-Page Optimization | | RicardoMello
    0

  • It's quite easy to buy Facebook likes & fans these days. You can get 200 for five bucks at fiverr. I am sure that most of the likes are from fake accounts. But do you think it will have any effect anyway? And is this black hat SEO, or is it just collateral from trying to boost the # of likes / fans?

    Social Media | | Chrisper
    0

  • One of the sites I'm working on seems to have dropped a few spots in rankings. It has numerous home page tiles, but they are not really advertising; just links to different sections on the site. Does anyone think that might be a factor in the rankings drop?

    On-Page Optimization | | J.Marie
    0

  • Hi, I have recently started working in-house for a company, and one site development was started and completed just as I joined. A new area of the site has been developed, but the developers have built this new section in PHP, which cannot be hosted on the Windows server the site is running on (they tell me - is this correct?). They want to add the new section as a subdomain - http://newarea.example.co.uk/ - whereas I would have preferred the section added as a subfolder. I plan to ensure that future developments do not have this problem, but is the best solution to work with the subdomain (in this instance it may not be too bad as it is a niche area of the site), or can I redirect the pages hosted on the subdomain to a subfolder, and is this recommended? Thanks for your time.

    Technical SEO | | LSLPS
    0

  • How does the website www.starsdirectory.com.ar get a Domain Authority of 54 and a Page Authority of 61 when Google quite correctly gives it a PR 0? It is clearly a spam directory, which Google has recognised. It is very misleading using OSE or Campaign Management when sites such as these (and there are hundreds more we have found) are skewing the results of competitors through the use of spam links. Is there no way that SEOmoz tools can identify such spam sites when they create their ratings?

    Moz Pro | | paulsmithlondon
    0

  • I have separate sites for my blog and main website. I'd like to link them in a way that enables the blog to boost my main site's SEO. Is there an easy way to do this? Thanks in advance for any advice...

    Technical SEO | | matt-14567
    0

  • We have a massive site that is having some issue being fully crawled due to some of our site architecture and linking. Is it possible to have a XML sitemap index point to other sitemap indexes rather than standalone XML sitemaps? Has anyone done this successfully? Based upon the description here: http://sitemaps.org/protocol.php#index it seems like it should be possible. Thanks in advance for your help!

    Intermediate & Advanced SEO | | CareerBliss
    0
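Worth noting before trying it: the sitemaps.org protocol states that a sitemap index file may list sitemap files, not other index files, so nested indexes are generally not supported. A common workaround is to flatten: collect every child sitemap URL from the would-be sub-indexes into one top-level index. A minimal sketch (the URL lists are hypothetical):

```python
def flatten_indexes(indexes):
    """Merge the sitemap URLs from several would-be sub-indexes into
    a single flat list suitable for one top-level sitemap index.
    An index file is limited to 50,000 entries, so very large sites
    may need several top-level indexes submitted separately."""
    sitemaps = [url for index in indexes for url in index]
    if len(sitemaps) > 50000:
        raise ValueError("too many sitemaps for one index; split it up")
    return sitemaps

jobs = ["https://example.com/jobs-1.xml", "https://example.com/jobs-2.xml"]
companies = ["https://example.com/companies-1.xml"]
flat = flatten_indexes([jobs, companies])
```

Search engines also accept multiple index files submitted independently via webmaster tools, which achieves the same grouping without nesting.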

  • Who is the fastest hosting company? Which major provider has the fastest page load times? Looking for affordability like GoDaddy.

    Technical SEO | | bozzie311
    0

  • I've been tracking my link building activities in a spreadsheet using these columns: Date, URL, Action Taken, Status, Link Text used/requested. Can anyone share different or better ways of tracking what links you have requested and how you keep track of them?

    Link Building | | waynekolenchuk
    0

  • We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures.  Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context which will scroll to and highlight the linked verse.  However, this creates a significant amount of duplicate content.  For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight the verse 5 and verses 5-10 respectively.  This is a good user experience because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another bible site has separate html pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long tail chapter/verse queries.  However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse.  We hope this will improve indexing of some of the more popular verses.  However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea realizing that we can't revert back to including each chapter/verse on its own unique page?  We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description.  Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!

    Technical SEO | | LDS-SEO
    0
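One piece of this that lends itself to a sketch: since every verse-highlight URL renders the same chapter, a rel=canonical on each variant pointing at the chapter URL tells Googlebot the duplicates are intentional, which should make the sitemap safe to submit. A minimal sketch of deriving the canonical target, assuming the URL scheme shown in the question:

```python
import re

def chapter_canonical(url: str) -> str:
    """Collapse verse-highlight URLs like .../james/1.5 or
    .../james/1.5-10 down to the chapter URL .../james/1, which is
    what each variant's rel=canonical tag could point at."""
    return re.sub(r"(/\d+)\.\d+(?:-\d+)?$", r"\1", url)

print(chapter_canonical("http://lds.org/scriptures/nt/james/1.5-10"))
# -> http://lds.org/scriptures/nt/james/1
```

The trade-off to weigh: canonicalizing the variants consolidates link equity on the chapter page, but it also means the per-verse URLs won't rank individually, so it doesn't directly answer the long-tail chapter/verse queries the competitor wins — unique titles and verse-text descriptions help, but only on pages Google treats as distinct.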

  • We have a very large ecommerce store with little fresh content being added, except through a blog on the subdomain. We are thinking of moving over our blog, which is on another domain entirely and has a lot of active users. But first I want to make sure it will actually help the domain's rankings, and second, I'm concerned about the duplicate content on the old forum if we move it to the main domain. Should we just copy over all the content and 301 the old forum URLs to the new ones? Thanks much!

    Intermediate & Advanced SEO | | iAnalyst.com
    0

  • Hi everyone, I love the duplicate content feature; we have a lot of duplicate content issues due to the way our site is structured. So, we're working on them. However, I'm not fully understanding the results. For example, say I have an article on breast cancer symptoms. It shows up as duplicate content, by having two URLs that point to the exact same page: http://www.healthchoices.ca/articles/breast cancer symptoms and http://www.healthchoices.ca/somerandomstringofcode. I fully understand why that is duplicate content. I am not sure about this though: it picks up the same URL twice and calls it duplicate content. For example, saying that http://www.healthchoices.ca/dr.-so-and-so and http://www.healthchoices.ca/dr.-so-and-so are duplicates... however, is this not the same page? Is there something I'm missing? Many of the URLs are identical. Thanks, Erin

    Technical SEO | | erinhealthchoices
    0
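When a crawler reports "the same URL twice", the two strings often differ invisibly: host case, a trailing slash, percent-encoding, or a stray query string. A quick way to check is to normalize both before comparing; this is a sketch, and the second URL variant below is a hypothetical example of a difference that is easy to miss:

```python
from urllib.parse import urlsplit, urlunsplit, unquote

def normalize(url: str) -> str:
    """Lower-case scheme and host, decode percent-escapes, drop a
    trailing slash and any fragment, so that superficially different
    spellings of the same page compare equal."""
    s = urlsplit(url)
    path = unquote(s.path).rstrip("/") or "/"
    return urlunsplit((s.scheme.lower(), s.netloc.lower(), path, s.query, ""))

a = "http://www.healthchoices.ca/dr.-so-and-so"
b = "http://WWW.healthchoices.ca/dr.-so-and-so/"  # hypothetical variant
same = normalize(a) == normalize(b)
```

If both spellings resolve as separate live URLs, a rel=canonical (or a 301 from one form to the other) removes the duplicate; if they are truly byte-identical strings, it is worth reporting as a tool bug.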

  • I have been doing manual keyword research for some time now, but I am looking for something that can dig deep. I used to use the SEMrush API when I was a Raven Tools subscriber, and it was great for identifying organic rankings for domains. Has anyone had a good experience with this software strictly for keyword research?

    Competitive Research | | JordanGreve
    0

  • As of a day ago, the SERPs in Google are showing our listing with NO meta description at all and the incorrect title.  Plus, the title is varying based on the keywords searched. Info: Something I just had done was have the multiple versions of their home page (duplicate content, about 40 URLs or so) 301 redirected to the appropriate place.  I think they accidentally did 302s. Anyone seen this before? Thanks

    Technical SEO | | poolguy
    0

  • I've been meaning to try out the ETag header for a while now. It seems like a great way to tell the bot when to and when not to fetch your content. Furthermore, it is implied that proxy servers can make use of it and help your site load faster by not fetching a new copy when none is available. This is not something that is easy to test on a small site, and implementation on bigger sites is in my case a one-way road and a few weeks in hell with the developers on staff. Will ETags take some load off a site with a lot of traffic and dynamically generated content? Is this a good practice, as far as search engines are concerned?

    Intermediate & Advanced SEO | | Svetoslav
    0
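The mechanism can be sketched without touching production: the server hashes the rendered body into an ETag, and when a client revalidates with If-None-Match and the tag still matches, a 304 with an empty body is enough — that is where the bandwidth saving comes from on dynamic content (note the page must still be generated to compute the hash, so it saves transfer more than CPU). This is a minimal illustration, not a production handler:

```python
import hashlib

def make_etag(body: bytes) -> str:
    """A strong ETag derived from the rendered body, so it only
    changes when the content does."""
    return '"' + hashlib.md5(body).hexdigest() + '"'

def respond(if_none_match, body: bytes):
    """Conditional-GET handshake: matching If-None-Match gets a 304
    and an empty payload; anything else gets a full 200."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b"", etag
    return 200, body, etag

status, payload, tag = respond(None, b"<html>page</html>")   # first fetch
status2, payload2, _ = respond(tag, b"<html>page</html>")    # revalidation
```

Search engines generally support conditional requests, so correctly implemented ETags should be safe from an SEO standpoint; the risk is server-side bugs (e.g. the same ETag for different variants), which is where the developer time goes.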

  • A site I've been working on has been up since early January. The domain was not new, but no site or links existed prior. Link building finished 2 months ago. There are about 80 fairly high quality links, mostly from unique domains, and the site has been doing well with search engines for some time. OSE lists only 24 of these domains (for all pages on the root domain). OSE stats hardly changed in the last update, so here's my question: when will OSE data reflect the current reality?

    Link Building | | waynekolenchuk
    0

  • About a year ago, the business I work for purchased 20+ domains: sendmoneyfromcanada.com sendmoneyfromaustralia.com sendmoneyfromtheuk.com sendmoneyfromireland.com The list goes on, but you get the main idea. They thought that the domains could be useful to aid http://www.transfermate.com/ . I can set up a few microsites on them, but from that point there will be no one to maintain them. And I'm, honestly, not too happy with hosting multiple sites on one IP and having them all link to the flagship. It is spammy and it does not bring any value to end users. I might be missing something, so my question is - can I use these domains to boost my rankings, while avoiding any shady/spammy techniques? P.S. I had this idea of auctioning the domains in order to cover the domain registration fees.

    White Hat / Black Hat SEO | | Svetoslav
    0

  • Hello and Good Morning, I have a tech convergence question... In our social media package we have listed that we will subscribe to HubPages and Scribd. A lot of these services do the same thing (Twitter vs identi.ca, Color vs Instagram, Gowalla vs Foursquare, etc.) - what are some of the better services to use, and why? We have our favorites, but they may not necessarily be the most effective. Thanks!

    Social Media | | imageworks-261290
    0

  • I was reading some SEOmoz posts last week; for a long time now they have been running experiments with social media. I agree that a Facebook share is more powerful than a retweet, but it is also harder to get. I would like to know if you have had any GOOD experiences with Twitter or Facebook in your SEO campaigns. I have one: I have been getting 20 retweets per day since last week, and now my keyword is ranking at 2nd position. What about you? Thanks

    Social Media | | Ex2
    1

  • Hi All, We have a crawler problem on one of our sites www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter available products. These filters are passed as http/get arguments. The number of possible filter urls is virtually limitless. In order to prevent duplicate content, or an insane amount of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages. However, we’re unable to explain to crawlers (Google in particular) to ignore these urls. We’ve already changed the on page filter html to javascript, hoping this would cause the crawler to ignore it. However, it seems that Googlebot executes the javascript and crawls the generated urls anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin

    Intermediate & Advanced SEO | | footsteps
    0
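One point that may explain the behaviour: noindex/nofollow/noarchive keep the filter pages out of the index, but they do not stop Googlebot fetching them (the page has to be fetched to see the directives). Blocking the filter query parameters in robots.txt stops the crawling itself. A sketch of generating the rules; the parameter names are hypothetical stand-ins for the site's real filter arguments:

```python
def robots_disallow_params(params):
    """Emit robots.txt rules blocking any URL whose query string
    contains one of the given filter parameters (wildcard syntax as
    supported by Googlebot)."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: /*?*{p}=" for p in params]
    return "\n".join(lines)

txt = robots_disallow_params(["color", "size", "brand"])
```

Two caveats: robots.txt and noindex don't combine well (a blocked page's noindex is never seen, so pick one mechanism per URL set), and Google's URL parameter handling in Webmaster Tools offers a gentler alternative for the same problem.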

  • I'm tracking keywords. In the past I used Rank Checker. I have compared Rank Checker and SEOmoz's Rank Tracker and they give different results. Which one is more accurate?

    Moz Pro | | PlusPort
    0

  • Hi, We are running a website in Spain to teach touch typing to children (www.mecanografia.com). Our main keyword is mecanografia (touch typing in Spanish), which is officially written with an accent on the i. Our SEOmoz on-page optimization report was initially an F because we set the keyword mecanografia without an accent while on the page we use the grammatically correct version with an accent on the i. Once we added the keyword with the accent to SEOmoz, our report was upgraded to an A. Our question is: how does Google treat accents? Is it necessary to optimize for words with and without the accent, or does one version suffice? Users search about 50% of the time without using the accent. Any advice is greatly appreciated!

    On-Page Optimization | | Mecanografia
    0
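A small utility that comes up when comparing the two forms: folding the accented spelling to the way many users type it. Google generally treats accented and unaccented forms as closely related, but impressions and rankings can still differ per variant, so it is worth checking both spellings in keyword tools rather than assuming they merge:

```python
import unicodedata

def strip_accents(text: str) -> str:
    """Decompose characters (NFD), then drop the combining marks,
    e.g. 'mecanografía' -> 'mecanografia'."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(strip_accents("mecanografía"))  # -> mecanografia
```

On the page itself, keeping the grammatically correct accented form is usually the right call for users; the unaccented variant tends to match through Google's own normalization rather than through on-page repetition.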

  • Obviously, websites can link to another website using iframes, and Google and other search engines do seem to have some capability to index the content. What I want to know is what difference there is in the value passed between a regular link and an iframe link. Will this iframe: <iframe src="http://www.targetwebsite.com/targetlink"></iframe> have the same PageRank effect on the target website as this link: <a href="http://www.targetwebsite.com/targetlink">link</a>? Alternatively, would it work to include an invisible actual link right before the iframe, like this: <a style="display: none;" href="http://www.targetwebsite.com/targetlink">link</a><iframe src="http://www.targetwebsite.com/targetlink"></iframe>? The reason is that we are building a product recommendation engine for a branded cosmetology school in order to get our concept salons to link back to us, and I was curious whether creating a version they can use as an iframe will give us link benefit.

    Link Building | | incept
    0

  • Hey Guys Just wanted to get some friendly feedback on ways that you like to promote linkbait. Personally, I like to: a) develop the companies main social media channels (twitter and facebook) b) develop accounts on digg and reddit etc c) email journalists and blog owners who appear to be part of my clients 'linkerati' Has anyone got any interesting approaches or strategies for getting the viral ball rolling?

    Intermediate & Advanced SEO | | SebastianDyer
    0

  • I want to move a site of mine that ranks #1 for many keywords to a new domain name.  I have already redirected many smaller, less important sites to the new domain, but have held off on my most popular site.   If I redirect the entire site with a 301 redirect, what can I expect with my number one ranking, particularly for coveted search terms? Thanks for the input.

    Technical SEO | | insync
    0

  • Having read Rand's post about the canonical tag, I very much wish to use it to advise Google that the duplicates created during archiving (due to the fact that the posts have multiple categories) are just copies. I understand the theory, but can't transfer it into practice!  Could someone give me an idiot's guide as to how to add the tag, and where?  My site produces approx 10-20 blog posts per week.  Each has at least 2-4 categories applied to it.  They are archived each month, at which point I have a big jump in duplicates in my campaign panel. Help!

    Content Development | | catherine-279388
    1

  • Total, total noob question, I know - but is rogerbot performance bound by bandwidth and processing capacity? I understand if it is, but I am wondering, for those of us with very large sites, if we would be able to offload the burden on SEOmoz resources by running our own locally licensed version of rogerbot, crawling the sites we want, and then uploading the data to SEOmoz for analysis. If this were possible, would we get more immediate results?

    Technical SEO | | AspenFasteners
    0

  • We are considering using a video hosting service like Vzaar or Wistia for our videos. What are some of the SEO advantages of using a video hosting service rather than YouTube?

    Content Development | | SparkplugDigital
    0

  • If you had to choose one person to represent the black hat community who would it be?

    Industry News | | JasonJackson
    0

  • When I first signed up about a week and a half ago I paid for the DVD's but not only did I not get a confirmation, but my 2 emails requesting a status went unanswered. A question about the page crawl gets answered right away. So is this co. all tech? Sales dept. just takes money, then hides? What's up?

    Link Building | | joemas99
    0

  • Hi all, I have a listings based website that just doesn't seem to want to pass rank to the inner pages. See here for an example: http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK I know that there are far too many links on this page, and I am working on reducing the number by altering my grid classes to output fewer links. The page also displays a number of links to other page numbers for these results. My script adds the string " - Page2" to the end of the title, description and URL when the user clicks on page two of these results. My question is: Would an excessive amount (200+) of links on a page result in less PR being passed to this page (looking spammy)? And would using rel canonical on page numbers greater than 1 result in better trust/ranking? Thanks in advance.

    Intermediate & Advanced SEO | | Mulith
    0
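On the pagination half of the question, a sketch of one common head-tag scheme for paginated listings (URL pattern and helper are hypothetical): let each page self-canonicalize and point rel=prev/next at its neighbours, rather than canonicalizing page 2+ to page 1 — page 2+ is not a duplicate of page 1, so a cross-page canonical is likely to be ignored:

```python
def pagination_tags(base_url: str, base_title: str, page: int, last_page: int):
    """Head tags for one page of a paginated listing: page 1 keeps the
    clean title/URL, later pages get the ' - PageN' suffix, and every
    page canonicalizes to itself with rel=prev/next linking neighbours."""
    url = base_url if page == 1 else f"{base_url}?page={page}"
    tags = {
        "title": base_title if page == 1 else f"{base_title} - Page{page}",
        "canonical": url,
    }
    if page > 1:
        tags["prev"] = base_url if page == 2 else f"{base_url}?page={page - 1}"
    if page < last_page:
        tags["next"] = f"{base_url}?page={page + 1}"
    return tags

t = pagination_tags("http://example.com/Buy/Hotels", "Hotels For Sale", 2, 5)
```

As for the 200+ links: PageRank is divided among a page's outgoing links, so fewer, better-targeted links do concentrate more equity on the pages that matter, quite apart from any "spammy" appearance.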

  • I've had a horrible experience with the security on wordpress hosting with GoDaddy.  Someone recommended Blue Host as my next option.   Does anyone have any experience with BlueHost and what other hosting companies would you recommend for wordpress hosting?

    Web Design | | ChristineCadena
    0


