
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • The product pages of my ecommerce site contain schema markup.
    According to Google's rich snippet testing tool, all looks fine and properly formatted.
    However, in the SERP Google shows the last rating in the snippet instead of the aggregate rating. Any idea how to show the aggregate rating? URL: https://www.humidordiscount.co.uk/adorini-triest-deluxe-rosewood-humidor Below is an extract of the schema as recognized by the rich snippet tool:
    Product: name: adorini Triest Deluxe Rosewood Humidor; image: https://www.humidordiscount.co.uk/952-large_atch/adorini-triest-deluxe-rosewood-humidor.jpg; description: High-quality multiple .....
    brand [Organization]: name: Adorini
    offers [Offer]: price: 141; priceCurrency: GBP; availability: http://schema.org/InStock
    aggregateRating [AggregateRating]: worstRating: 1; ratingValue: 4.5; bestRating: 5; ratingCount: 460
    review [Review]: description: It's my first ....; reviewRating [Rating]: ratingValue: 4; worstRating: 1; bestRating: 5; author [Thing]: name: Alessandro M
    review [Review]: description: excellent ...; reviewRating [Rating]: ratingValue: 5; worstRating: 1; bestRating: 5

    Conversion Rate Optimization | | lcourse
    0
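A hedged illustration of the markup shape that usually lets the aggregate figure surface (values copied from the extract above; this is a sketch, not a verified fix): keep a single AggregateRating directly on the Product. If the SERP still shows an individual review's value, testing with the per-review Rating blocks removed is a common diagnostic step.

```json
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "adorini Triest Deluxe Rosewood Humidor",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "460",
    "bestRating": "5",
    "worstRating": "1"
  },
  "offers": {
    "@type": "Offer",
    "price": "141",
    "priceCurrency": "GBP",
    "availability": "http://schema.org/InStock"
  }
}
```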

  • Hello Moz community, Let's suppose you're working on a 600+ page website and you are working on lots of keywords. I'd like to know if you have a database / Excel sheet / tool to track which keywords you've been targeting so that you don't create the same content twice? Thanks for your answers

    Keyword Research | | Sindicic_Alexis
    0

  • Hi, New here in the SEO world. Excellent resources here. We have an ecommerce website that sells presentation templates. Today our templates come in 3 flavours - for PowerPoint, for Keynote, and both - called Presentation Templates. So we've ended up with 3 URLs with similar content. Same screenshots, similar descriptions. Example: https://www.improvepresentation.com/keynote-templates/social-media-keynote-template https://www.improvepresentation.com/powerpoint-templates/social-media-powerpoint-template https://www.improvepresentation.com/presentation-templates/social-media-presentation-template I know what you're thinking. Why not make a website with a template and give 3 download options, right? But what about https://www.improvepresentation.com/powerpoint-templates/ and https://www.improvepresentation.com/keynote-templates/ These are powerful URLs in my opinion, taking into account that the strongest keyword in our field is "powerpoint templates". How would you solve this "problem", or maybe there is no problem at all?

    Technical SEO | | slidescamp
    0

  • Hi Guys, Just wondering what is the best way to find forums in your industry?

    Intermediate & Advanced SEO | | edward-may
    2

  • Hi.  Just wondering what you all thought would be the best way to estimate a customer base for a certain niche? I'd be willing to pay for this service, if there are companies that do this sort of thing.  But I'm assuming I'll be able to do it for free... How much weight should I be putting into Facebook ad data? Any advice?

    Search Behavior | | PedroAndJobu
    0

  • Hi ! I have 7 domains that I bought that point to the same webspace as my main domain. In Open Site Explorer they are shown as spam links. So to solve the issue I redirected the links to an empty subdirectory on the same server, which is different from the directory the main domain is linking to. But nevertheless the domains are still showing up as spam. Why might that be? What can I do to get rid of these domains? In fact I only need the main domain. Cheers, Marc

    Intermediate & Advanced SEO | | RWW
    0

  • I'm thinking of starting a new blog, but when I did my keyword research I found that my keywords all have low search volume (under 100 searches per month, with the occasional keyword having 480 searches a month). Is this a deal breaker? Any recommendations would be great - thanks everyone!

    Content Development | | Trevorneo
    1

  • Hi Mozers, I thought a while ago I heard that buying backlinks was a no-go, until I saw and read this article.  I notice the guy that wrote the article suggested that you can buy backlinks from Fiverr, and also to just make sure they are do-follow backlinks.  Can someone please correct me and perhaps clear up my confusion over this?  As far as I knew, it was best to build backlinks by doing guest posting and engaging in relevant forums? Here's the article: http://socialmediafuze.com/10-backlink-strategies-business/ Thanks guys

    Intermediate & Advanced SEO | | edward-may
    2

  • I am using Moz Local UK on behalf of a customer of mine. There are two verified listings when I do a search for their business name and postcode. There are a couple of slight variations in the two verified NAPs: one says Road, the other says Rd; one says their county (East Sussex), the other says their district (Brighton & Hove). Both their Facebook and Google+ pages say Road and East Sussex. Both verified listings are wrong and in conflict with the Facebook and Google+ accounts. Not too sure where to go with this and would appreciate any advice :0)

    Moz Local | | Nanuq
    0

  • Hey gang, I hate to write to you all again with more bad news, but such is life. Our big data team produced an index this week but, upon analysis, found that our crawlers had encountered a massive number of non-200 URLs, which meant this index was not only smaller, but also weirdly biased. PA and DA scores were way off, coverage of the right URLs went haywire, and the metrics we use to gauge quality told us this index simply was not good enough to launch. Thus, we're in the process of rebuilding an index as fast as possible, but this takes, at minimum, 19-20 days, and may take as long as 30 days. This sucks. There's no excuse. We need to do better and we owe all of you and all of the folks who use Mozscape better, more reliable updates. I'm embarrassed and so is the team. We all want to deliver the best product, but continue to find problems we didn't account for, and have to go back and build systems in our software to look for them. In the spirit of transparency (not as an excuse), the problem appears to be a large number of new subdomains that found their way into our crawlers and exposed us to issues fetching robots.txt files that timed out and stalled our crawlers. In addition, some new portions of the link graph we crawled exposed us to websites/pages that we need to find ways to exclude, as these abuse our metrics for prioritizing crawls (aka PageRank, much like Google, but they're obviously much more sophisticated and experienced with this) and bias us to junky stuff which keeps us from getting to the good stuff we need. We have dozens of ideas to fix this, and we've managed to fix problems like this in the past (prior issues like .cn domains overwhelming our index, link wheels and webspam holes, etc. plagued us and have been addressed, but every couple of indices it seems we face a new challenge like this). Our biggest issue is one of monitoring and processing times. 
We don't see what's in a web index until it's finished processing, which means we don't know if we're building a good index until it's done. It's a lot of work to re-build the processing system so there can be visibility at checkpoints, but that appears to be necessary right now. Unfortunately, it takes time away from building the new, realtime version of our index (which is what we really want to finish and launch!). Such is the frustration of trying to tweak an old system while simultaneously working on a new, better one. Tradeoffs have to be made. For now, we're prioritizing fixing the old Mozscape system, getting a new index out as soon as possible, and then working to improve visibility and our crawl rules. I'm happy to answer any and all questions, and you have my deep, regretful apologies for once again letting you down. We will continue to do everything in our power to improve and fix these ongoing problems.

    API | | randfish
    11

  • I'm having 2 possibly related issues with a Moz Local listing. The first is that the company the listing is for was an LLC when we put the listing up, but it has since changed to a C corporation, so the name needs to change from ________ LLC to _________, Inc. I updated the name on their validated Google+ local business listing a few weeks ago, and Moz Local sees that listing, but the name on Moz Local hasn't updated yet. The second issue is that it's not verifying through Facebook, and the Facebook page that Moz Local CAN see is the wrong one. I edited the listing and manually added the URL of the correct Facebook page a few weeks ago, but that still hasn't gone through either. One last thing, if it makes any difference, is that the company's website does have a link directly to the proper Facebook page. Hopefully there's a quick and/or easy way to resolve this. Thanks!

    Moz Local | | BrianAlpert78
    0

  • Hello, please comment on which you think is best SEO practice for each, and any comments on link juice flowing through.
    Title text (on product page):
    <title>Brandname ProductName</title>
    OR
    <title>ProductName by Brandname</title>
    On category page:
    <a itemprop="name" href="[producturl]">ProductName</a>
    <a itemprop="brand" href="[brandurl]">BrandName</a>
    OR
    <a itemprop="name" href="[producturl]">BrandName ProductName</a> (leave brand link out)
    On product page:
    <a itemprop="name" href="[producturl]">ProductName</a>
    <a itemprop="brand" href="[brandurl]">BrandName</a>
    OR
    <a itemprop="name" href="[producturl]">BrandName ProductName</a> (leave brand link out)
    Thoughts?

    Intermediate & Advanced SEO | | s_EOgi_Bear
    0

  • Hello, what is best practice for rich snippets / structured data on ecommerce category pages? I put structured markup in the category pages and it seems to have negatively impacted SEO.  Webmaster Tools is showing about a 2.5:1 products-to-pages ratio. Should I be putting structured data in category pages at all? Thanks for your time 🙂

    Intermediate & Advanced SEO | | s_EOgi_Bear
    0

  • We are trying to select our "business type" for MozLocal. It's not clear what the implication is by choosing one over the other: brick and mortar or service area? Does anyone understand the criteria to choose one over the other, and when and how is this used by local listings? Thanks.

    Moz Local | | partnerscreative
    0

  • According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add in some HTML code and write the review. How does Google do any validation on whether these reviews are legitimate or not?

    White Hat / Black Hat SEO | | wlingke
    0

  • Hey Moz, Overall we love your product and are using it daily to help us grow; part of that has been to rely on the Moz Index for DA and PA, as well as places where we are doing positive linking through genuine partnerships and reviews of clients. We were really excited to see the results for this month as we have been partner linked from lots of high-reputation sites and Google seems to agree, as our rankings are moving up weekly. The question from our marketing team is: since a significant part of Moz will not be available to us this month, will there be any compensation handed out to the paying community? PS: I am an engineer and I know you have probably lost a very large set of data which can't simply be re-crawled overnight, but Moz Pro is not a cheap product and we do expect it to work. Source: https://moz.com/products/api/updates Kind Regards.

    API | | SundownerRV
    0

  • I am looking into a .htaccess file for a site I look after and have noticed that the URLs are all 301 redirected from a non-slash directory to a trailing-slash directory/folder, e.g. www.domain.com/folder gets 301 redirected to www.domain.com/folder/. Will this do much harm, and will the effect of any links pointing to the site be lessened? Secondly, I am not sure what part of my .htaccess is causing the redirect:
    RewriteCond %{HTTP_HOST} !^www.domain.co.uk [NC]
    RewriteCond %{HTTP_HOST} !^$
    RewriteRule ^(.*) http://www.domain.co.uk/$1 [L,R,NE]
    RewriteCond %{THE_REQUEST} ^.*/index.php
    RewriteRule ^(.*)index.php$ /$1 [R=301,L]
    Or could a WordPress ifmodule be causing the problem? Any info would be appreciated.

    Technical SEO | | TimHolmes
    0
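Neither rule quoted in the question appends a trailing slash; that behaviour more commonly comes from WordPress's own canonical redirect or from Apache's mod_dir (DirectorySlash, on by default) when the URL matches a real directory. For comparison, a trailing-slash rule written in .htaccess typically looks like this (a sketch, not taken from the site's actual file):

```apache
RewriteEngine On
# If the request maps to a real directory and lacks a trailing slash, add one.
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.+[^/])$ $1/ [R=301,L]
```

As for harm: a single, consistent 301 to the slashed version is generally considered normal practice, provided internal links point at the slashed URLs directly so visitors and crawlers don't hop through the redirect on every click.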

  • Should I use the word "blog" in my sub folder, as in http://www.mybusiness.com/blog, or should I use http://www.mybusiness.com/news? Is there a difference for when my site is crawled? I understand that a blog works a little differently. Can someone explain the basics?

    On-Page Optimization | | graemesanderson
    0

  • Hi guys, I'd love to hear your opinion on how to handle this. We have a www.site.com business site. We expanded our business to Italy a few years ago with an Italian ccTLD with a www.site.it version. A business decision has been made to expand into several new territories, but we will be going with a subdirectory structure for each country: .com/se, .com/fi etc. When we're setting up the hreflang tags for this, the language/region all need to be cross annotated. This is all fine. The only anomaly is the site.it version already existing. This site will continue to exist as it suits its market context so well. In terms of annotation, should the hreflang include the .it site, or is the ccTLD the only signal Google needs to serve the correct version to Italian searchers? But the hreflang tag needs to declare all available language versions, and it is possible to include different domains etc. Please let me know your thoughts on this 🙂 My gut feeling is that it should be included? Thanks!

    International SEO | | AdiRste
    0
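The gut feeling matches how hreflang is documented to work: annotations can span domains, so the .it ccTLD is simply listed as one alternate among the others. A sketch of the cross-annotation (URLs and locales assumed from the question; the same full set goes on every version, including site.it):

```html
<link rel="alternate" hreflang="en" href="https://www.site.com/" />
<link rel="alternate" hreflang="it-it" href="https://www.site.it/" />
<link rel="alternate" hreflang="sv-se" href="https://www.site.com/se/" />
<link rel="alternate" hreflang="fi-fi" href="https://www.site.com/fi/" />
```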

  • I've got a client with multiple business locations however when I type the company name into Google it's only displaying 2/3 of the branches in the local pack for that SERP. The client's picked up on this and would like the third location to appear in the local pack when people search for their company name. I have a suspicion that it's because the third location (the one that isn't displaying) has an address that's exactly the same as several other businesses that are located in offices directly above or to the side of them. My (very flakey) theory is that Google is perhaps uncertain about the exact location of this business given that there are several others with the same address but different business names, so the NAP consistency is being diluted and Google is simply leaving them out of the local pack due to the uncertainty over which business is in fact located at 2 West Street. So my question is, has anyone else had any issues of not all business locations showing in the local pack for a brand name query and if so how did you solve it?

    Local Listings | | PeteW
    0

  • Hi Guys, I have a question... I am currently working on a website that was hit by a spam attack. The website was hacked and thousands of adult censored pages were created on the WordPress site. The hosting company cleared all of the dubious files, but this has left thousands of dead 404 pages. We want to fix the dead pages, but Google Webmaster Tools only shows and allows you to download 1,000. There are a lot more than 1,000... does anyone know of any good tools that allow you to identify all 404 pages? Thanks, Duncan

    Intermediate & Advanced SEO | | CayenneRed89
    0
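One free route around the 1,000-row Webmaster Tools limit is pulling every 404 straight from the server access logs. A sketch (assuming Apache/Nginx combined log format and Node.js; the log format and field positions are assumptions, so adjust for your server):

```javascript
// Extract unique paths that returned a 404 from an access log in
// combined format: IP - - [date] "METHOD /path HTTP/1.1" STATUS SIZE ...
function extract404s(logText) {
  const hits = new Set();
  for (const line of logText.split("\n")) {
    const m = line.match(/"(?:GET|POST|HEAD) (\S+) HTTP\/[\d.]+" 404 /);
    if (m) hits.add(m[1]);
  }
  return [...hits].sort();
}

// Example with two fake log lines:
const sample =
  '1.2.3.4 - - [10/Sep/2015:10:00:00 +0000] "GET /bad-page HTTP/1.1" 404 512 "-" "bot"\n' +
  '1.2.3.4 - - [10/Sep/2015:10:00:01 +0000] "GET /ok HTTP/1.1" 200 1024 "-" "bot"';
console.log(extract404s(sample)); // [ '/bad-page' ]
```

A desktop crawler (e.g. Screaming Frog) finds 404s that are still internally linked, but only the logs reveal dead URLs that nothing links to anymore.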

  • Three different SEO guys suggest moving my blog to another platform. " I think migrating to a more robust CMS platform like Drupal or Wordpress would be a wise decision." But, they never say why moving would benefit SEO. My blog is on a custom domain, has lots of original content and has decent organic traffic to begin with. I think I have other SEO issues to deal with before bothering with a new platform. Does blogger stink for SEO? Why?

    Web Design | | Eric_haney
    0

  • Hi, I am loading 3 CSS files here: http://www.viatrading.com/wholesale/9/Domestics.html PageSpeed is telling me I "should fix" the delivery of these CSS Files (see image). I read https://developers.google.com/speed/docs/insights/OptimizeCSSDelivery , but can't figure out which is my case. The CSS are big, but even if I split them in several, all CSS files are still showing up as render-blocking. I moved them to the header/footer, but the message is still appearing. Do you know what might be the problem and how to solve it? Thank you, Screen_Shot_2015_09_10_at_4_44_23_PM.png

    On-Page Optimization | | viatrading1
    0

  • Hi, We have had a strongly ranking site since 2004. Over the past couple of days, our Google traffic has dropped by around 20% and some of our strong pages are completely disappearing from the rankings. They are still indexed, but having ranked number 1 are nowhere to be found. A number of pages still remain intact, but it seems they are increasingly disappearing. Where should we start to try and find out what is happening? Thanks

    Intermediate & Advanced SEO | | simonukss
    0

  • Hello Moz Community, I'm working with a client who has translated their top 50 landing pages into Spanish. It's a large website and we don't have the resources to properly translate all pages at once, so we started with the top 50. We've already translated the content, title tags, URLs, etc. and the content will live in its own /es-us/ directory. The client's website is set up in a way that all content follows a URL structure such as: https://www.example.com/en-us/. For Page A, it will live in English at: https://www.example.com/en-us/page-a For Page A, it will live in Spanish at https://www.example.com/es-us/page-a ("page-a" may vary since that part of the URL is translated) From my research in the Moz forums and Webmaster Support Console, I've written the following hreflang tags: /> For Page B, it will follow the same structure as Page A, and I wrote the corresponding hreflang tags the same way. My question is, do both of these tags need to be on both the Spanish and English version of the page? Or, would I put the "en-us" hreflang tag on the Spanish page and the "es-us" hreflang tag on the English page? I'm thinking that both hreflang tags should be on both the Spanish and English pages, but would love some clarification/confirmation from someone that has implemented this successfully before.

    International SEO | | DigitalThirdCoast
    0
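The poster's instinct matches Google's documented requirement: hreflang annotations must be bidirectional and self-referential, so both tags go on both versions. A sketch using the question's URL pattern (the identical block appears on each page):

```html
<!-- On https://www.example.com/en-us/page-a AND on the Spanish equivalent -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page-a" />
<link rel="alternate" hreflang="es-us" href="https://www.example.com/es-us/page-a" />
```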

  • Very curious situation. We have a network of sites. Sunday night one (only one) of our sites goes down, and since then we've seen a loss in traffic across all our sites!! Not only have we seen a loss of traffic, we also saw a loss of indexed pages. A complete drop-off from 1.8 million to 1.3 million pages indexed. Does anyone know why one site outage would affect the rest of them? And the indexed pages? Very confused. Thanks,

    Technical SEO | | TMI.com
    0

  • Under On-Page Elements in the Moz bar there is a Tag/ Location called Bold/ Strong. What does that mean?

    Moz Bar | | TiffanyatElite
    0

  • Hello: There was a Mozscape Index scheduled for 9/8/2015 and now it got pushed back to October 8, 2015. There seem to be a lot of delays with the Mozscape Index. Is this something we should expect? Updates every 2 months instead of every month? Thanks!

    API | | sderuyter
    1

  • I found an example of the snippet to use:
    _gaq.push(['_setCustomVar',
        1,              // first slot
        'user-type',    // custom variable name
        'visitor',      // custom variable value
        2               // custom variable scope - session-level
    ]);
    Once the visitor logs into your website, we change this code accordingly:
    _gaq.push(['_setCustomVar',
        1,              // first slot
        'user-type',    // custom variable name
        'regular-user', // custom variable value
        2               // custom variable scope - session-level
    ]);
    How does the code know to change from 'visitor' to 'regular-user' once a user logs in? Is the snippet only placed on the login page?

    Reporting & Analytics | | Evan34
    0
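The snippet itself doesn't "know" anything: typically the page template picks the value at render time, on every page (not just the login page), based on the visitor's session. A sketch of that idea (the function and session shape are illustrative, not part of GA's API):

```javascript
// The server/template decides the custom variable value per page load,
// based on whether the current session is authenticated.
function userTypeFor(session) {
  return session && session.loggedIn ? "regular-user" : "visitor";
}

// The chosen value is then written into the _setCustomVar call that is
// emitted on every page of the site:
const _gaq = [];
_gaq.push(["_setCustomVar", 1, "user-type", userTypeFor({ loggedIn: true }), 2]);
console.log(_gaq[0][3]); // "regular-user"
```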

  • Hi- I have a client that had thousands of dynamic php pages indexed by Google that shouldn't have been.  He has since blocked these php pages via robots.txt disallow.  Unfortunately, many of those php pages were linked to by high quality sites multiple times (instead of the static urls) before he put up the php 'disallow'. If we create 301 redirects for some of these php URLs that are still showing high value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs?  Or will the robots.txt keep Google away and we lose all these high quality backlinks?   I guess the same question applies if we use the canonical tag instead of the 301.  Will the robots.txt keep Google from seeing the canonical tags on the php pages? Thanks very much, V

    Technical SEO | | Voodak
    0
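For context (the actual disallow isn't quoted in the question, so the pattern below is an assumption): a robots.txt block like this stops Googlebot from fetching the matching URLs at all, which also means any 301 or canonical tag on those pages goes unseen until the disallow is lifted.

```
User-agent: *
# Hypothetical pattern blocking the dynamic php pages
Disallow: /*.php$
```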

  • Our website consists of a primary domain (marketing focused) and subdomain (ecommerce platform). The two sites look and function as one site even though they are using different technology. I would like to track the primary domain (example.com) and the subdomain (shop.example.com) as a single site in Google Analytics. The subdomain will be set up with GA ecommerce tracking as well. Can someone provide an example of the GA snippet that each would need?

    Reporting & Analytics | | Evan34
    0
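A sketch of the classic ga.js setup the question describes (the property ID and domain are placeholders): setting the cookie domain to the root domain on both sites lets visits to example.com and shop.example.com be tracked as one session under one property.

```javascript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);   // same (placeholder) property ID on both sites
_gaq.push(['_setDomainName', 'example.com']); // share GA cookies across all subdomains
_gaq.push(['_trackPageview']);
// shop.example.com would additionally push the ecommerce calls
// (_addTrans, _addItem, _trackTrans) after _trackPageview.
```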

  • Seem to be getting an awful lot of referral spam (Very large percentage), does it cause a problem with SEO or do search engines ignore it? Is it likely to have a negative effect? Is it a type of commercial warfare by competitors? Does anyone have any direct experience or knowledge? Thanks

    Technical SEO | | seoman10
    0

  • I have used Wordfence on WordPress and I am getting a lot of alerts of "people" registering on my blog but then having over 20 failed login attempts. It gives the IP addresses. Has anyone else had this issue, and should I just block the IP addresses? How have you guys fixed these issues?

    Technical SEO | | kpexpressions
    0

  • I have read that it is unsafe to change more than 20% of your site’s content in any update. The rationale is that "Changing too much at once can flag your site within the Google algorithm as having something suspicious going on." Is this true, has anyone had any direct experiences of this or similar?

    Algorithm Updates | | GrouchyKids
    0

  • Does anyone know how deep the crawl diagnostics will crawl when searching for dup content? Will it crawl the entire site, or will it only crawl "x" amount of pages? Thanks!

    Moz Bar | | tdawson09
    0

  • Hi On 1st June we moved http://www.patient.co.uk to http://patient.info. We are a trusted health website so the information is relevant to all English-speaking countries. (Content on the .co.uk domain has been there for over 15 years). Prior to the move over 60% of site traffic was international, even with a .co.uk domain. The intention of the move was to broaden our international reach/traffic whilst maintaining our UK traffic. We would do this as .info is a top-level domain. We followed all of the best practice rules, 301s, new and old sitemaps, change of address in Webmaster Tools etc. Basically all the advice here: https://support.google.com/webmasters/answer/6033049?hl=en&ref_topic=6033084&rd=1 We specifically chose on the new domain to leave "unlisted" under Webmaster Tools international targeting as the content is relevant for all countries. This is the only thing that has changed compared to the previous settings. The URL structure etc. is all identical, just on a different domain. After the move we immediately saw a drop of c.60% of traffic. Over the first 5 weeks after the move we had initial gradual recovery (c.2% increase in traffic week on week). Since then it has completely flatlined with no traffic increase. So we are sat at c.50% less traffic than we had before the move. Worryingly, over the past 2 weeks the indexed results for patient.info have dropped from c.2M to c.500k (https://www.google.co.uk/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Apatient.info) There were c.6M indexed results for patient.co.uk before the move; this has been gradually shrinking and there are now c.300k indexed results (https://www.google.co.uk/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:patient.co.uk) In Webmaster Tools crawl stats, .co.uk is still being crawled like crazy, much more than .info. 
It feels like we have followed the rules but something is missing and that the new site just isn't being fully indexed or as highly ranked as the old site. Anyone who has any input/advice would be much appreciated. Many thanks Ben

    Reviews and Ratings | | PATIENTUK
    0

  • Hey guys, two weeks ago we switched over to the new website; however, we've experienced about a 20% drop in Google organic traffic and it does not appear we are getting indexed correctly by Google. In search results it's not using the meta description and the links still point to the old pages. Robots.txt is fine. We are using the same IP address. Redirects are in place for the pages in question. The sitemap was submitted to Google in Webmaster Tools. What else do we need to do?

    Web Design | | ScottOlson
    0

  • A business with offices in 3 major cities and loads of service areas hired us to build its website. Here's my internal debate regarding local SEO: Do I build one site with a thorough sitemap that utilizes one page per city and/or region for local SEO? Do I build a primary site with a limited sitemap and a subsite for each city (e.g. companyname.com/city) that essentially replicates the sitemap from the primary site? If I go this route, the content on each page of each subsite would be unique (not copied and localized versions of the content on the primary site), but what about the keywords? For example, should each subsite use the same keywords as the primary site (e.g. companyname.com/keyword-or-phrase and companyname.com/city-name/keyword-or-phrase OR companyname.com/keyword-or-phrase and companyname.com/city-name/variation-of-keyword-or-phrase). In the end, I suppose the question is, "Should I build one site with a more thorough sitemap and single pages for each city and/or region OR should I build a site for each city with less thorough sitemaps?" Budget constraints won't allow for option C, which is build a site for each city with a thorough sitemap for each. Thank you guys in advance for whatever insight you're willing to give!

    Local SEO | | cbizzle
    0

  • Hi guys, it seems that nowadays it is almost impossible to achieve 0 (zero) errors when testing a site via the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels and all kinds of tracking and social media scripts running, it seems to be an unachievable task. My questions to you fellow SEOs out there are: 1. How important is this, and to what degree of effort do you go when you technically review a site and make the decision as to what needs to be fixed and what you shouldn't bother with? 2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation? *As a note I will say that I mostly refer to WordPress-driven sites. Would love to hear your take. Daniel.

    Technical SEO | | artdivision
    0

  • Hi, When I search for "Zotey" in Google, the following message is being displayed: "Showing results for zotye. Search instead for zotey." Anyone let me know how to get rid of this conflict asap? Regards, Sivakumar.

    Technical SEO | | segistics
    0

  • I look after 3 sites which have a lot of crossover on products. We have thousands of products and I've made it a requirement that we give each its own description on each of the sites. This sounds like the right thing to do, but it's very hard for our content writers to write three different versions of descriptions, especially when we have variations on the products, so they're potentially writing unique product descriptions for 4-5 very similar products on three separate sites. We've worked very hard to create unique content deep through the site on all categories, subcategories and tag combinations, and along with the other SEO work we've done over the last couple of years this is producing great results. My question is, how far do we have to go? I'm busy writing some product descriptions for a 3rd party site for some of our products; the easy thing to do is just copy and paste, but I want Google to see the descriptions as unique. Whilst all SEO advice will say 'write unique descriptions', from a practical point of view this isn't especially useful as there doesn't really seem to be much guidance on how different they need to be. I gather we can't just move around the paragraphs or jumble up sentences a bit, but it is easier to work from a description and change it than it is to start from a blank slate (our products range from being very interesting and unique to quite everyday, so it is sometimes tough to create varied unique content). Does anyone know of any guidance or evidence of just how clever the Google algorithm is and how close content has to be before it becomes classed as the same or similar? Thanks Pete

    Content Development | | PeterLeatherland
    0

  • I built a website, http://deeprootshgc.com. I submitted the site to google and have set up analytics. When I search for the site, deep roots home and garden center, I get their facebook page, manta and everything but the actual URL. Any thoughts?

    Local Listings | | MarkBolin
    1

  • Hi guys, having a bit of a tough time here... Moz is reporting duplicate content for 21 pages on eagleplumbing.co.nz, however the reported duplicate is the www version of the page. For example: http://eagleplumbing.co.nz and http://www.eagleplumbing.co.nz are considered duplicates (see screenshot attached). Currently in Search Console I have just updated the non-www version to be set as the preferred version (I changed this back and forth twice today because I am confused!!!). Does anyone know what the correct course of action should be in this case? Things I have considered doing include: changing the preferred version to the www version in Webmaster Tools, and setting up 301 redirects using a WordPress plugin called Eggplant 301 Redirects. I have been doing some really awesome content creation and have created some good quality citations, so I think this is the only thing that is affecting my rank. Any help would be greatly appreciated.

    Technical SEO | | QRate
    0
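If the non-www version stays the preferred one, the server-level fix is a single 301 rule (domain taken from the question; a sketch for an Apache .htaccess, which the Eggplant plugin would otherwise emulate inside WordPress):

```apache
RewriteEngine On
# Send every www request to the bare domain with a permanent redirect.
RewriteCond %{HTTP_HOST} ^www\.eagleplumbing\.co\.nz$ [NC]
RewriteRule ^(.*)$ http://eagleplumbing.co.nz/$1 [R=301,L]
```

Pick one host, redirect the other, and keep the Search Console preferred-domain setting consistent with that choice; the reported duplicates should then resolve on subsequent crawls.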

  • Hello, I have a website whose spam analysis is showing a score of 6, but when I look at my links, I only have one link that is showing up as a 5. I have looked at all the subdomains and the root domain, and I am just curious whether I should be worried or whether this is quite common. Thoughts? Thanks,
    Errick

    Moz Bar | | ErrickG
    0

  • A client of ours asked if we could place link to their local Facebook page instead of a link to the direct domain in their Google My Business listing.  Will Google allow this?

    Local Listings | | RosemaryB
    0

  • Hello community! I am not an SEO professional, though I am a practitioner, I would say. I am seeking a solution on behalf of a friend. If you search the term "Peter Blatt" you will discover a "black eye" on the first page, towards the bottom of the SERPs. It's a PDF published on the Florida Department of Financial Services website regarding the final order for a settlement he and his company ("Blatt Financial Group") reached with the state as it related to professional conduct allegations. Does anyone have any advice on how to address this? I don't want to "game" the search engines, but at the same time, this document looks really scary to people and much worse than it actually is, and I would love for it to drop below page one. Any advice or suggestions from the community? Thanks! Tom

    Technical SEO | | 800GoldLaw
    0
