
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • I have added all the website versions to Google Webmaster Tools and I have no crawl errors. When I click on Search Traffic, the pages there are blank: Search Analytics and Mobile Usability. And when I use Fetch as Google, the status constantly says 'temporarily unreachable'. Any help would be greatly appreciated.

    Reporting & Analytics | | HLAS
    0

  • I need to set up about 6,000 redirects for a Magento site; the pages no longer exist. I have tried the URL rewrite module, but it isn't working for me, and I don't want to put 6,000 redirects in the .htaccess file. Any suggestions?

    Intermediate & Advanced SEO | | Tylerj
    0
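
One pattern that keeps redirects of this size out of .htaccess is an Apache RewriteMap backed by a flat lookup file. This is only a sketch, assuming Apache with access to the virtual-host config (RewriteMap cannot be declared in .htaccess); the file path and URLs below are placeholders:

```apache
# In the virtual host config (not .htaccess).
# redirect-map.txt holds one "old-path new-path" pair per line, e.g.:
#   old-category/old-product.html /new-category/new-product
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirect-map.txt"

# 301 any request whose path has an entry in the map
RewriteCond ${redirects:$1} !=""
RewriteRule ^/?(.*)$ ${redirects:$1} [R=301,L]
```

For very large maps, Apache's httxt2dbm tool can convert the text file to a DBM map (`dbm:` instead of `txt:`) for faster lookups.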

  • Hey, so on our site we have a Buyer's Guide that we made. Essentially it is a pop-up with a series of questions that then recommends a product. The parameter ?openguide=true can be used on any URL on our site to pull up this Buyer's Guide. Somehow the Moz Site Crawl reported each one of our pages as duplicate content, as it added this string (?openguide=true) to each page. We already have a URL parameter set in Google Webmaster Tools as openguide; however, I am now worried that Google might be seeing this duplicate content as well. I have checked all of the pages with duplicate title tags in Webmaster Tools to see if that could tell me whether it is detecting duplicate content; I did not find any duplicate title tag pages caused by the openguide parameter. I am just wondering if anyone knows:
    1. a way to check if Google is seeing it as duplicate content
    2. how to make sure that the parameter is set correctly in Webmaster Tools
    3. or a better way to prevent the crawler from thinking this is duplicate content. Any help is appreciated! Thanks, Mitchell Chapman
    www.kontrolfreek.com

    Moz Pro | | MitchellChapman
    0

  • We had an enterprise client ask to remove mobile URLs from their sitemaps. For their website, both desktop and mobile URLs are combined into one sitemap. Their website has a mobile template (not a responsive website) and is configured properly via Google's "separate URLs" guidelines. Our client is referencing a statement made by John Mueller that having both mobile and desktop URLs in sitemaps can be problematic for indexing. Here is the article: https://www.seroundtable.com/google-mobile-sitemaps-20137.html
    We would be happy to remove the mobile URLs from their sitemap. However, this will unfortunately take several billing hours for our development team to implement and QA, which will end up costing our client a great deal of money. Is it worth removing the mobile URLs to adhere to John Mueller's advice? We don't believe these extra mobile URLs are harming their search indexing, but we can't find any sources to explain otherwise. Any advice would be appreciated. Thx.

    Intermediate & Advanced SEO | | RosemaryB
    0

  • I have two domains with the same name and the same content. How do I solve that problem? Do I need to change the content on my main website? My hosting company offers different plans, but with the same features, so many pages have the same content, and it is not possible to change it. What is the solution? Please let me know how to solve this issue.

    Intermediate & Advanced SEO | | Alexa.Hill
    0

  • I am getting that error on my product pages. This URL is listed among the errors: http://www.wolfautomation.com/drive-accessory-safety-sto-module-i500 but when I look at it on mobile, it is fine.

    Intermediate & Advanced SEO | | Tylerj
    0

  • Hello, I am hoping that someone is able to help with a problem that is destroying both my business and my health. We are an ecommerce site who have been trading since 2004 and have always had strong rankings in Google. Unfortunately, over the past couple of months, these have significantly decreased (I would estimate around a 40% drop in organic traffic). We have not had a manual penalty and still have decent rankings for a lot of competitive keywords, so we think it is more likely to be an algorithmic penalty. The most likely culprit is a huge-scale negative SEO attack that has been going on for around 18 months. Last September, we suffered a major drop in rankings as a result of the 302 hijack scheme, but after submitting a disavow file (of around 500 domains) on 12th November, we recovered on 26th November (although we now don't know whether this was due to the disavow file or the Phantom III update on 19th November). After suffering another major drop at the end of June, we submitted a disavow file of 1,100 domains (that's the scale of the problem!). This temporarily halted the slide; however, it is getting worse again. I have attached a file from Majestic which shows the increase in backlinks (we are not building these). We are at a loss and desperately need help. We have been contacting all the sites to try to get links removed, but new ones appear faster than we can contact them. We have also done a full technical audit and added around 50,000 words of unique, handwritten content, as well as continuing to work through all technical fixes and improvements. At the moment, the only thing we can think of doing is submitting a weekly disavow for all the new spammy domains that come up. The questions I have are: Is there anything we can do to stop the attack? Is this increase in backlinks likely to be the culprit for the drops (both the big drops and the subsequent weekly 10% drop)? If so, would weekly disavows solve the problem?
Is this likely to take months (years?) to recover from, or can it be done quicker? Can you give me any ray of light to help me sleep at night? 😞 Really appreciate any and all help. I wouldn't wish this on anyone. Thanks, Simon

    Intermediate & Advanced SEO | | simonukss
    0

  • I have created a separate account for my subdomain and created a goal for it; we use a Salesforce database to capture the leads. The problems I'm facing: 1) Comparing the goals to the Salesforce leads, there is a very large discrepancy; goal completions are far higher than Salesforce leads. 2) When a lead is created from an organic source, the goal shows it as a referral lead.

    Technical SEO | | Anshul.S
    0

  • Hey Mozzers, Hoping to get some opinions on SEO at the small-business level. We're engaged in SEO for a number of clients which are small businesses (small budgets). We stick to strictly white-hat techniques - producing decent content (and promoting it) and link building (as much as is possible without dodgy techniques/paying huge sums). For some clients we seem to have hit a ceiling, with rankings anywhere between roughly position #5 and #15 in Google. In the majority of cases, the higher-ranking clients don't appear to be engaged in any kind of content marketing, often have much worse designed websites, and don't have particularly spectacular link profiles (in other words, they're not hugely competitive - apart from sometimes on the AdWords front, but that's another story). The only difference seems to be links on agency link farms - you know the kind? The agency buys expired domains with existing PR, then just builds a simple site with multiple blog posts that link back to their clients' sites (also links that are simply paid for). Obviously these sites serve no purpose other than links - but I guess it's harder for Google to recognize that than with obvious SEO directories etc.? It seems to me that at this level of SEO for small businesses (limited budgets, limited time) the standard approach is the "expired domains agency link sites" described above, and simply paying bloggers for links. Are the above techniques considered black hat? Or are they more grey hat? Are they risky? Or is this kind of thing all in the game for SEO at the small-business level (by which I mean businesses that don't have the budget to employ a full-time SEO and have to rely on engaging agencies for low-level, low-resource SEO campaigns)? Look forward to your always wise counsel...

    White Hat / Black Hat SEO | | wearehappymedia
    0

  • Hello everyone, Here I am with a question about Penguin. I am asking all the Penguin experts on these forums to help me understand if there is a "safe" threshold of unnatural links under which we can have peace of mind. I really have no idea about that; I am not an expert on Penguin nor an expert on unnatural backlink profiles. I have a website with about 84% natural links and 16% affiliate/commercial links. Should I be concerned about possibly being penalized by an upcoming Penguin update? So far, I have never been hit by any previous Penguin release, but... just in case, you experts, do you know what "threshold" of unnatural links shouldn't be exceeded? Or, in your experience, what's the classic threshold over which Google can penalize a website for an unnatural backlink profile? Thank you in advance to anyone helping me with this research!

    White Hat / Black Hat SEO | | fablau
    0

  • Again I am facing the same problem with another WordPress blog. Google has suddenly started to cache a different domain in place of mine, and to cache my domain in place of that domain. Here is an example page of my site which is wrongly cached on Google; the same thing is happening with many other pages as well - http://goo.gl/57uluq That duplicate site (protestage.xyz) shows as fully copied from my client's site but now returns 404 on all pages, yet in Google's cache it shows my site's pages. site:protestage.xyz shows pages of my site only, but when we try to open any page it returns a 404 error. My site has been scanned by sucuri.net Senior Support for any malware and there is none; they scanned all files, the database, etc., and no malware was found on my site. Per Sucuri.net Senior Support: It's a known Google bug. Sometimes they incorrectly identify the original and the duplicate URLs, which results in messed-up rankings and query results. As you can see, the "protestage.xyz" site was hacked, not yours. And the hackers created "copies" of your pages on that hacked site. And this is why they do it - the "copy" (doorway) redirects web searchers to a third-party site [http://www.unmaskparasites.com/security-report/?page=protestage.xyz](http://www.unmaskparasites.com/security-report/?page=protestage.xyz) It was not the only site they hacked, so they placed many links to that "copy" from other sites. As a result, Google decided that the copy might actually be the original, not the duplicate. So they basically hijacked some of your pages in search results for some queries that don't include your site domain. Nonetheless your site still does quite well and outperforms the spammers.
For example in this query: [https://www.google.com/search?q=](https://www.google.com/search?q=)%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22 But overall, I think both the Google bug and the spammy duplicates have a negative effect on your site. We see such hacks every now and then (both sides: the hacked sites and the copied sites), and here's what you can do in this situation: It's not a hack of your site, so you should focus on preventing copying of the pages: 1. Contact the protestage.xyz site and tell them that their site is hacked, and show them the hacked pages. [https://www.google.com/search?q=](https://www.google.com/search?q=)%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22 Hopefully they will clean their site up and your site will have unique content again. Here's their email: flang.juliette@yandex.com 2. You might want to send one more complaint to their hosting provider (OVH.NET), abuse@ovh.net, and explain that the site they host stole content from your site (show the evidence) and that you suspect the site is hacked. 3. Try blocking the IPs of the Aruba hosting (real visitors don't use server IPs) on your site. This will prevent that site from copying your site's content (if they do it via a script on the same server). I currently see that site using these two IP addresses: 149.202.120.102. I think it would be safe to block anything that begins with 149.202. This .htaccess snippet should help (you might want to test it):
    #--------------
    Order Deny,Allow
    Deny from 149.202.120.102
    #--------------
    4. Use rel=canonical to tell Google that your pages are the original ones.
[https://support.google.com/webmasters/answer/139066?hl=en](https://support.google.com/webmasters/answer/139066?hl=en) It won't help much if the hackers still copy your pages, because they usually replace your rel=canonical with their own, so Google can't decide which one is real. But without rel=canonical, hackers have more chances to hijack your search results, especially if they use rel=canonical and you don't. I should admit that this process may be quite long. Google will not restore your previous rankings overnight even if you manage to shut down the malicious copies of your pages on the hacked site. Their indexes will still have some mixed signals (side effects of the black-hat SEO campaign) and it may take weeks before things normalize. The same is true for the opposite situation: the traffic wasn't lost right after the hackers created the duplicates on other sites. The effect builds up over time as Google collects more and more signals. Plus, sometimes they run scheduled spam/duplicate cleanups of their index. It's really hard to tell what the last drop was, since we don't have access to Google internals. However, in practice, if you see some significant changes in Google search results, it's not because of something you just did. In most cases, it's because of something that Google observed over some period of time. Kindly help me if we can actually do anything to get the site indexed properly again. PS: it happened with this site earlier as well, and that time I had to change the domain to get rid of the problem after I could not find any solution for months, and now it has happened again. Looking forward to a possible solution. Ankit

    Intermediate & Advanced SEO | | killthebillion
    0

  • We are a healthcare system which has 70+ medical service/practice locations and 2,500+ individual doctor listings. What is the best way to manage these location listings so they are not competing with each other in local search? For example, if someone searches for "urgent care", we would like our medical practice group location to show up before an individual doctor in local search. What are the best practices for managing your group location listings and your individual doctor listings? Thanks!

    Image & Video Optimization | | HonorHealth
    0

  • Hey Moz! We have a situation with a dentist firm with multiple doctors at the same address. They have two locations for their dental offices, and each of the dentists operates at both offices. The issue: each doctor insists on having their own business page for each location, and I'm afraid this is hurting their local SEO. We've been tracking keywords by week, we've seen some big fluctuations in rankings, and I'm looking into why this is happening. The office in location 1 has its own Google My Business page, and the three dentists have their own My Business pages set up at the exact same address. The office in location 2 has its own Google My Business page as well, and the three dentists have their own My Business pages there also. This leads to the two addresses of the main offices having multiple My Business pages at the same address competing against each other, since they are all registered with similar names and specialties. Could this be hurting our local SEO? Thanks! -Z

    Local Website Optimization | | zacgarrison_70
    0

  • Hey Mozzers! I have a question that I haven't found a perfect answer to yet. The company I work for has built a press/awards/news article page and I'm trying to determine the best format to showcase the information in. (You can take a look at the page here: https://www.webpt.com/about/press) Should I have our team copy and paste the press releases onto our site and rel=canonical each post to the original article? Or would it be better to just have a short intro paragraph and then a "read full story" link at the bottom of that paragraph? Final question: should I make these pages noindex, nofollow? Looking forward to hearing everyone's answers!

    On-Page Optimization | | WebPT
    0

  • Hi Everyone, My client is switching from a mobile subdomain to a responsive site. All URLs are the same on the mobile subdomain vs desktop, so we just need a wildcard rule to redirect m. to www. Does anyone have this wildcard redirect code for an .htaccess file? Much appreciated! Dan

    Intermediate & Advanced SEO | | kernmedia
    0
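
A commonly used sketch for this kind of host-wide redirect, assuming Apache and placeholder domains (worth testing on a staging copy first):

```apache
RewriteEngine On
# Send any request on the m. host to the same path on www, preserving the path
RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```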

  • Got a good one for you all this time... For our site, Google Search Console is reporting 436,758 "Page Not Found" errors within the Crawl Error report. This is an increase of 350,000 errors in just 22 days (on Sept 21 we had 87,000 errors, which had been essentially the consistent number for the previous 4 months or more). Then on August 22nd the errors jumped to 140,000, then climbed steadily from the 26th until the 31st, reaching 326,000 errors, and then climbed again slowly from Sept 2nd until today's 436K. Unfortunately I can only see the top 1,000 erroneous URLs in the console, and they seem to be custom Google tracking URLs my team uses to track our pages. A few questions: 1. Is there any way to see the full list of 400K URLs Google is reporting it cannot find?
    2. Should we be concerned at all about these?
    3. Any other advice? Thanks in advance! C

    Intermediate & Advanced SEO | | usnseomoz
    0

  • Hi, So, I have a client who wants to host two websites (which you could refer to as sister sites) on the same hosting account. For some reason, I was under the impression that doing so may be detrimental for SEO purposes. Am I correct in thinking this? Can I get some back-up documentation or comments here? I look forward to hearing what you all have to say. Thanks for reading!

    White Hat / Black Hat SEO | | maxcarnage
    0

  • Do you know of any tools or methods for load-time testing of a mobile app, like the Google PageSpeed tool for webpages? Thank you.

    Moz Pro | | Tormar
    0

  • Hi, Do naked link anchors such as "www.mysite.com/my-category" have the same SEO power as long-tail anchors? And what about the UX with such anchors? Lately a site published our back-to-school article with this horrible naked link anchor; it does not look friendly to me as a reader, but I am wondering whether I should bother, and how important it is for Google and for readers? Thank you in advance. Isabelle

    Link Building | | isabelledylag
    0

  • Hi, Quick question about duplicate page titles. I have a few duplicate pages where I've got lots of portfolio items under the same category, lets call it a portfolio of kangaroos. I can see there are separate URLs for each of the pages, i.e, www.blahblah.com/portfolio/kangaroos and www.blahblah.com/portfolio/kangaroos/2, but they both have the same page title, i.e, Kangaroos Portfolio | Domain and it's coming up in Moz as a medium priority duplicate page title error. Where in WordPress can I change the page title for the second page of portfolio items? Thanks very much! Tom.

    Technical SEO | | onefoursix
    0

  • Hi Mozzers - I was just looking at website speed. I know the Google guidelines on average page load time, but I'm not sure whether Google issues guidelines on any of the others. Do you know of any guidance on domain lookup, server response, server connection or page download? Page Load Time (sec) - I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/
    Server Response Time (sec) [Google recommends 200ms]: https://developers.google.com/speed/docs/insights/Server
    Redirection Time (sec) [dependent on the number of redirects, so probably no guide figure]
    Domain Lookup Time (sec)
    Server Connection Time (sec)
    Page Download Time (sec) Thanks, Luke

    Intermediate & Advanced SEO | | McTaggart
    0

  • Hey Everyone, I'm so happy to be a part of this community and to share knowledge where and when I can. I joined the community for one specific reason, and I hope to employ the help of everyone here in solving my SEO problem. I have a few years' experience in SEO/SEM and have been continuously learning, while learning to adapt to continuous changes (I think we can all relate lol). At any rate, here is what I am experiencing frustration with. I'm the SEO Analyst for a company that is trying to compete for the keyword phrase "Lyft Promo Code". We have been trying to place on page one of Google for over a year now to no avail. I have gotten my direct domain URL to appear on pages 1 & 2, but can't seem to get permalinks or "sub-URLs" indexed. If you google this phrase you will see what I mean. The top result is: http://rideshareapps.com/lyft-promo-code-credit/
    This URL has an aggregated rating and appears on page one for the phrase mentioned above. What we have managed to do, as I mentioned, is get www.couponcodeshero.com on page two. However, we have noticed that the page-one trend is all permalinks, and when we have tried to emulate those pages' structure and index priority, we are unable to. Our page:
    http://couponcodeshero.com/lyft-promo-code-rideshare-guide/ I have run multiple on-page graders from many resources and have not been able to get this page indexed as a permalink on any page that directly correlates with the keyword phrase. In essence, I'm looking for some direction from individuals who may have experienced this before. I have spent a good amount of time Googling and searching forum databases but cannot find any direct content that explains how to index a permalink. I hope to get some great ideas from the individuals here! If you do know of any articles or even previously answered questions here, please direct me there. It is only my intention to add value to the community! Schieler Mew
    Number One Designs

    Intermediate & Advanced SEO | | Number_One_Deisgns
    0

  • I've seemed to have a load of issues as of late. Just small graphical things, but still enough to make me concerned. 1. https://moz.com/community/users/4207879 being noindex (however it seems to work for other users and external tools) (image proof: https://i.gyazo.com/2e77c94ffd068718e9e149e8afef1e44.png ) 2. No MozPoints: https://gyazo.com/9e0777af3ebbdb6ad58a395bafe28a52 3. I'm a staff member? https://i.gyazo.com/5d7b156c13fa1d0bb6c2e0af3467423f.png 4. Trying to respond when logged out: https://i.gyazo.com/9f5fe8efd372920391a9a54178561e0b.png (not really a bug, but it could be a better experience) These have all been sent via support; I just wondered if it was just me having issues? Seems I'm staff on this one too: https://i.gyazo.com/6f1bc6372180f4f9f1cb1c7d6f6d941c.png

    Product Support | | ThomasHarvey
    0

  • We have 44,249 errors, and I have set up a 301 redirect for most of the URLs. I know exactly which links are correctly redirected; my problem is I don't want to mark each one as fixed individually. Is there a way to upload a URL list to Webmaster Tools so it automatically marks them as fixed based on the list?

    Technical SEO | | easyoffices
    0

  • Hi, I wondered what the view is on content below the fold? We have the H1, product listings and then some written content under the products - will Google just ignore this? I can't hide it under a tab or put a lot of content above the products, so I'm not sure what the other option is? Thank you

    Intermediate & Advanced SEO | | BeckyKey
    0

  • Hello everyone, I have a basic question for which I couldn't find a definitive answer for. Let's say I have my main website with URL: www.mywebsite.com And I have a related affiliates website with URL: affiliates.mywebsite.com Which includes completely different content from the main website. Also, both domains have two different IP addresses. Are those considered two completely separate domains by Google? Can bad links pointing to affiliates.mywebsite.com affect www.mywebsite.com in any way? Thanks in advance for any answer to my inquiry!

    Intermediate & Advanced SEO | | fablau
    0

  • I worked with a team of developers to launch a new site back in March. I was (and still am) in charge of SEO for the site, including combining 4 sites into 1. I made sure 301 redirects were in place to combine the sites, and applied pretty much every SEO tactic I can think of to make sure the site would maintain rankings following launch. However, here we are 6 months later and YoY numbers are down 70% on average for organic traffic. Would anyone mind taking a look at http://www.guestguidepublications.com and seeing if there's a glaring mistake I'm missing?!?!?! Thanks ahead of time!

    Intermediate & Advanced SEO | | Annapurna-Digital
    1

  • We are trying to eliminate tedium when developing complexly designed responsive navigations for mobile, desktop and tablet. The changes between breakpoints in our designs are too complex to be handled with css, so we are literally grabbing individual elements with javascript and moving them around. What we'd like to do instead is have two different navigations on the page, and toggle which one is on the DOM based on breakpoint. These navigations will have the same links but different markup. Will having two navigation components on the page at page load negatively impact our Google SEO rankings or potential to rank, even if we are removing one or the other from the DOM with JavaScript?

    Intermediate & Advanced SEO | | CaddisInteractive
    0

  • Hello all! So I know having a sitemap XML file is important to include in your robots.txt file.  I also know it is important to submit your XML sitemap to Google and Bing.  However, I am wondering if it is beneficial for your site's SEO value to have a sitemap page displayed on your website? Or is this just a redundant action if you have already done the above two actions with your XML sitemap? Thanks in advance!

    Web Design | | Myles92
    0

  • My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing ranking for existing USA pages due to issues like duplicate content etc. What is the best way to approach this? This is my first foray into this and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
    URL for USA: https://company.com/en-US/products/product-name/
    URL for Canada: https://company.com/en-ca/products/product-name/
    URL for German Language Content: https://company.com/de/products/product-name/
    URL for rest of the world: https://company.com/en/products/product-name/

    Intermediate & Advanced SEO | | Caro-O
    1
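
A minimal sketch of the hreflang annotations this kind of setup usually involves, using the URLs from the question (the x-default line is an assumption about how the rest-of-the-world page might be declared; every page must list all alternates, including itself):

```html
<link rel="alternate" hreflang="en-us" href="https://company.com/en-US/products/product-name/" />
<link rel="alternate" hreflang="en-ca" href="https://company.com/en-ca/products/product-name/" />
<link rel="alternate" hreflang="de" href="https://company.com/de/products/product-name/" />
<link rel="alternate" hreflang="en" href="https://company.com/en/products/product-name/" />
<link rel="alternate" hreflang="x-default" href="https://company.com/en/products/product-name/" />
```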

  • Hi Moz community!I've just joined and am getting to grips with SEO basics. Right now, I'm looking at the Competitive Link Metrics in Moz Pro, and I'm curious about the following- Of the three competitors that we're following, I'm trying to figure out some differences between two of them - we'll call them A and B. 'A' has 3.6k external followed and total links, with 5 total linking root domains. 'B' (a more prestigious and established company with a much higher DA) has 2.2k total external links, with 180 root domains. So my question is, how can A have nearly 1,000 more links, but only from 5 domains? Any feedback much appreciated! Thanks!

    Getting Started | | thegildedteapot
    0

  • Hi Mozians, My firm is looking to present different content to different users depending on whether they are new visitors, return visitors, return customers, etc... I am concerned about how this would work in practice as far as Google is concerned - how would it react to the fact that the bot would see different content from some users? It has the slight whiff of cloaking about it to me, but I also get that in this case it would be a UX thing that would genuinely be of benefit to users, and clearly wouldn't be intended to manipulate search rankings at all. Is there a way of achieving this "personalisation" in such a way that Google understands that you are doing it? I am thinking about some kind of markup that "declares" the different versions of the page. Basically I want to be as transparent about it as possible so as to avoid unintended consequences. Many thanks indeed!

    Technical SEO | | unirmk
    0

  • I've discovered that my client's website used to have another domain name, which they still own but don't use. It's doing OK considering it's not been used for a few years - almost 6,000 backlinks showing on Majestic. So what's the best way of using this for SEO? I'm presuming some kind of redirecting? A simple redirect of everything on the domain to the new domain's index page? Or going through all the old pages and redirecting them one by one?

    Technical SEO | | abisti2
    0

  • It has come to my attention with one of my clients (WordPress website) that for some time their Landing Page report (in GA - Google Analytics) has contained URLs that should all point to the one page. For example, domain.com/about-us also has a listing in GA as domain.com/about-us/index.htm Is this some kind of indication of a subdirectory issue? Has anyone had experience with this in WordPress plugins such as Yoast SEO or another SEO plugin? My thoughts here are to simply redirect any of these non-existent files with a redirect in .htaccess - but what I'm using isn't working. I will insert the redirect here, and any help would be greatly appreciated. RewriteEngine on
    RewriteCond %{THE_REQUEST} ^.*/index\.html?
    RewriteRule ^(.*)index\.html?$ http://www.dupontservicecenter.com/$1 [R=301,L] and this rewrite doesn't work: RewriteEngine on
    RewriteRule ^(.+)\.htm$ http://dupontservicecenter.com/$1.php [R,NC] Cindy

    Reporting & Analytics | | cceebar
    0

  • I know back in '08 Google started crawling forms using method=get, though not method=post. What's the latest? Is this still valid?

    Intermediate & Advanced SEO | | Turkey
    0

  • We are having an odd thing happen with our mobile-friendly status. Google has rated the pages "Mobile Friendly" for almost a year now, while Bing says we fail its mobile-friendly test. We've tried changing the two things we are failing on in the Bing test, but that breaks the page for some users. The two things we are failing on in Bing are: Viewport not configured correctly - we have tried their suggested tag; it breaks our pages on tablets. Page content does not fit device width - the page does fit devices fine, and Google has no problem with it. What do you suggest we do?

    Intermediate & Advanced SEO | | K-WINTER
    0

  • It appears Google is moving towards Rich Cards JSON-LD for all data: https://webmasters.googleblog.com/2016/05/introducing-rich-cards.html However, on an ecommerce site, when I have schema.org microdata structured data inline for a product and then add the JSON-LD structured data, Google treats that as two products on the page even though they are the same. To make the matter more confusing, Bing doesn't appear to support JSON-LD. I can go back to the inline structured data only, but that would mean when Rich Cards for products eventually come I won't be ready. What do you recommend I do for long-term SEO: go back to the old, or press forward with JSON-LD?

    Intermediate & Advanced SEO | | K-WINTER
    0
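
    The usual remedy for the double-counting is to mark up each product exactly once per page: if JSON-LD is kept, the inline microdata (itemscope/itemprop attributes) should be removed from that page, and vice versa. A minimal JSON-LD Product sketch, with hypothetical values:

    ```html
    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "http://schema.org/InStock"
      }
    }
    </script>
    ```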

  • Hi, We have blogs set up in each of our markets, for example http://blog.telefleurs.fr, http://blog.euroflorist.nl and http://blog.euroflorist.be/nl. Each blog is localized correctly, so FR has fr-FR, NL has nl-NL, and BE has nl-BE and fr-BE. All our content is created or translated by our Content Managers. The question is: is it safe for us to use a piece of content on Telefleurs.fr and the French-translated Euroflorist.be/fr, or Dutch content on Euroflorist.nl and Euroflorist.be/nl? We want to avoid canonicalising, as neither site should take preference. Is there a solution I've missed until now? Thanks,
    Sam

    Intermediate & Advanced SEO | | seoeuroflorist
    0
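
    The standard way to tell Google that localized pages like these are intentional alternates for different markets (rather than duplicates needing canonicalisation) is reciprocal hreflang annotations on every version. A sketch for one set of equivalent pages, assuming the blog URLs map one-to-one across markets:

    ```html
    <link rel="alternate" hreflang="fr-FR" href="http://blog.telefleurs.fr/" />
    <link rel="alternate" hreflang="nl-NL" href="http://blog.euroflorist.nl/" />
    <link rel="alternate" hreflang="nl-BE" href="http://blog.euroflorist.be/nl/" />
    <link rel="alternate" hreflang="fr-BE" href="http://blog.euroflorist.be/fr/" />
    ```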

  • I have built a link on behalf of a client in a long, well-written article on a reputable website that accepts contributor accounts. I therefore control the link. I have since realised that the anchor text of the link could be optimized much better than it currently is (while still only being a partial match). Would I be punished by the algorithm for going in and changing the link? I know it's not 100% "natural," but then we're SEOs, and I don't think it's too implausible that a website owner might go in and do the same... Maybe if I add some text as well, it would make things look more natural?

    Intermediate & Advanced SEO | | zakkyg
    1

  • Hi, My client has a new WordPress site http://www.londonavsolutions.co.uk/ and they have installed the Yoast Premium SEO plug-in. They are having issues with getting the lights to go green and the main problem is that on most pages Yoast does not see any words/content – although there are plenty of words on the pages. Other tools can see the words, however Yoast is struggling to find any and gives the following message:- Bad SEO score. The text contains 0 words. This is far below the recommended minimum of 300 words. Add more content that is relevant for the topic. Readability - You have far too little content. Please add some content to enable a good analysis. They have contacted the website developer who says that there is nothing wrong, but they are frustrated that they cannot use the Yoast tools themselves because of this issue, plus Yoast are offering no support with the issue. I hope that one of you guys has seen this problem before, or can spot a problem with the way the site has been built and can perhaps shed some light on the problem. I didn't build the site myself so won't be offended if you spot problems with it. Thanks in advance, Ben

    Technical SEO | | bendyman
    0

  • Hi Moz Fans, We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank in terms of lazy-loading implications and issues with crawl depths. We introduced lazy loading onto the blog home page to increase site speed initially, and it works well with infinite scroll, but we were wondering whether this would cause any issues regarding SEO. A lot of the resources online seem to be conflicting and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depths for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.

    Intermediate & Advanced SEO | | Victoria_
    0

  • So, every time I press Log In on the MozBar, it takes me to the Moz home page, which I'm already logged in to, but there is no change in the status of the MozBar. What should I do to remedy this? Thanks!

    Moz Bar | | soapmed
    5

  • I want to list our product on a number of sites that require PAD files such as Software Informer and Softpedia. Is this a good idea from an SEO perspective to have links on these pages?

    White Hat / Black Hat SEO | | SnapComms
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



