
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi there, with the latest Penguin 3.0 algorithm update (on October 17th) I noticed a drop in my rankings. Even though I didn't receive any manual penalty (no messages have been found in Webmaster Tools), I suspect it is an algorithmic penalty. For this reason, I have decided to clean up my external link profile. I am ruling out a Panda 4.1 penalty because an extensive site structure review was conducted quite recently. I collected external links from Webmaster Tools and Open Site Explorer. What I found is that 83% of my external links need to be disavowed, because the links come either from poor directories or from marketing articles that are evidently and specifically written for link building purposes. My questions are: 1) Should an external link clean-up be carried out anyway, even though I didn't receive any penalty message, in order to prevent future problems with the Penguin algorithm? 2) Is it too dangerous to disavow 83% of external links? Could such a manoeuvre destroy my current rankings? Thanks in advance for your advice 🙂

    | Midleton
    0
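
    For the disavow question above, a minimal sketch of the file format Google's Disavow Links tool accepts (the domains shown are hypothetical placeholders, not the poster's actual links):

        # disavow.txt - uploaded through Google's Disavow Links tool, not to the site
        # a "domain:" line drops every link from that site; a bare URL drops one page
        domain:spammy-directory.example
        domain:article-farm.example
        http://article-farm.example/marketing-article-written-for-links.html

    The file is plain text, one entry per line; lines starting with # are comments.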

  • Hi all, and thanks for taking the time to read my question. We are going to migrate a very small website from http to https; it's roughly a 9-page site, with 5 of those being product pages. I figured I would have to set up canonicals and permanent 301 redirects for each page, but our tech guys suggested just doing a binding to https, so any traffic hitting our site with an http URL would automatically get redirected to the https version. So if someone lands on http://mydomain, it would automatically return https://mydomain. Does this sound correct, or would we need to do additional tasks even if we go down the binding route? Thanks again for looking.

    | Renford_Nelson
    1
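
    For the question above, a minimal .htaccess sketch of a sitewide http-to-https redirect, assuming Apache with mod_rewrite (an IIS "binding" can achieve the same thing; the key detail to verify either way is that the redirect answers with a 301, not a 302):

        RewriteEngine On
        # Redirect any non-https request to the same path on https
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]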

  • Just trying to find out if this may be the root of a slight traffic dip, and also if we should be redirecting differently. We relaunched and initially did 301 redirects to the new site. Then we decided to change from http to https. Our HTTP status now looks like this when using the MozBar: HTTP/1.1 301 Moved Permanently – http://site.com/oldurl
    HTTP/1.1 302 Found – https://site.com/oldurl
    HTTP/1.1 200 OK – https://site.com/new Should we be changing that 302 to a 301? Are we losing link equity due to this? Thanks.

    | MichaelEka
    0
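
    A sketch of how the chain above could be collapsed to a single hop, assuming Apache and reusing the URLs from the question; each old URL should answer with one 301 straight to its final https destination:

        RewriteEngine On
        # One 301 from the old http URL directly to the new https URL,
        # instead of 301 -> 302 -> 200
        RewriteRule ^oldurl$ https://site.com/new [R=301,L]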

  • Hello all, so either I broke Google or Google doesn't know how to index my page properly (on radpad.com/paymyrent). If you search "pay rent with credit card", whether you're logged in to Google or not, you'll see a snippet from our signup process (which is JS) right under the ad slot in the SERPs ("Awesome! You're signed up!"), and it will repeat where my meta data should be. It's been like this for well over a month now and I cannot figure out how to get rid of it. Additionally, if you search for the branded title of the page, "pay with radpad", it pulls language that's not on that page (perhaps from somewhere in the JS signup form). Though if you search for "pay rent with radpad" you'll see what my meta description is supposed to look like in the SERPs. Any ideas as to what the heck is going on?

    | RadMatt
    0

  • Howdy Mozzers, I don't see this very often but figured I would share my findings. No surprise that I found a huge portal like rottentomatoes.com ranking for the keyword "new on dvd", but for the last month or so the page that is indexed for it is their "mobile" view (screenshot attached). I have a few ideas how you could go about fixing this, but this is more of a conversation piece: have many of you ever seen such a thing, especially on a portal so big? Your pal, Chenzo

    | Chenzo
    0

  • How can I canonicalize an IP address for websites in WordPress and Joomla?

    | ArthurRadtke
    0
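
    A minimal .htaccess sketch for canonicalizing a raw IP address onto the domain name, assuming Apache with mod_rewrite (the IP and domain are hypothetical placeholders; WordPress and Joomla both honour rules placed in the site root's .htaccess):

        RewriteEngine On
        # If the request arrived via the bare IP, 301 it to the canonical hostname
        RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]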

  • Hello, we recovered from Google's manual penalty in January 2014. Afterwards, we changed our site design, fixed our content, disavowed bad links, increased our social presence, tried to engage customers in blog content, etc. But we still couldn't get our domain name back on the SERPs. (It won't show our site even on the first page if I search "best vpn service" on Google.com.) What should we do to bring our domain name back onto the SERPs? What about the sandbox? What do we need to do to get our domain name out of the sandbox? Any comments or thoughts will be really appreciated 🙂

    | UmairGadit
    0

  • Hi there, we've launched a new website and as part of the update have changed our domain name. Now we need to tell Google of the changes: Both sites were verified in Webmaster Tools. From the old site's gear icon, we chose "Change of address". As part of the "Change of address" checklist Google presented, we added 301 redirects to redirect the old domain to the new one. But now that the 301 redirects are in place, Google can no longer verify the old site, and because it can no longer verify the old site, Google won't let us complete the change of address form. How do we tell Google of the change of address in this instance, and has anyone else encountered this? Cheers, Ben

    | cmscss
    0
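
    One common way out of the verification deadlock above is to exempt the verification file from the sitewide redirect so Google can still verify the old domain. A hedged .htaccess sketch, assuming Apache and HTML-file verification (the filename pattern and new domain are placeholders; verifying the old domain via a DNS TXT record instead also sidesteps the problem):

        RewriteEngine On
        # Let Google fetch the old site's verification file instead of redirecting it
        RewriteCond %{REQUEST_URI} !^/google[0-9a-f]+\.html$
        RewriteRule ^(.*)$ http://www.new-domain.example/$1 [R=301,L]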

  • Hi guys, I have started adding my SEO tags to my images (i.e. alt tags, captions, etc.), but when I inspect the slideshow that an image appears in, none of the alt tags etc. are present there. Does anyone know how to add them?

    | hemeravisuals
    0
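
    For context on the slideshow question above: the alt text has to end up in the markup the slideshow actually renders, and many slider scripts rebuild the img elements and silently drop attributes set elsewhere. A minimal sketch of what each rendered slide should contain (filename and text are hypothetical):

        <img src="/images/slide-1.jpg" alt="Hand-finished oak dining table" />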

  • Hi, I'm doing a link audit. My site's keyword rankings and organic traffic have been sent to the Phantom Zone since the last Penguin update. I've got 70+ (and counting) followed backlinks to my main domain and one of my subdomains from http://www.secretsearchenginelabs.com/; should I disavow them? There's a load of links to many recognisable sites in there and my instinct's all out of whack with this decision. I think that these links were all manually added by a link building company on our behalf. It does look manipulated to me, but I'd like a second opinion before I dump all of those links. Thanks

    | McCaldin
    0

  • Hi, this is Suresh. I made changes to my website and I see that Google has been unable to crawl it since 22nd October. It is not even showing any content when I use cache:www.vonexpy.com. Can anybody help me understand why Google is unable to crawl my website? Is there any technical issue with the website? The website is www.vonexpy.com. Thanks in advance.

    | sureshchowdary
    1

  • Hello, just one quick question which came up in the discussion here: http://moz.com/community/q/take-a-good-amount-of-existing-landing-pages-offline-because-of-low-traffic-cannibalism-and-thin-content When I do 301 redirects where I put together content from 2 pages, should I keep the page/HTML which redirects on the server? Or should I delete it? Or does it make no difference at all?

    | _Heiko_
    0

  • Hi everyone, recently someone pointed out that my website can be accessed in both ways, i.e. by typing www.example.com or example.com. He further added that Google might identify this as duplicate content and penalize my website. So now I'm thinking about a 301 redirection from non-WWW to WWW using the .htaccess method. But my website is 2 years old now and I'm getting some decent traffic from Google. Will this redirection have an adverse effect on my rankings? Is there any other way to resolve this issue? I don't want to lose my current rankings or organic traffic. Any help would be very much appreciated. P.S. Currently Google indexes my website pages with WWW.

    | nicksharma04
    0
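
    A minimal .htaccess sketch of the non-WWW to WWW 301 described above, assuming Apache with mod_rewrite (example.com is a placeholder). Because each old URL maps 1:1 to its WWW twin, and Google already indexes the WWW versions, signals are normally consolidated rather than lost:

        RewriteEngine On
        # 301 the bare domain onto the www hostname, path preserved;
        # matching the exact host leaves other subdomains alone
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]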

  • Hi, we have been trying to gain rankings for 7 keywords for a year now, but have been unsuccessful. We are not sure where we are going wrong; if someone could please help us out, we are happy to pay for your time.

    | mframing
    0

  • Hello Moz, we know that this year Moz changed its domain from www.seomoz.org to moz.com;
    however, when you type "site:seomoz.org" you can still find old URLs indexed on Google (on page 7 and above). We also changed our site from http://www.example.com to https://www.example.com,
    and Google is indexing both sites even though we did a proper 301 redirection via .htaccess. How long would it take Google to refresh the index? Should we just not worry about it? Say we redirected our entire site: what is going to happen to those websites that copied and pasted our content? We have already DMCAed their webpages, but would making our site https mean that their websites now look more original than ours, so that Google assumes we copied their sites? (Google is very slow in responding to our DMCA complaints.) Thank you in advance for your reply.

    | joony
    0

  • Hi all, does redirecting alternate versions of my homepage with a 301 only improve reporting, or are there SEO benefits as well? We recently changed over our servers and this wasn't set up as before, and I've noticed a drop in our organic search traffic, i.e. there was no 301 sending mywebsite.com traffic to www.mywebsite.com. Thanks in advance for any comments or help.

    | b4cab
    0

  • As many of you are aware, the Pigeon update was only applied to the new Google Maps, resulting in very different search results for Google local businesses. When you search for a business on the old Google Maps you get totally different results vs. the new Google Maps, and some businesses disappeared completely from the search results. I have done my research and found out that this is because the new algorithm was only applied to the new maps. The new algorithm also does not apply to other countries. The reason I posted this topic is that I have noticed all the new Google Business listings I am verifying for my clients are being put under the old Google Maps and not the new ones. They come up fine when searching from the old maps, but not the new ones. I understand Google has not rolled out Pigeon on all data centers, but why? Will Google eventually roll out the update to the old maps? If Google is adding businesses to the old Google Maps, then what's the point of even adding new listings?

    | bajaseo
    0

  • Hi, I want to understand something: my domain is domiciliationacasablanca.blogspot.com and I have a DA of 92. Can that really help me establish trust with Google if I work hard on SEO within Google's guidelines? Please help me understand.

    | seomastering
    0

  • Excuse me for posting this here; I wasn't having much luck going through GWT support. We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected in, all of which were pointing outwards to some spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in; there are now over 20,000 of them. Note that our server support team does not see these links anywhere. I understand that Google doesn't generally view this as a problem, but is that true given my circumstances? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website. If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?

    | jampaper
    0

  • My site has approximately 10 lakh (1 million) genuine URLs, but due to some unidentified bugs the site has created approximately 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all of these URLs. Please suggest: is it better to redirect such a high number of URLs to the home page, or to return a 404 for these pages? Any other suggestions to solve this issue are welcome.

    | vivekrathore
    0
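
    For bulk junk URLs like those above, redirecting everything to the home page is usually treated by Google as a soft 404; letting the bad URLs return 404, or 410 Gone, tells crawlers to drop them. A hedged .htaccess sketch assuming Apache with mod_alias and a recognisable URL pattern (the pattern is a hypothetical placeholder):

        # Answer 410 Gone for the bogus URL pattern so crawlers drop them faster
        RedirectMatch gone ^/generated-junk/.*$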

  • After reading other posts about Wix and SEO, I think I need to change my web design provider to something where I have more control over SEO options. Does anyone have any suggestions of something I can use?

    | benjacksoncook
    0

  • Hi guys, please help me to solve this Webmaster Tools issue of falling clicks and impressions. What the issue is: I have managed the Hills Self Storage site for a long time, but recently (over the past few weeks) in Webmaster Tools I noticed a constant reduction in clicks, which reached zero and has been stuck there since; impressions also started decreasing over the same period and gradually reached zero as well. What I have tested: Analytics traffic is increasing gradually. Keyword rankings in Google Australia are also increasing gradually. Before October 2014 total links were 200+; in mid October 2014 they started decreasing gradually, and now it shows "No Data Available". The 301 redirection is perfect, and Google fetching is also OK. What changes we made: the site moved to a new server in October 2014, and we switched our site pages from "http" to "https" in October 2014. Kindly reply to my queries above so that I can get back my "Total Links", correct impressions, and clicks as previously shown in Webmaster Tools. Regards, Dave

    | akshaydesai
    0

  • Some people have declared the 'fold' dead because people scroll. Others, using eye tracking studies, hold that most attention is still focused on the top of pages: 80.3% of users' attention was focused above the fold (the top 600-800 pixels). The case becomes especially strong with mobile devices. It is more inconvenient than ever to see content far down the page when looking at a screen ranging from 3.5″ to 5″. Opinions?

    | jgodwin
    0

  • We recently had Google crawl a version of the site that we had thought we had disallowed already. We have corrected the issue of them crawling the site, but pages from that version are still appearing in the search results (the version we don't want them to index and serve up is our .us domain, which should have been blocked to them). My question is this: how long should I expect that domain (the .us we don't want to appear) to stay in their index after disallowing their bot? Is this a matter of days, weeks, or months?

    | TLM
    0
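
    One subtlety behind the question above: a robots.txt disallow stops crawling but does not remove already-indexed URLs, whereas a noindex signal does, and Googlebot can only see that signal if it is allowed to crawl again. A hedged sketch for the .us server, assuming Apache with mod_headers enabled:

        # Served sitewide on the .us domain: allow crawling, forbid indexing
        Header set X-Robots-Tag "noindex"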

  • Hello, from July to November (this year), I gained 110,000 backlinks. Considering that I'm having trouble ranking well for any keyword in my niche (a niche where I was ranking #1 for several keywords and am now losing ground), I'm starting to believe that negative SEO is affecting me. I have already read several articles about negative SEO, some saying it is a myth, others saying that negative SEO is alive and kicking... My site is about health and fitness, in Brazilian Portuguese, and there are Polish/Chinese/English pages with warez/Viagra/other-drugs anchors pointing to my domain, plus massive numbers of links in comments on blogs without comment approval. Considering that all these new backlinks are not in my language and are clearly irrelevant, can I disavow them without fear of hurting my SEO even more? Every time you see someone talking about the disavow tool, there is always the same warning: "caution when disavowing a link, you can hurt your site even more by removing a link that, in some way, was helping you". Any help or guidelines on whether I can remove these links safely would be greatly appreciated. Thank you, and sorry for my English (it's not my native language).

    | broncobr
    0

  • Hi all, I have noticed that Google is not displaying a mobile-friendly tag next to our website (www.wombatwebdesign.com). We made it responsive over a year ago and it is running on Joomla 3.x, as recommended by Google. I have run it through Google's checking tool and it confirms the site is mobile friendly. So why no mobile-friendly tag? Any ideas gratefully received. Thanks, Fraser

    | fraserhannah
    0

  • Dear Moz community, could I pick your brains on SEO plugins for WordPress? Our web developer has installed an SEO plugin called Yoast, and I am not quite sure of its efficiency. The problem we have at the moment is that the page title is not updating on Google the way we anticipated. To solve this issue we unchecked "force rewrite" under the title options, but this had no effect. For instance, our name on Google always appears as Man Van London; despite any amendments we make, it always has Man Van London at the start of the title (website: www.manvanlondon.co.uk). If Yoast is the best SEO plugin for WordPress, is there any solution to fix this issue? Or is anyone familiar with another plugin? Does anyone suggest not using plugins at all? Thank you for your time. Looking forward to your wisdom. Monica

    | monicapopa
    0

  • Morning, so I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages, or does it mean that it's blocking 3,331 of the 3,511 indexed? As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis

    | PeaSoupDigital
    0

  • Hi all, I don't know if anyone can help me, but Moz is showing lots of errors for my website for pages not having title tags when they do have them. Also, when a user refines their search results it is seeing every instance of this as a new page; we have canonical tags across the site to stop this happening, yet it is still occurring each time. Is there anything else we can do to resolve this problem? It's creating lots of errors for us. Thanks, Laura

    | Citybase
    0

  • Hi Mozzers, a website I manage has a mobile-friendly version of their main website and a /m version as well. I was wondering if anyone has any experience in the best way of handling this? Should we just get rid of the /m version and tag the mobile-friendly version? Thanks!

    | KarlBantleman
    0
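
    For the separate-URLs mobile setup above, Google's documented annotation pairs the desktop and /m URLs so they are treated as one page rather than duplicates. A sketch with placeholder URLs; on the desktop page:

        <link rel="alternate" media="only screen and (max-width: 640px)" href="http://www.example.com/m/page" />

    and on the corresponding /m page:

        <link rel="canonical" href="http://www.example.com/page" />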

  • Q: Is it dangerous (SEO fallout) to use JavaScript redirects? Our tech team built a browser-side tool for me to easily redirect old/broken links. This is essentially a glorified 404 page: it pops a quick message that the page requested no longer exists and that we're automatically sending you to a page that has the content you are looking for. The tech team does not have the bandwidth to handle this via Apache, and this tool is what they came up with for me to deliver a better customer experience. Back story: it's a very large site and I'm dealing with thousands of pages that could/should/need to be redirected. My issue is incredibly similar to what Rand mentioned way back in a post from 2009: Are 404 Pages Always Bad for SEO? We've also decided to let these pages 404 and monitor for anything that needs an Apache redirect. The tool mentioned above was the tech team's idea to give me "the power" to manage redirects. What do you think?

    | FR123
    0
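
    For reference on the question above: Google does follow JavaScript redirects, though a server-side 301 remains the cleanest signal for passing link equity. A minimal sketch of what such a "glorified 404" page might emit (the URL is a hypothetical placeholder):

        <script>
          // Forward the visitor to the page that now holds the content;
          // replace() keeps the dead URL out of the browser history
          window.location.replace("http://www.example.com/new-page");
        </script>
        <p>This page has moved to <a href="http://www.example.com/new-page">its new location</a>.</p>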

  • I have over 500,000 429 errors in Webmaster Tools. Do I need to be concerned about these errors?

    | TheKrazyCouponLady
    0

  • We have developed a very large domain with well over 500 pages that need to be indexed. The tool we usually use to create a sitemap has a limit of 500 pages. Does anyone know of a good tool we can use to create a sitemap (text and XML) that doesn't have a page limit? Thanks!

    | TracSoft
    0
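
    One note on the 500-page limit above: the sitemaps protocol itself allows 50,000 URLs per file, and larger sites simply split URLs across several files tied together by a sitemap index. A minimal sketch with placeholder filenames:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-pages.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-products.xml</loc>
          </sitemap>
        </sitemapindex>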

  • Hello Mozzers, could you please share the best dynamic sitemap generator you are using? I have found this one: http://www.seotools.kreationstudio.com/xml-sitemap-generator/free_dynamic_xml_sitemap_generator.php Thanks in advance for your help.

    | SEOPractices
    0

  • Hi, 2 weeks ago we made big changes to our titles and meta descriptions to fix the missing titles and descriptions, and we also set the right canonicals. Now I see that in WMT, despite the canonical, it still shows duplicate meta descriptions and titles. I've set up the canonical like this:
    1. URL: www.domainname.com/category/listing-family/productname
    2. URL: www.domainname.com/category/listing-family/productname-more-info The canonical on both pages is like this: I'm aware we create duplicate titles and descriptions, caused by the CMS we use and also by the wrong structure of categories/products (we'll solve that next year); that's why I wanted the canonical, but now it's not getting any better. Did I do something wrong with the canonical?

    | Leonie-Kramer
    0
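
    The canonical element itself appears to have been stripped from the post above by the forum. For the two URLs quoted, the usual setup is a tag of this shape on both pages, pointing at whichever URL is preferred (the first URL is assumed here):

        <link rel="canonical" href="http://www.domainname.com/category/listing-family/productname" />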

  • Hello, I am new to the fullsite=true method of pointing a mobile site to a desktop site, and have recently found that about 50 of the instances in which I added fullsite=true to links from our blog show as duplicates of the pages they point to. Could someone tell me why this would be? Do I need to add some sort of rel=canonical to the main (non-fullsite=true) page, or how should I approach this? Thanks in advance for your help! L

    | lfrazer
    0

  • We have a .us and a .com version of our site that we direct customers to based on their location relative to our servers. This is not changing for the foreseeable future. We had restricted Google from crawling the .us version of the site, and all was fine until I started to see the https version of the .us appearing in the SERPs for certain keywords we keep an eye on. The .com still exists and is sometimes directly above or under the .us. It is occasionally a different page on the site with similar content to the query, or sometimes it returns the exact same page for both the .com and the .us results. This has me worried about duplicate content issues. The question(s): Should I just get the https version of the .us to not be crawled/indexed and leave it at that, or should I work to get a rel=canonical set up from the entire .us to the .com (making the .com the canonical version)? Are there any major pitfalls I should be aware of with regard to a rel=canonical across the entire domain (both the .us and .com are identical, and these newly crawled/indexed .us pages sometimes rank pretty nicely)? Am I better off just correcting it so the .us is no longer crawled and indexed and leaving it at that? Side question: Have any ecommerce folks noticed that Googlebot has started to crawl/index and serve up https versions of your URLs in the SERPs, even if the only way to get into those versions of the pages is to either append the https:// yourself to the URL or to go through a sign-in or checkout page? Is Google, in the wake of their HTTPS-everywhere push and potentially making it a ranking signal, forcing a check for the https version of any given URL and choosing to index that? I just can't figure out how it is even finding those URLs to index if it isn't seeing http://www.example.com and then adding the https:// itself and checking... Help/insight on either point would be appreciated.

    | TLM
    0
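
    If the cross-domain canonical route is chosen for the .us/.com question above, each .us page would carry a tag pointing at its .com twin (placeholder path shown). Note that rel=canonical is a hint rather than a directive; it asks Google to consolidate signals on the .com version:

        <link rel="canonical" href="http://www.example.com/some-page" />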

  • Example URL: http://www.sja.ca/English/Community-Services/Pages/Therapy Dog Services/default.aspx The above page is generated dynamically depending on what province the visitor visits from. For example, a visitor from BC would see something quite different than a visitor from Nova Scotia; the intent is that the information shown should be relevant to the user's province. How does this affect SEO? How (or from what location) does Googlebot decide to crawl the page? I have considered a subdirectory for each province, though that comes with its challenges as well. One such challenge is duplicate content, when different provinces may have the same information for some pages. Any suggestions for this?

    | ey_sja
    0

  • We are changing our e-commerce store, and we usually do the redirection of all page URLs... We usually don't redirect CSS, JS and image files... Shouldn't we redirect images as well?

    | SeoMartin1
    0
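
    On the image question above: redirecting images preserves any image search traffic and links they have earned. A hedged .htaccess sketch assuming Apache with mod_alias and that the images keep their filenames under a new path (paths are hypothetical placeholders):

        # 301 the old image directory to the new one, filenames preserved
        RedirectMatch 301 ^/old-images/(.*)$ /images/$1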

  • I noticed another post on this, but I have another question. I am getting this message from Analytics: Property http://www.example.com is receiving data from redundant hostnames. Consider setting up a 301 redirect on your website, or make a search and replace filter that strips "www." from hostnames. Examples of redundant hostnames: example.com, www.example.com. We don't have a 301 in place that manages this, and I am quite concerned about handling it the right way. We do have a canonical on our homepage that says: <link rel="canonical" href="http://www.example.com/" /> I asked on another site how to safely set up our 301 and I got this response: RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L,NE] Is this the best way of handling it? Are there situations where this would not be the best way? We do have a few subdomains like beta.example.com in use and have a rather large site, so I just want to make sure I get it right. Thanks for your help! Craig

    | TheCraig
    0

  • Hi guys, I'm reading some very contrasting and confusing reviews regarding URLs and the impact they have on a site's ability to rank. My client has a number of flooring products, 71 to be exact, categorised under three sub-categories: 1. Gallery Wood, 2. Prefinished Wood, 3. Parquet & Reclaimed. All 71 products are branded products (names that are completely unrelated to specific keyword search terms). This is having a major impact on how we optimise the site. FOR EXAMPLE: a flooring product called "White Grain", where the keyword we would like to rank this page for is Brown Engineered Flooring. I'm interested to know: should the name of the branded product match the URL? What would you change to help this page rank better for the keyword Brown Engineered Flooring? Page title: White Grain. URL: thecompanyname.com/gallery-wood/white-grain (white grain is the name of the product). Keyword: Brown Engineered Flooring. SEO title: White Grain, Brown Engineered Flooring by X. Meta description: BLAH BLAH Brown Engineered Flooring BLAH BLAH. Any feedback to help get my head around this would be really appreciated. Thank you.

    | GaryVictory
    0

  • It's only been 10 days and I have repurchased/renewed the domain name. The WHOIS info, website and contact information are all still the same. However, we have lost all rankings and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?

    | waqid
    0

  • I'm building a new website and am setting up the internal link structure with subcategories, hoping to do so with best SEO practices in mind. When linking to a subcategory's main page, should I make the internal link www.xxx.com/fishing/ or www.xxx.com/fishing/index.html, or does it matter? I'm just trying to avoid duplicate content, I guess, in case Google saw each as a separate page. Any other cautions when using subdirectories in my navigation?

    | wplodge
    0
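
    For the /fishing/ vs /fishing/index.html question above, linking to the bare directory and collapsing the index.html variant with a 301 keeps only one URL in play. A hedged .htaccess sketch, assuming Apache with mod_rewrite:

        RewriteEngine On
        # Only act on what the client literally requested, to avoid a loop
        # when DirectoryIndex serves index.html internally
        RewriteCond %{THE_REQUEST} /index\.html
        RewriteRule ^(.*)index\.html$ /$1 [R=301,L]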

  • Hello, we are having issues with our WordPress WooCommerce website sending ecommerce data to Google Analytics. We inserted the remarketing tag in the WordPress site's header.php, inside the <head> tag, but it gives us the following errors: 1. Cannot find CDATA comment.
    2. Requires validation - we tried validating it with our Merchant ID but it didn't work out; it still says unvalidated. We also had our product feed uploaded to our Merchant account.
    3. Missing line breaks. I had a theory that the tag should be inserted on each product detail page, so I was looking for a custom code plugin for WooCommerce, but still cannot find one.

    | ryanparrish
    0

  • We recently rebuilt a site for an auctioneer; however, it has a problem in that none of the lots and auctions are being indexed by Google on the new site, only pages like About, FAQ, home, and contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so there's no crawling issue that stands out. I've set the "URL Parameters", to no effect, too. I also built a sitemap with all the lots in it and pushed it to Google, which then crawled them all (a massive spike in crawl rate for a couple of days), yet it is still indexing just a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/

    | Blue-shark
    0

  • I've heard that a blog's ranking has a deep relation to the content keywords shown in WMT. My WMT is showing the top content keyword as "PNG", and it's because of the images in my posts; image sizes like 150x150 are also in my content keywords. Should I noindex my images, or is there any other solution to handle this issue?

    | hammadrafique
    0
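
    Rather than a blanket noindex, image indexing can be switched off selectively at the server if the "PNG" keyword is really a concern. A hedged sketch assuming Apache with mod_headers (note that WMT's Content Keywords report is informational, and is rarely worth sacrificing image search traffic over):

        <FilesMatch "\.(png|jpe?g|gif)$">
          # Keep image files out of the index while pages can still use them
          Header set X-Robots-Tag "noindex"
        </FilesMatch>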

  • Hi! I have a gallery on my website. When you click to view the next image it goes to a new page, but the content is exactly the same as the first page. This is flagging up a duplicate content issue. What is the best way to fix this? Add a canonical tag to pages 2, 3 and 4, or add a noindex tag? I have found a lot of conflicting answers. Thanks in advance

    | emma1986
    0
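
    For paginated galleries like the one above, the era-standard pattern is rel=prev/next annotations (with each page keeping a self-referencing canonical) rather than noindex. A sketch for page 2 of the series, with placeholder URLs:

        <link rel="prev" href="http://www.example.com/gallery?page=1" />
        <link rel="next" href="http://www.example.com/gallery?page=3" />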

  • Hi, we recently redesigned our online shop. We have done the 301 redirects for all product pages to the new URLs (and went live about 1.5 weeks ago), but GWT indicates that the old product URL and the new product URL are 2 different pages with the same meta title tags (duplication), when in fact the old URL is 301 redirecting to the new URL when visited. I found this article on the Google forum: https://productforums.google.com/forum/#!topic/webmasters/CvCjeNOxOUw
    It says we should either just wait for Google to re-crawl, or use the fetch URL function for the OLD URLs. The question is, after I fetch the OLD URL to tell Google that it's being redirected, should I click the button 'Submit to index' or not? (See screengrab; please note that it was the OLD URL that was being fetched, not the NEW URL.) I mean, if I click this button, is it telling Google: a. 'This old URL has been redirected, therefore please index the new URL'? Or
    b. 'Please keep this old URL in your index'? What's your view on this? Thanks

    | Essentia
    1

  • When changing a domain name, should we redirect all the pages to their new pages, or only the indexed pages? Thanks

    | bigrat95
    0

  • Hi, newbie alert! I need to set up 301 redirects for changed URLs on a database-driven site that is to be redeveloped shortly. The current site uses canonical header tags, and the new site will also use canonical tags. Should the 301 redirects map the canonical URLs on the old site to the corresponding canonicals in the new design, or should they map the non-canonical database URLs, old and new? Given that the purpose of canonicals is to indicate our preferred URL, my guess is that's what I should use. However, how can I be sure that Google (for example) has indexed the canonical in every case? Thanks in anticipation.

    | ztalk112
    0
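
    On the mapping question above, redirects are usually written canonical-to-canonical, with the old non-canonical variants covered by the old site's own canonicalisation or by extra rules. A hedged sketch assuming Apache with mod_alias (paths are hypothetical placeholders):

        # Old canonical URL -> new canonical URL, one 301 each
        Redirect 301 /old-section/widget-blue /products/widget-blue
        Redirect 301 /old-section/widget-red /products/widget-red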
