
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi, Our IP was recently blacklisted - we had a malicious script sending out bulk mail in a Joomla installation. Does it hurt our SEO if we have a domain hosted on that IP? Any solid evidence? Thanks.

    White Hat / Black Hat SEO | | bjs2010
    0

  • So, we're a service where you can book different hairdressing services from a number of different salons (site being worked on). We're doing both a male and female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative and have a bit of fun, and to also have dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female taking precedence when there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, and does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!

    On-Page Optimization | | LeahHutcheon
    0

  • Hello here, In the past I was able to find out pretty easily how many images from my website are indexed by Google and inside the Google image search index. But as of today, it looks like Google is not giving you any numbers; it just lists the indexed images. I use the advanced image search, defining my domain name in the "site or domain" field: http://www.google.com/advanced_image_search and then Google returns all the images coming from my website. Is there any way to know the actual number of images indexed? Any ideas are very welcome! Thank you in advance.

    Intermediate & Advanced SEO | | fablau
    1

  • Hi Mozzers, We are having an issue with our XML sitemap images not being indexed. The site has over 39,000 pages and 17,500 images submitted in GWT. If you take a look at the attached screenshot, 'GWT Images - Not Indexed', you can see that the majority of the pages are being indexed - but none of the images are. The first thing you should know about the images is that they are hosted on a content delivery network (CDN), rather than on the site itself. However, Google advice suggests hosting on a CDN is fine - see second screenshot, 'Google CDN Advice'. That advice says to either (i) ensure the hosting site is verified in GWT or (ii) submit in robots.txt. As we can't verify the hosting site in GWT, we had opted to submit via robots.txt. There are 3 sitemap indexes: 1) http://www.greenplantswap.co.uk/sitemap_index.xml, 2) http://www.greenplantswap.co.uk/sitemap/plant_genera/listings.xml and 3) http://www.greenplantswap.co.uk/sitemap/plant_genera/plants.xml. Each sitemap index is split up into often hundreds or thousands of smaller XML sitemaps. This is necessary due to the size of the site and how we have decided to pull URLs in. Essentially, if we did it another way, it may have involved some of the sitemaps being massive and thus taking upwards of a minute to load. To give you an idea of what is being submitted to Google in one of the sitemaps, please see view-source:http://www.greenplantswap.co.uk/sitemap/plant_genera/4/listings.xml?page=1. Originally, the images were SSL, so we decided to revert to non-SSL URLs as that was an easy change. But over a week later, that seems to have had no impact. The image URLs are ugly... but should this prevent them from being indexed? The strange thing is that a very small number of images have been indexed - see http://goo.gl/P8GMn. I don't know if this is an anomaly or whether it suggests no issue with how the images have been set up - thus, there may be another issue.
Sorry for the long message but I would be extremely grateful for any insight into this. I have tried to offer as much information as I can, however please do let me know if this is not enough. Thank you for taking the time to read and help. Regards, Mark

    Intermediate & Advanced SEO | | edlondon
    0
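
    For reference, image entries in an XML sitemap use Google's image sitemap extension, nested under the page URL they appear on. A sketch, with a hypothetical listing URL and CDN image URL (not taken from the site above):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <!-- The page the image appears on -->
        <loc>http://www.example.co.uk/plants/example-listing</loc>
        <!-- The image itself may live on a different host, e.g. a CDN -->
        <image:image>
          <image:loc>http://cdn.example.com/photos/plant.jpg</image:loc>
        </image:image>
      </url>
    </urlset>
    ```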

  • I'm using the Yoast SEO plugin to generate XML sitemaps on my e-commerce site (WooCommerce). I recently changed the category structure and now only 25 of about 75 product categories are included. Is there a way to manually include URLs, or what is the best way to have them all indexed in the sitemap?

    Intermediate & Advanced SEO | | kisen
    0

  • So I am looking at a site for a client, and I think I already have my answer, but wanted to check with you guys. First off, the site is in Flash and HTML. I told the client to dump the Flash site, but she isn't willing right now. The URLs are generated like this. Flash: http://www.mysite.com/#/page/7ca2/wedding-pricing/ HTML: http://www.mysite.com/?/page/7ca2/wedding-pricing/ Checking the site in Google with a site:mysite search, none of the interior pages are indexed at all. So that is telling me that Google is pretty much ignoring everything past the # or ?. Is that correct? My recommendation is to dump the Flash site and redo the URLs in an SEO-friendly format.

    Web Design | | netviper
    0
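
    The hunch about the # can be checked directly: everything after a # is a client-side fragment and is never sent to the server, so a crawler requesting the Flash URLs effectively sees only the homepage. A small sketch using Python's standard library, with the URLs from the question:

    ```python
    from urllib.parse import urlsplit

    flash_url = "http://www.mysite.com/#/page/7ca2/wedding-pricing/"
    html_url = "http://www.mysite.com/?/page/7ca2/wedding-pricing/"

    # The fragment is purely client-side: an HTTP request for the Flash URL
    # asks the server only for "/", so the interior "page" never reaches it.
    flash = urlsplit(flash_url)
    print(flash.path)      # the path actually requested from the server
    print(flash.fragment)  # handled by the browser/Flash player only

    # The "?" version at least reaches the server as a query string,
    # though query-only "pages" are still a weak URL structure.
    html = urlsplit(html_url)
    print(html.query)
    ```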

  • I have looked into DKI (Dynamic Keyword Insertion), but have not found a solution and thought that some excellent Mozzer might be able to help.  Here is the idea: We have landing pages for hundreds of cities.  The local content on each of these cities changes page to page, however the keywords that we are going after are the same.  So, I am trying to create a dynamic ad group that looks something like this: Headline: {City Name} {Keyword} Description: We cover {City Name} {Keyword}, get more info now! URL: http://www.website.com/{City Name} Please let me know if you can assist with this, B

    Paid Search Marketing | | Reis_Inc.
    0

  • Hi, I am fairly new to SEO and have just noticed the end of my title text has been cut off by Google in the SERP results. Everything I have read tells me titles should be a maximum of 70 characters; however, Google is only displaying 54. See below: Security systems | wireless | battery powered | Police... Nobody else on the page is showing more than 54 characters. Am I missing something obvious? Any and all help gratefully appreciated. Thanks, Si

    On-Page Optimization | | DaddySmurf
    0

  • We're launching an .org.hk site with English and Traditional Chinese variants. As the local population speaks both languages we would prefer not to have separate domains and are deciding between subdomains and subfolders. We're aware of the reasons behind generally preferring folders, but many people, including moz.com, suggest preferring subfolders to subdomains with the notable exception of language-specific sites. Does this mean subdomains should be preferred for language specific sites, or just that they are okay? I can't find any rationale to this other than administrative simplification (e.g. easier to set up different analytics / hosting), which in our case is not an issue. Can anyone point me in the right direction?

    International SEO | | SOS_Children
    0
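
    Whichever structure is chosen, the two language variants can be tied together with hreflang annotations (zh-Hant is the standard code for Traditional Chinese). A sketch using hypothetical subfolder URLs, placed in the head of every variant page:

    ```html
    <link rel="alternate" hreflang="en" href="https://www.example.org.hk/en/" />
    <link rel="alternate" hreflang="zh-Hant" href="https://www.example.org.hk/zh/" />
    <!-- x-default marks the page shown when no language matches -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.org.hk/" />
    ```

    The same annotations work unchanged with subdomain URLs, which is one reason the choice is often treated as administrative rather than an SEO requirement.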

  • Hullo all, I run an e-commerce website and hence have a lot of product category/sub-category pages to handle. Despite giving each of these category pages meta descriptions, in the Google SERPs a lot of these descriptions don't show up fully. Rather, only half the text that I'd input as my meta description shows up; the other half is generic content taken from that page. I've attached a screenshot to give you an example of what comes up in the SERPs. Could you please tell me what exactly is the problem? Is it a coding issue? Or has Google not crawled that page? Need help asap! Thank you in advance!

    Technical SEO | | suchde
    0

  • Hi, I need to find all links in the site along with their anchor text, and I need this done on my own website so I know whether we have links anchored to numbers and punctuation that are not visible at all. Thanks

    Technical SEO | | mtthompsons
    0
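
    An audit like the one asked about is usually done with a crawler such as Screaming Frog, but as a sketch of the idea, the snippet below collects every link and its anchor text from a page using only Python's standard library, then flags numeric anchors. The HTML sample is made up:

    ```python
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collect (href, anchor text) pairs from an HTML document."""
        def __init__(self):
            super().__init__()
            self.links = []    # finished (href, text) pairs
            self._href = None  # href of the <a> we are currently inside
            self._text = []    # text fragments seen inside that <a>

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href")
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                self.links.append((self._href, "".join(self._text).strip()))
                self._href = None

    sample = '<p><a href="/pricing">Pricing</a> and <a href="/contact">123</a></p>'
    collector = LinkCollector()
    collector.feed(sample)
    # Numeric or punctuation-only anchors are easy to flag once collected:
    for href, text in collector.links:
        print(href, repr(text), "numeric-anchor" if text.isdigit() else "")
    ```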

  • This morning a competitor of ours decided to go on a PPC rampage against us. Basically, our budgeted money was spent within the first hour of going live on Bing. It's pretty obvious what's going on, as we had a tremendous amount of clicks, all from the exact same keyword, within a short period of time. Obviously the first step was to contact Bing, and they are going to refund me a credit once they go through their process, but they didn't really give me confidence about the future. It seems they may not be able to prevent this from continually happening? The attacker used some sort of IP spoofing, as the clicks were all from different IPs, which is probably why it snuck past Bing. Wondering what you guys have done in the past to prevent this or combat it? Thankfully it didn't happen on Google.

    Paid Search Marketing | | DemiGR
    0

  • Hey moz New client has a site that uses: subdomains ("third-level" stuff like location.business.com) and; "fourth-level" subdomains (location.parent.business.com) Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com except in the External tab. But maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!

    Technical SEO | | jamesm5i
    0

  • I'm working with a business that has multiple locations (13) in several different states. Is it best practice to have one central FB page for the company and/or separate location pages? It's for a self storage company that does not have one central phone number, so each location would have separate information listed on the page. They do have a central website with different pages for each location. I'd love to hear the community's thoughts on the best way to handle this.

    Social Media | | DougHoltOnline
    0

  • Hi! I'm currently working with http://www.muchbetteradventures.com/. They have a previous version of the site, http://v1.muchbetteradventures.com, as a subdomain of their site. I've noticed a whole bunch of indexing issues which I think are caused by this. The v1 site has several thousand pages and ranks organically for a number of terms, but the pages are not relevant for the business at this time. The main site has just over 100 pages. More than 28,400 URLs are currently indexed. We are considering turning off the v1 site and noindexing it. There are no real backlinks to it. The only worry is that by removing it, it will be seen as a massive drop in content. Rankings for the main site are currently quite poor, despite good content, a decent link profile and high domain authority. Any thoughts would be much appreciated!

    Intermediate & Advanced SEO | | Blink-SEO
    0

  • I'm using Wistia for video hosting. My strategy is to allow other sites to embed videos to get links - I want the video linking back to my site, not Wistia or the embedding site. Do I need to use the Wistia CNAME, or is a Wistia subdomain acceptable?

    Image & Video Optimization | | BruceMcG
    0

  • I've been told that pipes are the best separators for title tags.  Can anyone tell me the best ones for H1 and H2 tags?  Do I go with pipes, commas, hyphens, underscores...?

    On-Page Optimization | | Greatmats
    0

  • I went to Mozcon - where are the video presentations located?

    Industry Events | | JeanYates
    0

  • Is there any specific format to update the disavow file? Also, if I submitted the file a month ago and need to update it now, should I leave the old 'excluded domains' or should I remove them? Let's say this is what I have - how would you update it?

    # explanation to Google went here... and ended here.
    domain:exampledomainalreadysubmitted1.com
    domain:exampledomainalreadysubmitted2.com
    domain:exampledomainalreadysubmitted3.com

    Thanks for your input

    Link Building | | dhidalgo1
    0
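
    Per Google's documented disavow format, lines starting with # are comments, and each entry is either a domain: line or a full page URL. An uploaded file replaces the previous one entirely, so any domains that should stay disavowed must remain in the new file. A sketch of an updated file (the "new" entries are hypothetical):

    ```text
    # Previously submitted domains, kept because the upload replaces the old file
    domain:exampledomainalreadysubmitted1.com
    domain:exampledomainalreadysubmitted2.com
    domain:exampledomainalreadysubmitted3.com
    # Newly discovered spam links (hypothetical additions)
    domain:newspamdomain.example
    http://spammer.example/bad-page.html
    ```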

  • We have a website that is publicly visible. This website has content. We'd like to take that same content and put it on another website, behind a paywall. Since Google will not be able to crawl those pages behind the paywall, is there any risk to us doing this? Thanks! Mike

    Content Development | | FOTF_DigitalMarketing
    0

  • I'm currently doing an audit for an online auto parts store and am having a hard time wrapping my head around their duplicate content issue. The current set up is this: The catalogue starts with the user selecting their year of vehicle. They then choose their brand (so each of the year pages lists every single brand of car, creating duplicate content). They then choose their model of car and then the engine, and this takes them to a page listing every type/category of product they sell (so every model type/engine size has the exact same content!). This is amounting to literally thousands of pages being seen as duplicates. It's a giant mess. Is using rel=canonical the best thing to do? I'm having a hard time seeing a logical way of structuring the site to avoid this issue. Anyone have any ideas?

    On-Page Optimization | | ATMOSMarketing56
    0
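
    For a catalogue like the one described, rel=canonical consolidates duplicates by pointing each year/brand/model variant at one preferred URL. A sketch, with entirely hypothetical URLs for this store:

    ```html
    <!-- In the <head> of a duplicate listing page such as
         /2012/toyota/camry/brake-pads (hypothetical URL structure),
         point at the one category page that should rank: -->
    <link rel="canonical" href="https://www.example.com/parts/brake-pads" />
    ```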

  • Hi, my website also opens via its IP address. I think it's duplicate content for Google... only the home page opens with the IP, no other pages. How can I fix it? I might be able to do it using .htaccess, but I don't know the proper code for this. This website is on the WordPress platform... Thanks Ramesh

    Technical SEO | | unibiz
    0
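
    A common fix for this is a 301 redirect in .htaccess from the bare IP host to the canonical domain. A minimal sketch, assuming Apache with mod_rewrite enabled; the IP and domain below are placeholders:

    ```apache
    RewriteEngine On
    # If the request arrived via the raw IP (placeholder shown),
    # permanently redirect it to the canonical domain
    RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
    ```

    On WordPress this goes above the default WordPress rewrite block so it runs first.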

  • Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is it OK? Thanks for any help!

    On-Page Optimization | | JohnHuynh
    0
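
    If the robots.txt route is taken, blocking an entire secondary site looks like the sketch below. One caveat worth knowing: robots.txt prevents crawling, not indexing, so URLs that are already indexed or linked elsewhere can still linger in results; cross-domain rel=canonical is the mechanism designed for exactly this duplicate situation.

    ```text
    # robots.txt at the root of each secondary website
    User-agent: *
    Disallow: /
    ```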


