
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • We recently launched a new version of our website, which allowed us to integrate research into technical SEO updates to enhance our search visibility. Based on the experience of those viewing this post, what is a reasonable average timeframe in which I should start seeing some effect from these changes in Google? I know this question is hard to answer because of all the variables involved, but I need an estimate of what to expect to take to the C-level. I figured experience might tell a good story here.

    | Smart_Start
    0

  • I'm seeing this error in Google Webmaster Console for the URL http://www.awlwildlife.com/wp-admin/admin-ajax.php (last crawled: 11/15/16, first detected: 11/15/16): "The target URL doesn't exist, but your server is not returning a 404 (file not found) error. Your server returns a code other than 404 or 410 for a non-existent page (or redirects users to another page, such as the homepage, instead of returning a 404). This creates a poor experience for searchers and search engines. More information about 'soft 404' errors." Any ideas what I should do about it? Thanks!

    | aj613
    0
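
    A note on the fix for the question above: for genuinely dead URLs, the remedy for a soft 404 is to make the server answer with a real 404 or 410 instead of a 200 or a redirect. A minimal Apache sketch with a hypothetical path (note that admin-ajax.php itself is a live WordPress endpoint, so you wouldn't 410 that particular file):

      # .htaccess (mod_alias) - answer 410 Gone for a page that no longer
      # exists, rather than soft-404ing or redirecting to the homepage
      Redirect gone /some-retired-page/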

  • Hi, We are in the process of considering our domain URL options for a new site. The plan is to migrate other sites (bringing their link juice) to a main brand-level domain. At the moment our desired .com URL is unattainable; however, from a brand perspective another extension, e.g. .group, would probably be a better fit. I wanted to know what the implications might be from an SEO perspective. At the moment some of our subdomains are ranking extremely well for desired keywords. Assuming we implement the correct redirect rules to maintain these rankings, would there be any other implication for our rankings (particularly in the UK and US) from not using a .com domain and using an alternative TLD extension? Thanks

    | carlsutherland
    0

  • Hi SEOs, I have a question about moving local landing pages from many separate pages towards integrating them into a search results page. Currently we have many separate local pages (e.g. www.3dhubs.com/new-york). For both scalability and conversion reasons, we'll integrate our local pages into our search page (e.g. www.3dhubs.com/3d-print/Bangalore--India). Implementation details: to mitigate the risk of a sudden organic traffic drop, we're currently running a test on just 18 local pages (Bangalore = 1 of 18). We applied a 301 redirect from the old URLs to the new URLs 3 weeks ago (see the sketch below). Note: we didn't yet update the sitemap for this test (technical reasons) and will only do this once we 301 redirect all local pages. For the 18 test pages I manually told the crawlers to index them in Webmaster Tools. That should do, I suppose. Results so far: the old URLs of the 18 test cities are still generating > 99% of the traffic while the new pages are already indexed (see: https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:www.3dhubs.com/3d-print/&start=0). Overall organic traffic on test cities hasn't changed. Questions: 1. Will updating the sitemap for this test have a big impact? Google has already picked up the new URLs, so that's not the issue. Furthermore, the 301 redirect on the old pages should tell Google to show the new page instead, right? 2. Is it normal that search impressions will slowly shift from the old page towards the new page? How long should I expect it to take before the new pages are consistently shown over the old pages in the SERPs?

    | robdraaijer
    0
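
    A minimal sketch of the redirect step described above, assuming Apache with mod_alias. The question gives /new-york as an old-style URL and the Bangalore page as a new-style URL; the exact old/new pairings below are illustrative:

      # .htaccess - one explicit 301 per migrated local page
      Redirect 301 /bangalore https://www.3dhubs.com/3d-print/Bangalore--India
      Redirect 301 /new-york  https://www.3dhubs.com/3d-print/New-York--United-States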

  • Hi all, I'm working on a site where, when I crawl it with Screaming Frog, it doesn't pick up the meta description (as if, in the source code, it IS blank). However, the meta description has been set via the Yoast WordPress plugin, it does exist in the source code when I view the page in a browser, and it is shown in the SERPs. The code looks like this: <title>Dining Table and Chairs set</title> So my question is: will this affect SEO and how the website ranks if, to a crawler, all the actual meta descriptions are blank? (A quick way to check what a crawler receives is sketched below.) Thank you

    | Bee159
    1
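
    One way to check what a non-JS crawler like Screaming Frog actually receives is to fetch the raw HTML and look for the tag. A minimal Python sketch, with a hypothetical URL:

      # Fetch the raw, un-rendered HTML the way a basic crawler does and check
      # whether a meta description is present. If a cache/minify plugin or
      # JavaScript injects the tag after load, it won't appear here.
      import re
      import urllib.request

      url = "https://example.com/dining-table-and-chairs-set/"  # hypothetical
      req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
      html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")

      m = re.search(r'<meta[^>]+name=["\']description["\'][^>]*content=["\']([^"\']*)',
                    html, re.I)
      print(m.group(1) if m else "No meta description in raw HTML")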

  • Hi Moz members, Can you please suggest some of the best SEO websites where you read articles every day, apart from Moz?

    | SEO_GB
    1

  • Hi, One of our blog posts shows thousands of internal links in Search Console, but it lists only 2 pages it is linked from. How did it end up with so many internal links? I don't see any. Thanks, Satish

    | vtmoz
    0

  • Hi - I create virtual tours which I host and my clients embed (this site will be a holiday directory one day and linking is unlikely). What can I do with the embed code they use (most use iframes) to get maximum SEO juice? See the sketch below. Example tour: https://bestdevonholidays.co.uk/lavender/virtualtour.html Thanks

    | virtualdevon
    0
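
    Since links inside an iframe generally pass little or no equity, a common approach is to ship a plain crawlable link alongside the iframe in the embed code. A sketch using the tour URL from the question (dimensions, title, and anchor text are illustrative):

      <iframe src="https://bestdevonholidays.co.uk/lavender/virtualtour.html"
              width="800" height="500" title="Lavender virtual tour"></iframe>
      <p>Virtual tour by
        <a href="https://bestdevonholidays.co.uk/">Best Devon Holidays</a></p>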

  • At Siftery (siftery.com) we have about 250k pages, most of them reflected in our sitemap. Though after submitting a sitemap we started seeing an increase in the number of pages Google indexed, in the past few weeks progress has slowed to a crawl at about 80k pages, and in fact has been coming down very marginally. Due to the nature of the site, a lot of the pages likely look very similar to search engines. We've also broken down our sitemap into an index, so we know that most of the indexation problems are coming from a particular type of page (company profiles). Given these facts, what do you recommend we do? Should we de-index all of the pages that are not being picked up by the Google index (and are therefore likely seen as low quality)? There seems to be a school of thought that de-indexing "thin" pages improves the ranking potential of the indexed pages (the usual mechanism is sketched below). We have plans for enriching and differentiating the pages that are being picked up as thin (Moz itself picks them up as 'duplicate' pages even though they're not). Thanks for sharing your thoughts and experiences!

    | ggiaco-siftery
    0
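
    For reference, de-indexing without deleting is usually done with a robots meta tag on the thin pages; "follow" keeps internal link equity flowing while the pages sit out of the index. A minimal sketch:

      <meta name="robots" content="noindex,follow">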

  • Hello everyone, My main website is: http://www.virtualsheetmusic.com Whereas the above site's related "affiliate" website is located on the subdomain below: http://affiliates.virtualsheetmusic.com I was wondering if having that "affiliate section" on a subdomain could affect the main website negatively in some way... or whether it would be better to put it in a sub-folder on the main website, or even on a totally different domain. Thanks in advance for any advice!

    | fablau
    0

  • Hi Moz community, I work at a web design company. I found my competitors have a lot of site-wide backlinks from their clients with the optimized anchor text "affordable web design by XXX". Some of the clients' websites are not even relevant to the web design industry. I am sure those are dofollow links. Although I have heard many times that site-wide backlinks look unnatural and spammy, why are the top-ranking players still acquiring backlinks this way? Does Google actually say no to this? Thanks for any help and explanation. Best, Raymond

    | Raymondlee
    0

  • Hi, We are a wholesaler for electronics parts and accessories on our main site. Our business for parts is more B2B; accessories are more B2C oriented. Right now, our catalog is not the most SEO-friendly kind. We want to move all accessories to a new site now. To avoid duplicate content, and for user experience, I guess the best option is to remove all accessory category pages and product pages from our main site. That would generate lots of 404s. What would be the way to handle this, as we are talking about hundreds, maybe thousands of pages (one pattern-based option is sketched below)? Thanks.

    | Kepass
    0
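
    Rather than letting hundreds of accessory URLs 404, a pattern-based 301 can map whole sections to the new site in one rule. A sketch assuming Apache, with a hypothetical path prefix and new domain:

      # .htaccess (mod_rewrite) - send the whole accessories section to the
      # new site, preserving the rest of the path
      RewriteEngine On
      RewriteRule ^accessories/(.*)$ https://accessories-newsite.example/$1 [R=301,L]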

  • I am about to make a domain name change for my online shop. I have heard that redirecting HTTP to HTTPS is good SEO practice. I have www, non-www, as well as https-www and https-non-www declared in Search Console, with non-www set as the preferred domain. Is the redirect rule from HTTP to HTTPS really useful (see the sketch below)? Thanks

    | Kepass
    0
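
    For reference, the usual pattern collapses all four Search Console variants (http/https, www/non-www) into the one preferred host in a single 301 hop. A sketch assuming Apache, with example.com standing in for the real domain:

      # .htaccess - force HTTPS and the preferred non-www host in one hop
      RewriteEngine On
      RewriteCond %{HTTPS} off [OR]
      RewriteCond %{HTTP_HOST} ^www\. [NC]
      RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]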

  • I am looking to expand from the UK and open a location in the US. I currently have a .co.uk domain. What would you recommend I do with the website: create a new one with a .com domain?

    | Caffeine_Marketing
    0

  • Hi, We recently changed the layout of our website to a responsive theme in the hope of improving our rankings. We are getting more traffic but fewer conversions to sales. Our Spam Score has these flagged: No Contact Info - we have the phone number at the top of every page and a contact-us link at the bottom of every page - is there something we are missing? Low Number of Pages Found - we have over 3000 products, each of which has a page, plus other info pages on our site - why would this be flagged? Regards, Adrienne

    | CostumeD
    0

  • We have found a cache of about 10 URLs, some of which are ranking above our main URL in Google SERPs. What is the best course of action here? a. Redirect all to the homepage?
    b. Link all domains to the homepage?
    c. Link all domains to select pages on our main site, being careful not to anchor-text spam?
    d. 301 redirect all to the main site (a sketch of this option follows below). Is there any disadvantage to your recommendation? Is there likely to be a penalty incurred? I feel like we'll get the strongest increase in rankings by following option c, but it feels like option d may be safer. Thanks in advance for your help!

    | moconn
    0
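
    If option d is chosen, each satellite domain typically gets a path-preserving 301 to the main site. A sketch assuming Apache, with a placeholder target domain:

      # Each satellite domain's vhost/.htaccess - per-path 301 to the main site
      RewriteEngine On
      RewriteRule ^(.*)$ https://www.main-site.example/$1 [R=301,L]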

  • Hi, I'm doing an audit of a site for a very competitive term (project management software). The site ranks for its root domain on the second page. They have a lot of other non-blog pages that are geared towards longer-tail versions that include that term (project management software pricing, project management tool comparison, etc). My question is: are those pages cannibalizing potential search traffic? Should they just stick to the one page (root domain) and include those longtail keywords on that page instead of creating various pages that seem to possibly be cannibalizing traffic? Is it a fair conclusion that these other pages are causing them to rank lower for the main head term?

    | jim_shook
    0

  • I run an exotic pet website which currently has several types of species of reptiles. It has done well in SERPs for the first couple of types of reptiles, but I am continuing to add new species, and for each of these comes the task of getting ranked, and I need to figure out the best process. We just released our 4th species, "reticulated pythons", about 2 weeks ago. I made these pages public, and in Webmaster Tools did a "Fetch as Google" and requested indexing of this page and its child pages: http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index While Google immediately indexed the index page, it did not really index the couple of dozen pages linked from this page, despite me checking the option to crawl child pages. I know this in two ways: first, in Google Webmaster Tools, if I look at Search Analytics and Pages filtered by "retic", there are only 2 listed. This at least tells me it's not showing these pages to users. More directly, though, if I look at the Google search "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons" there are only 7 pages indexed. More details: I've tested at least one of these URLs with the robots checker and they are not blocked. The canonical values look right. I have not really monkeyed with crawl URL parameters. I do NOT have these pages listed in my sitemap (a minimal sitemap sketch follows below), but in my experience Google didn't care a lot about that -- I previously had about 100 pages there and Google didn't index some of them for more than a year. Google has indexed "105k" pages from my site, so it is very happy to do so, apparently just not the ones I want (this large value is due to permutations of search parameters, something I think I've since improved with canonical, robots, etc). I may have some nofollow links to the same URLs but NOT on this page, so assuming nofollow has only local effects, this shouldn't matter. Any advice on what could be going wrong here? I really want Google to index the top couple of links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.

    | jplehmann
    0
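
    Even though Google often indexes pages absent from the sitemap, listing the child URLs explicitly at least removes one variable. A minimal sitemap sketch using the index URL from the question (child pages would each get their own entry following the same pattern):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index</loc>
        </url>
        <!-- ...one <url> entry per child page... -->
      </urlset>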

  • Hi all, I've got an interesting issue and a bit of a technical challenge for you. It's a bit complicated to explain, but please bear with me. We have a client website (http://clientwebsite.com) which we have had a hard time ranking for the past few months. Main keywords simply don't show up in the Top 100 searches, even though we are constantly building backlinks through guest posts, citations, media mentions, profile links, etc. Normally we use Ahrefs to look at the client's website backlinks, but just today we used Majestic to look at the backlink profile, and one backlink stood out. This is a backlink from a development server (http://developmentwebsite.com) which redirects to http://clientwebsite.com
    The developers who were working on the redesign of the client website, put it up on their server and forgot to delete it.
    Also, the content inside the development website is almost identical to the client website. We then checked to see if http://developmentwebsite.com is indexed.
    It's not, although inside the robots file http://developmentwebsite.com/robots.txt there's:
    User-agent: *
    Allow: /
    The funny (and weird) thing is that http://developmentwebsite.com/ and all the development website's inner pages are not indexed in Google. But if we go to http://developmentwebsite.com/inner-page, it doesn't redirect to the corresponding http://clientwebsite.com/inner-page; it stays on the development website page URL, and the pages even have links to the client website. Like I said, none of the pages of the development website are indexed, even though crawlers are allowed in the development website's robots.txt. In your opinion, could this be the reason why we are having a hard time ranking the client website? Second question:
    How do we approach solving this issue?
    Do we simply delete the whole http://developmentwebsite.com with all the inner pages?
    Or should we do 301 redirects on a per-page basis (sketched below)?

    | zakkyg
    0
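
    On that second question: a per-page 301 preserves the path, so every indexed or linked development URL maps to its live counterpart automatically. A sketch for the development server, assuming Apache and the placeholder domains from the question:

      # Development server's .htaccess - path-preserving 301 to the live site
      RewriteEngine On
      RewriteRule ^(.*)$ http://clientwebsite.com/$1 [R=301,L]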

  • I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have been 3500-ish pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap with all the indexed spider-trap URLs (see the script sketched below), then set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3500 pages.)

    | Bryggselv.no
    0
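
    Once the trap URLs are collected (e.g. exported from a crawl or from Search Console), writing them into a bare-bones sitemap is a few lines of Python. A sketch assuming a hypothetical input file with one URL per line:

      # Build a minimal sitemap from a plain list of URLs so Google re-crawls
      # them and sees the 301s
      from xml.sax.saxutils import escape

      with open("trap_urls.txt") as f:
          urls = [line.strip() for line in f if line.strip()]

      with open("defective-sitemap.xml", "w") as out:
          out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
          out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
          for url in urls:
              out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
          out.write("</urlset>\n")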

  • Hello, our programmer recently updated our HTTP version of the website to HTTPS. Does it matter if we have TWO 301 redirects? Here is an example: http://www.colocationamerica.com/dedicated_servers/linux-dedicated.htm 301 https://www.colocationamerica.com/dedicated_servers/linux-dedicated.htm 301 https://www.colocationamerica.com/linux-dedicated-server We're getting pulled in two different directions. I read https://moz.com/blog/301-redirection-rules-for-seo and don't know if two 301s suffice (a single-hop alternative is sketched below). Please let me know. Greatly appreciated!

    | Shawn124
    0
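
    Two hops generally do resolve, but each hop costs a round trip and chained 301s were widely believed at the time to leak a little equity, so collapsing them is cheap insurance. A sketch using the URLs from the question, assuming Apache handles both rules (the specific old-path rule goes first so it wins before the generic HTTPS rule):

      RewriteEngine On
      RewriteRule ^dedicated_servers/linux-dedicated\.htm$ https://www.colocationamerica.com/linux-dedicated-server [R=301,L]
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://www.colocationamerica.com/$1 [R=301,L]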

  • Hey guys, Have you ever seen coding like this in a robots.txt? I have never seen a Noindex rule in a robots.txt file before - have you? Any pointers?

    user-agent: AhrefsBot
    User-agent: trovitBot
    User-agent: Nutch
    User-agent: Baiduspider
    Disallow: /

    User-agent: *
    Disallow: /WebServices/
    Disallow: /*?notfound=
    Disallow: /?list=
    Noindex: /?*list=
    Noindex: /local/
    Disallow: /local/
    Noindex: /handle/
    Disallow: /handle/
    Noindex: /Handle/
    Disallow: /Handle/
    Noindex: /localsites/
    Disallow: /localsites/
    Noindex: /search/
    Disallow: /search/
    Noindex: /Search/
    Disallow: /Search/
    Disallow: ?

    | eLab_London
    0

  • I came across an article mentioning this as a strategy for getting product pages (which are tough to get links for) some link equity. See #21, content flipping: https://www.matthewbarby.com/customer-acquisition-strategies Has anyone done this? It seems like this isn't what the tag is meant for (the basic mechanic is sketched below), and Google may see it as deceptive? Any thoughts? Jim

    | jim_shook
    0
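
    For reference, the mechanic the article describes hinges on one tag: a content page declares the product page as its canonical, hinting that the product URL should inherit the content page's signals. A sketch with hypothetical URLs:

      <!-- placed in the <head> of a blog post "flipping" to a product page -->
      <link rel="canonical" href="https://example-shop.com/product/blue-widget/">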

  • I have previously switched sites to https but this one is behaving a little different. On September 19th I switched to https. I did 301 redirects at the .htaccess, added it to search console, and since we are using Magento I changed the base url. In the past when I have done this the http site index just gradually drops while the https site gradually rises. In early October the http site started to slightly drop but since 10/23 there have been no changes. For the https site it started to increase up until 10/23 then stayed flat. Why have they stayed stuck like that for a month?

    | Tylerj
    0

  • Hi, Our rankings in India and the USA vary inversely: when we improve in India, we drop in the USA, and vice versa. The possible correlation I see from my end is this: we reclaimed some links (via redirects) and later removed them, and the effect shows up both when the redirects are added and when they are removed. When we reclaimed the links, we improved in India and dropped in the US; when we removed those redirects, the reverse happened. Any idea on this? Please share. Thanks, Satish

    | vtmoz
    0

  • This page is showing up as a 404 in Google Search console- https://www.wolfautomation.com/blog/autonics/ It shows it has been linked from these pages- https://www.wolfautomation.com/blog/raffel/ https://www.wolfautomation.com/blog/new-temperature-controllers-from-autonics/ https://www.wolfautomation.com/blog/ge-industrial/ https://www.wolfautomation.com/blog/temp-controller/ https://www.wolfautomation.com/blog/tx4s/ I never created this page, I don't want this page but it keeps showing up. The problem is the link isn't found on those pages anywhere so I can't delete it. What am I missing? How can I get rid of it?

    | Tylerj
    0

  • We used the Screaming Frog tool to crawl this website and get a list of all meta titles from the site; however, it only returned one result - the homepage. We then sought to obtain a list of the URLs of the site by creating a sitemap using https://www.xml-sitemaps.com/. Once again, however, we just got the one result - the homepage. Something seems to be restricting these tools from crawling all pages. If anyone can shed some light on what this could be, we'd be most appreciative.

    | Gavo
    0

  • Hello! Our website uses a JavaScript redirect (for legal reasons). I noticed that pages that have the JS redirect don't get the same page authority; for example:
    the old home page has 60 PA while the new home page gets only 22 PA. I know that Google doesn't have a problem with JS redirects and passes all the juice like a regular 301,
    but all SEO tools are struggling to understand it. Why? (A sketch of the pattern is below.) Does anyone know what I'm talking about?

    | Roi.Bar
    0
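
    A sketch of the kind of redirect being described, with a hypothetical target URL. Googlebot renders and follows it much like a 301, but most third-party SEO tools don't execute JavaScript, so their PA-style metrics stay pinned to the old URL:

      <script>
        // client-side redirect; replace() avoids polluting browser history
        window.location.replace("https://example.com/new-home/");
      </script>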

  • Hi-diddly-ho SEO gurus, quick question. I just saw this article and wanted to get thoughts from the people here. https://www.searchenginejournal.com/google-says-now-ok-put-content-behind-tabs/178020/ I am constantly at war with our UX guy on this subject because he believes, along with our CEO, that tabbed and accordion style information is better from THE UX standpoint. Less clutter on a page but with information still readily available. I am not here to argue that point but was wondering if you agree with the article posted here. I had to inform them their roll needed to be slowed until I could get something a little more concrete on the matter.

    | spadedesign
    0

  • Hi, We have cigarettes and viagra as keywords in our sub-domain, where our clients can post their business content. We have a decent number of impressions and clicks for these related keywords. I have seen that these two words, especially "viagra", are among the most spammed. So are they hurting us? We dropped post-Penguin update. Any correlation? Do you think these keywords penalise us? We have no messages or suggestions from Google. Thanks, Satish

    | vtmoz
    0

  • Hello all - I've just been trying to crawl a site with Screaming Frog and can't get beyond the homepage - have done the usual stuff (turn off JS and so on) and no problems there with nav and so on - the site's other pages have indexed in Google btw. Now I'm wondering whether there's a problem with this robots.txt file, which I think may be auto-generated by Joomla (I'm not familiar with Joomla...) - are there any issues here? [just checked... and there isn't!]

    # If the Joomla site is installed within a folder such as at
    # e.g. www.example.com/joomla/ the robots.txt file MUST be
    # moved to the site root at e.g. www.example.com/robots.txt
    # AND the joomla folder name MUST be prefixed to the disallowed
    # path, e.g. the Disallow rule for the /administrator/ folder
    # MUST be changed to read Disallow: /joomla/administrator/
    # For more information about the robots.txt standard, see:
    # http://www.robotstxt.org/orig.html
    # For syntax checking, see:
    # http://tool.motoricerca.info/robots-checker.phtml

    User-agent: *
    Disallow: /administrator/
    Disallow: /bin/
    Disallow: /cache/
    Disallow: /cli/
    Disallow: /components/
    Disallow: /includes/
    Disallow: /installation/
    Disallow: /language/
    Disallow: /layouts/
    Disallow: /libraries/
    Disallow: /logs/
    Disallow: /modules/
    Disallow: /plugins/
    Disallow: /tmp/

    | McTaggart
    0

  • Hi Guys, Hope you'll be able to help me with a technical problem I am facing right now. We are a company, right? We own 2 websites. Let's say one sells car parts; the other one buys second-hand car parts to refurbish and sell. (It is not our actual case, just an example very similar to ours.) sellparts.com buyparts.com Both are ecommerce websites with large catalogues (7000 SKUs). sellparts sells a lot and is a big actor in its market. buyparts.com doesn't work and has a really low DA. My new external SEO consultant, whom I am not too convinced about, is telling me to cross-link the sites at product level using cross-linking extensions. He wants to have them dofollow. That would mean having hundreds or thousands of links with really similar linking patterns: buy [parts] [model] [make], sell [parts] [model] [make]. That to me seems a bit too much, and I am worried it compromises the sellparts site's SEO. So should I nofollow the links (see the sketch below)? Or do it differently?

    | Kepass
    0
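
    If the cross-links stay, nofollowing them keeps them useful to visitors while passing no equity and creating no anchor-text pattern. A sketch (URL and anchor text are illustrative):

      <a href="https://buyparts.example/sell-alternator" rel="nofollow">
        Sell your used part
      </a>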

  • Hi, A client of mine who owns a website reached out to me. He got penalized a while ago and has long since recovered (not sure exactly when, but at least a year ago). His domain authority is in the upper 30s, but he is still not ranking for many of the keywords he used to rank on the first page for. I am not so familiar with the technical aspects of penalties and such, but is this a common scenario? Why is his domain authority great but his ranking downright awful? Does he have a chance if he builds great links, or is something else wrong that we can't figure out?

    | Rachel_J
    0

  • Hi, I run a travel photography business. The primary function of the website is to sell prints, though I blog about my travels on the same domain name, as well as posting a few pieces of content that are helpful to users interested in some of the places I travel to. I do okay with it, but obviously I am always looking for a way to increase visibility and sales of prints. I own a couple of high-quality keyword domain names that I've been trying to figure out what to do with. One of them is for the city that my photography prints are probably best known for. The domains I'm really trying to decide what to do with are basically www.citystatephotography.com and www.citystatephotos.com, where the city and state are the ones I'm targeting. The question is, what do I do with them? I've seen various ideas from other photographers that have had various levels of success. Here are the options I'm considering: 1. Just redirect them to the photo gallery I'm trying to rank highly for. From what I read on various blogs, this doesn't really do much of anything, but maybe I've read wrong? 2. Create a website or microsite with some quality content related to the city that also links back to my photography website in various places, and possibly once in the navigation. I do have quality content I could put up that would be helpful to people from the city, beyond just trying to get sales. But there's always a chance this will cannibalize my original domain without helping sales, I assume? 3. Split my photo galleries across two domains. Most of my photography galleries would stay on my main domain, but the photo galleries that are key to that city would be hosted on the citystatephotography.com domain name. I've seen a photographer from Colorado do quite well with this method (www.imagesofrmnp.com and www.morninglight.us). He's heavily known for his images of Rocky Mountain National Park, and that seems to be his main brand, but all of his non-RMNP travel photography goes on the other site. The two sites look almost identical, though they link back and forth fairly extensively. There doesn't seem to be much in the way of duplicate content either. I've considered this method, but I'm nervous I'll kill what I've already built up if this were to fail. 4. Do nothing with the domains. That seems wasteful, as these domains, particularly citystatephotography.com, seem useful in some way. Any thoughts? Thanks in advance!

    | shannmg1
    0

  • We have a client that will not grant us access to their Google Search Console (don't ask us why). Is there any way to submit an XML sitemap to Google without using GSC (see the sketch below)? Thanks

    | RosemaryB
    0
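
    Yes: crawlers can discover a sitemap from robots.txt with no Search Console involved, and Google also accepted a plain HTTP ping at the time (http://www.google.com/ping?sitemap=<sitemap-url>). A robots.txt sketch with a placeholder domain:

      # robots.txt - the Sitemap directive is independent of any user-agent
      # block and must use an absolute URL
      Sitemap: https://clients-site.example/sitemap.xml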

  • Hi all, On my website, I would like to use CSS to set the anchor text of a link to "website design service" (my company provides web design services) but show the button text as "website", for artistic reasons. So the anchor text for the link is "website design service" but what users see is "website". Does this sound spammy to Google? Is it a risky move that might hurt my SEO? Looking for some advice here. Thank you very much. Best,

    | Raymondlee
    0

  • This September, Google announced a new crackdown on widget links. But they clearly still work, so IMO it's a matter of scale and usage. Years ago I started recommending changing links within widgets to use branded anchor text instead of keyword-rich anchor text, so as not to create an unusual amount of keyword-focused anchor text. It's also clearly more natural. So far this has been working very well. The new warning is concerning, and I recognize the "best practice" according to Google would be to nofollow these links, but I'm not quite ready to do this unless a risk of an unrecoverable penalty is apparent. My thought is that it's a matter of scale. If there are tens of thousands of widget links and they dominate the link profile, that would be a serious matter. If there are only thousands of widget links and they are a small part of the total link profile, it is much less of a concern. Does anyone have any direct experience with getting warnings on this matter?

    | Envoke-Marketing
    1

  • Hi Mozzers, I referred a client of mine (last time) to a programmer who could transition their site from HTTP to HTTPS. They use a WordPress website and currently use EPS Redirects as a plugin that 301-redirects about 400 pages. Currently, the way EPS Redirects is set up (as shown in the attachment) is simple: on the left side you enter your old URL, and on the right side is the newly 301'd URL. But here's the issue: since my client made the transition to HTTPS, the whole WordPress backend is set up that way as well. What this means is, if my client finds another old HTTP URL that he wants to redirect, this plugin only allows them to redirect HTTPS to HTTPS. As of now, all old HTTP-to-HTTPS redirects STILL work even though the left side of the plugin switched all URLs to a default HTTPS. But my client is worried that with the next plugin update he will lose all HTTP-to-HTTPS redirects. When we asked our programmer to add all 400 redirects to .htaccess, he stated that's too many redirects and could slow down the website. Well, we don't want to lose all 400 301s and jeopardize our SEO. Question: what does everyone suggest as an alternative solution/plugin to redirect old HTTP URLs to HTTPS, and future HTTPS-to-HTTPS URLs (one server-side option is sketched below)? Thank you all!

    | Shawn124
    0
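
    On the .htaccess concern: a few hundred one-to-one redirects are usually harmless, but if the list keeps growing, Apache's RewriteMap moves the lookups into a single keyed file (note it must live in the vhost/server config, not .htaccess). A sketch with hypothetical paths:

      # In httpd.conf / the vhost:
      RewriteEngine On
      RewriteMap redirects txt:/etc/apache2/redirects.map
      RewriteCond ${redirects:%{REQUEST_URI}} !=""
      RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]

      # /etc/apache2/redirects.map (one "old new" pair per line):
      # /old-page-1/  https://example.com/new-page-1/
      # /old-page-2/  https://example.com/new-page-2/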

  • We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-ssl (http) requests, the server automatically redirects to the SSL (https) URL using a good old fashioned 301. This is great except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean. Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200 Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404 Google is correctly re-indexing all the "good" pages and just displaying search results going directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS. So we can't detect a 404 and serve it up first. The CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?

    | boxclever
    0

  • Yesterday a client discovered that our staging URLs were being indexed in Google. This was due to a technical oversight from our development team (they forgot to upload meta robots tags). We are trying to remove this content as quickly as possible. Are there any methods in Google Search Console to expedite this process (one complementary server-side option is sketched below)? Thanks

    | RosemaryB
    0
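
    Beyond Search Console's Remove URLs tool, a server-level noindex header covers every file type on staging without editing templates. A sketch for the staging vhost or .htaccess, assuming Apache with mod_headers enabled:

      Header set X-Robots-Tag "noindex, nofollow"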

  • Hi, I work on an ecommerce shop and we've discussed changing some of our categories. We have one named Cupboards & Lockers, but we want to split this out into Cupboards and then Lockers, so customers can browse through our main navigation this way. For SEO, I know our rankings will initially be affected, but long term, moving categories up a level will be an improvement and more relevant. Has anyone done this before and could provide any advice? Becky

    | BeckyKey
    0

  • What's the best-optimised linking between sub-domains and domains? Every page links to the website at the top with the logo; do we also need to link the sub-domain from all of its pages? If example.com is the domain and example.com/blog is the sub-domain or sub-folder: Do we need to link to example.com from /blog? Do we need to give the /blog link on all pages of /blog? Is there any difference between connecting domains with sub-domains and with sub-folders?

    | vtmoz
    0

  • Hello, I'm new to the Moz forums and was wondering if anyone out there could help with a query. My client has an ecommerce site selling a range of pet products, most of which have multiple items in the range for different size animals, i.e. [Product name] for small dog
    [Product name] for medium dog
    [Product name] for large dog
    [Product name] for extra large dog I've got some really great rankings (top 3) for many keyword searches such as
    '[product name] for dogs'
    '[product name]' But these rankings are for individual product pages, meaning the user is taken to a small dog product page when they might have a large dog, or vice versa. I felt it would be better for users (and for conversions and bounce rates) if there were a group page showing all products in the range, with which I could target the keywords '[product name]' and '[product name] for dogs'. The page would link through to the individual product pages. I created some group pages in autumn last year to trial this and, although they are well optimised (score of 98 on Moz's optimisation tool), they are not ranking well. They are indexed, but way down the SERPs. The same group page format has been used for the PPC campaign, and the difference to the retention/conversion of visitors is significant. Why are my group pages not ranking? Is it because my client's site already has good rankings for the target term and Google does not want to show another page of the site and muddy results?
    Is there a way to prioritise the group page in Google's eyes? Or bring it to Google's attention? Any suggestions/advice welcome. Thanks in advance Laura

    | LauraSorrelle
    0

  • Does putting a blog on a proxy server (then pointing it at the main site) hurt SEO? I.e., can Google tell? And if they can, does it matter? My server people won't use PHP on their servers, but we want a WordPress blog. So their suggested solution is that they put the blog on a proxy server and point it at the ourdomain.com/blog subfolder on our site. So to all intents and purposes it's hosted in the same place. They assure me this is normal practice and point out that our (main site) images are already being sourced from a CDN. Obviously we'll deal with Google not seeing two separate versions of the same site. But apart from this, is there any negative effect we could suffer from in SEO terms?

    | abisti2
    0

  • Hi, Our rank dropped, and we noticed it's a major drop in the "Mobile" devices category, which is contributing to the overall drop. What exactly drops mobile rankings? We do not have any messages in Search Console. We have made a few redirects and removed footer links. How do these affect mobile rankings? Thanks,
    Satish

    | vtmoz
    0

  • Hi, I haven't tested this yet, so before I do I wanted to see if anyone has experience with this. I have lower-level categories I want to rank. For example, say I want to rank 'Standard Metal Lockers'. With the way our site is set up, I have to work within a classification, which isn't always easy. So it would be categorised as follows: Cupboards & Lockers > Lockers > Standard Lockers > Standard Metal Lockers. The URL structure would remain /standard-metal-lockers, and I would link this from the 'Lockers' page. Is this too deep in the site structure to rank? I think if it's linked properly and promoted it will be fine, but I'd like to see if anyone else has had this issue. Becky

    | BeckyKey
    0

  • Hello guys, Yesterday, I used SEMrush to search for the keyword "branding agency" to see the SERP. The Liquidagency ranks 5th on the first page. So I went to their homepage but saw no exact keywords "branding agency", even in the page source. Also, I didn't see "branding agency" as a top anchor text in the external links to the page (from the report of SEMrush). I am an SEO newbie, can someone explain this to me, please? Thank you.

    | Raymondlee
    0

  • Howdy Moz, I have noticed a common anomaly across the majority of my client accounts (see attached image). They have lost thousands of organic keywords worldwide (no loss in UK rankings though, which are the ones that matter). Has there been an algo update? Seems strange. Thanks, Joshua

    | AscentGroup
    0

  • Hi Guys, I'm currently working with a couple of Shopify ecommerce sites. Currently the main category URLs cannot be optimised for SEO, as they are auto-generated and basically filtered pages. Examples: http://tinyurl.com/hm7nm7p http://tinyurl.com/zlcoft4 One solution we have come up with is to create HTML-based pages in the backend for each of these categories, for example http://site.com.au/collections/women-sandals, while keeping the filtered-page setup, so these pages can be crawled and indexed. I was wondering if this is the most viable solution to this problem for Shopify? Cheers.

    | jayoliverwright
    0

  • Hello, Simple question - Should we be redirecting our HTTP pages to HTTPS? If yes, why, if not, why? Thanks!

    | HB17
    0


