
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi there, I have a URL structuring / redirect question. I have many pages on my site, but I set each page up to fall under one of two folders because I serve two unique markets and want each side to be indexed properly. I have SIDE A: www.domain.com/FOLDER-A and SIDE B: www.domain.com/FOLDER-B. The problem is that I have a page for www.domain.com and www.domain.com/FOLDER-A/page1, but I do NOT have a page for www.domain.com/FOLDER-A. The reason for this is that I've opted to make what would be www.domain.com/FOLDER-A be www.domain.com and act as the primary landing page of the site. As a result, there is no page located at www.domain.com/FOLDER-A. My WordPress template (Divi by Elegant Themes) forced me to create a blank page to be able to build off the FOLDER-A framework. My question is: given I am forced to have this blank page, do I leave it be, or create a 302 or 307 redirect to www.domain.com? I fear using a 301 redirect given I may want to utilize this page for content at some point in the future. This isn't the easiest post to follow, so please let me know if I need to restate the question. Many thanks in advance!
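For what it's worth, the temporary option being considered could be sketched in .htaccess roughly like this (a sketch only, assuming the site runs on Apache; FOLDER-A stands in for the real folder name):

```apache
# 302 = temporary: the blank page redirects to the homepage for now,
# but the URL stays available to reclaim for content later.
Redirect 302 /FOLDER-A https://www.domain.com/
```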

    | KurtWSEO
    0

  • Hi, I periodically use the Google site: command to confirm that our clients' websites are fully indexed. Over the past few months I have noticed a very strange phenomenon which is happening for a small subset of our clients' websites... basically, the home page keeps disappearing and reappearing in the Google index every few days. This is isolated to a few of our clients' websites, and I have also noticed that it is happening for some of our clients' competitors' websites (over which we have absolutely no control). In the past I have been led to believe that the absence of the home page in the index could imply a penalty of some sort. This does not seem to be the case, since these sites continue to rank the same in various Google searches regardless of whether or not the home page is listed in the index. Below are some examples of our clients' sites where the home page is currently not indexed - although they may be indexed by the time you read this and try it yourself. Note that most of our clients are in Canada. My questions are: 1. Has anyone else experienced/noticed this? 2. Any thoughts on whether this could imply some sort of penalty, or could it just be a bug in Google? 3. Does Google offer a way to report stuff like this? Note that we have been building websites for over 10 years, so we have long been aware of issues like www vs. non-www, canonicalization, and meta content="noindex" (been there, done that in 2005). I could be wrong, but I do not believe that the site would keep disappearing and reappearing if something like this was the issue. Please feel free to scrutinize the home pages to see if I have overlooked something obvious - I AM getting old.
    site:dietrichlaw.ca - this site has continually ranked in the top 3 for [kitchener personal injury lawyers] for many years.
    site:burntucker.com - since we took over this site last year it has moved up to page 1 for [ottawa personal injury lawyers].
    site:bolandhowe.com - #1 for [aurora personal injury lawyers].
    site:imranlaw.ca - continually ranked in the top 3 for [mississauga immigration lawyers].
    site:canadaenergy.ca - ranks #3 for [ontario hydro plans].
    Thanks in advance! Jim Donovan, President, www.wethinksolutions.com

    | wethink
    0

  • Hi All - Quick question that I think I know the answer to, but I feel like I've been going around in circles a bit. My client is launching a new product and wants us to build a microsite for it (product.clientname.com). My client really dislikes their brand website, and wants to use paid media to push their audience to this new microsite. However, they also said they want it to rank well organically. I feel uneasy about this, because of the subdomain vs. subfolder argument. I believe that the product will also be listed/featured on their main brand website. What is the best way forward? Thanks!

    | AinsleyAgency
    0

  • Hi everyone. This may seem a bit obvious, but I am getting conflicting answers on this. We have a client that has a wiki that is basically an online manual of their software. They do it like this because the manual is so big and is constantly developing; there are thousands of pages with loads of links pointing to various sections of relevance on the main site as well. The majority of these are nofollow, but I have noticed that they have a single link in the navigation that is a direct, followed link to their main site - obviously this is sitewide. Would this be seen as detrimental to the main site? Should I have this set as nofollow as well? Thanks in advance

    | Andrew_Birkitt
    0

  • For example: shower cabins (660), used in our on-page navigation, which links to a product list page.

    | Maxaro.nl
    0

  • Hello everyone! I'm working with a site right now that is currently formatted as subdomain.domain.net. The old version of the site was formatted as domain.net, with domain.com and several other variants redirecting to the current format, subdomain.domain.net. All of these redirects are 302, and I'm wondering if I should have all these changed to 301. Many of our old backlinks go to the old format of domain.net and I know the juice isn't being passed through, but I was wondering if there is any reason why you may want a 302 over a 301 in this case? Any insight would be appreciated. Thanks!
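If the decision is to make the redirects permanent, a minimal .htaccess sketch of the 301 version would look roughly like this (assuming Apache with mod_rewrite; the hostnames are the placeholders from the question):

```apache
# 301 = permanent: passes link equity from the old hostnames to the new one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.(net|com)$ [NC]
RewriteRule ^(.*)$ https://subdomain.domain.net/$1 [R=301,L]
```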

    | KathleenDC
    0

  • Firstly, this is quite extensive so thank you to anyone who answers some or all of the below! This is quite a lengthy ordeal, and I'm going to start by saying that I'm no SEO expert (yet). I've paid for SEO for years and only on the odd occasion has it made any real difference. It has come to the point now where I've spent so much money on SEO over the years with practically no benefit that I can't afford to do it anymore, so I am teaching myself. So, back in July my website was hacked for a total of three weeks. My SEO/hosting company at the time didn't do anything about it, let the hack sit there and didn't even take the site offline. It just so happened that I was changing over to a new site at the time anyway, so I launched the new site (completely different in structure to the old one), did all of the relevant 301 redirects, and my traffic hasn't recovered since. I have gone from around 100-150 daily visits to 0-10. The descriptions, keywords, alt image tags, H1 & H2, meta data, etc. are all much better on the new site than on the previous site (a lot of it was empty on the previous site), so I was assuming it would perform better, but it isn't. Anyone got any suggestions as to why this might be? Here are some specific questions: Canonical problem? My site is ecommerce and lists some products in several categories, which has resulted in a high duplicate content rate. Is it expected/accepted by Google that this would be the case for an ecommerce website, or do I need to sort out some serious canonical URLs to fix the issue? The site structure of my website could also be a problem, but I'm not qualified enough to know for sure. If you view a product/sub-category, then remove the category section of that link, the product will still appear. I don't know if this structure is good or not, i.e. if you visit both links below, the same page appears:
http://thespacecollective.com/space-clothing/nasa-and-space-t-shirts http://thespacecollective.com/nasa-and-space-t-shirts Is this a problem for SEO? Duplicate product tag problem? I have many duplicate product tags appearing on many products; should these be blocked in the robots.txt? i.e. http://thespacecollective.com/space-memorabilia/space-flown/apollo-11-flown-cm-meteorite-acrylic http://thespacecollective.com/space-memorabilia/space-flown/apollo-11-flown Site code structure: When choosing the template I would use for my website, I did not stop to consider whether the code was SEO-friendly; this on my part was due to my ignorance on the subject. Is the site structure SEO-friendly or is it hindering my efforts? Website: http://thespacecollective.com Again, thank you to anyone who takes the time to read/care about the issues facing a newbie. My only option now is to learn SEO myself (which is well overdue), so any advice/answers are appreciated!
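For the duplicate-category part of the question, the usual fix is a canonical link in the head of each duplicate path; a minimal sketch using the URLs from the question (which of the two URLs to prefer is a judgment call, the shorter one is assumed here):

```html
<!-- Placed in the <head> of the /space-clothing/... duplicate,
     pointing at the version chosen to be indexed. -->
<link rel="canonical" href="http://thespacecollective.com/nasa-and-space-t-shirts" />
```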

    | moon-boots
    0

  • Hi everyone I have a question on trailing slashes in URLs. The crux of it is this: is having both example.com/subdirectory/ and example.com/subdirectory on all of your subdirectories considered duplicate content by Google - or in some other way really bad? We have done a heck of a lot of research into this, and it would seem... no one knows for sure (it is easy to get lost in a sea of Webmaster tool forums from 2012). Google itself has both URLs for its subdirectories (try https://www.google.co.uk/maps and https://www.google.co.uk/maps/), as does Moz; and yet there are some rumblings on the internet from people who think you must put a 'redirect' (although not really a redirect, as it isn't a 301) in your htaccess file to one or the other (so example.com/subdirectory/ would 'forward' to example.com/subdirectory); and this is what bbc.co.uk does. We tried putting this htaccess 'forward' in as an experiment, but I noticed our site then stopped being fully crawled by Googlebot, so we reversed it. Can anyone shed any light?
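For reference, the kind of .htaccess rule being described is an ordinary 301 that normalizes every subdirectory URL to one form; a sketch (assuming Apache with mod_rewrite), here preferring the no-slash version:

```apache
# 301 any non-directory URL ending in a slash to the slash-less form,
# so only one variant of each URL is crawled and indexed.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

If adding such a rule coincides with crawl problems, it is worth checking for redirect loops (e.g. paths that legitimately end in a slash, such as real directories, being caught by the rule).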

    | NickOrbital
    0

  • Hello Mozzers! Apologies if this question has been asked before, but I couldn't find an answer, so here goes... Currently I have one robots.txt file hosted at https://www.mysitename.org.uk/robots.txt We host our shop on a separate subdomain: https://shop.mysitename.org.uk Do I need a separate robots.txt file for my subdomain? (Some Google searches are telling me yes and some no, and I've become awfully confused!)
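To the question above: robots.txt is per-host, so each subdomain needs its own file at its own root; the www file is never consulted for the shop host. An illustrative file for the shop subdomain (the Disallow path is purely a hypothetical example):

```txt
# Served from https://shop.mysitename.org.uk/robots.txt
# These rules apply only to the shop subdomain, not to www.
User-agent: *
Disallow: /checkout/
Sitemap: https://shop.mysitename.org.uk/sitemap.xml
```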

    | sjbridle
    0

  • Hi Moz folks, We have launched an international site that uses subdirectories for regions and have had trouble getting pages outside of USA and Canada indexed. Google Search Console accounts have finally been verified, so we can submit the correct regional sitemap to the relevant search console account. However, when submitting non-USA and CA sitemap files (e.g. AU, NZ, UK), we are receiving a submission error that states, "Your Sitemap appears to be an HTML page," despite them being .xml files, e.g. http://www.t2tea.com/en/au/sitemap1_en_AU.xml. Queries on this suggest it's a W3 Cache plugin problem, but we aren't using Wordpress; the site is running on Demandware. Can anyone guide us on why Google Search Console is rejecting these sitemap files? Page indexation is a real issue. Many thanks in advance!
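For comparison, a sitemap that Search Console will accept has roughly the shape below; the "appears to be an HTML page" error usually means the server is actually returning an HTML error or redirect page at the sitemap URL, so fetching it directly and checking the Content-Type header (it should be an XML type, not text/html) is a good first step. The loc and lastmod values here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.t2tea.com/en/au/example-page.html</loc>
    <lastmod>2016-08-01</lastmod>
  </url>
</urlset>
```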

    | SearchDeploy
    0

  • Hi, If a site is quite new and set up as https from the beginning, why would http variations exist? There are 301 redirects in place from the http to the https variation, and also canonical tags pointing back to the http variation. This seems contradictory to me. I'm not sure why the http variations exist at all, but they have gone to the trouble of redirecting these to the https variation, indicating that it is the variation of choice, while at the same time using a canonical tag that indicates the http variation is the original/main URL. Thanks
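For clarity, the non-contradictory setup has both signals agree: the canonical on every page should point at the same https variation that the redirects already favour (example.com below is a placeholder):

```html
<!-- The redirect says "use https"; the canonical should say the same. -->
<link rel="canonical" href="https://www.example.com/page" />
```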

    | MVIreland
    0

  • I manage a store selling prescription glasses, many of which are unisex or apply to more than one category. I have already assigned the canonical URL for each category, but my question is: if a product appears in more than one category, do I need to set the canonical URL in each product to reflect the category I want it to index under? That way, any additional categories that product appears in simply refer the link value back to the canonical URL. I note that in Yoast, under each product, there's a note in the canonical setting to leave it empty to default to the permalink, so this has confused me a little. I'm just concerned that by applying a product to multiple categories, it may be causing duplicate content, as I have a lot of duplicate issues which I'll raise in another question. Thanks!

    | SushiUK
    1

  • I am managing a WooCommerce store selling prescription glasses/spectacles. We have a lot of categories with similar names and I want to adopt the best possible naming convention to get the best from search. So we have a number of similar categories for both men's and women's glasses. Currently they are named as follows: Women's Glasses
    -Women's Rimless Glasses
    -Women's Semi Rimless Glasses
    -Women's Plastic Glasses
    -Women's Metal Glasses
    -Women's Retro Glasses
    Currently, this results in the following URL structure for sub categories: https://www.glassesonspec.co.uk/product-category/womens-glasses-2/womens-rimless-glasses/ (For some reason WooCommerce is adding -2 to the end of the primary category name and will not let me change it; this is the subject of a further investigation!) So, first question: is there too much duplication of the word "glasses" on the sub items? For example, should they read: Women's Glasses
    -Rimless
    -Semi Rimless
    -Plastic
    -Metal
    -Retro
    Hence giving this URL structure: https://www.glassesonspec.co.uk/product-category/womens-glasses-2/rimless/ OR, should we change the top level category name to just Women's and let the sub categories complete the picture? Women's
    -Rimless Glasses
    -Semi Rimless Glasses
    -Plastic Glasses
    -Metal Glasses
    -Retro Glasses
    Giving this example URL structure: https://www.xxxxxxxxxxxxx.co.uk/product-category/womens/rimless-glasses/ This would solve my hyphenation problem; however, my fear is that the top level category on its own is not descriptive enough when viewed standalone: https://www.xxxxxxxxxxxxx.co.uk/product-category/womens/ The second part of my question relates to how to deal with the change in URL structure. I am using Yoast Premium, so will that pick up the changes and automatically redirect to the new URLs as it does when done manually? Or will I need to take a different approach using .htaccess commands? I hope the above makes sense. Many thanks, Bob
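On the second part of the question: whether Yoast Premium picks the change up automatically is worth verifying, but as a fallback the move can be sketched as one 301 per renamed category in .htaccess (assuming Apache; the paths are taken from the structures discussed above):

```apache
# Old sub-category URL on the left, new one on the right; repeat per category.
Redirect 301 /product-category/womens-glasses-2/womens-rimless-glasses/ https://www.glassesonspec.co.uk/product-category/womens/rimless-glasses/
```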

    | SushiUK
    0

  • The new theme I am working in adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and is becoming prevalent in new themes. How does this impact SEO?

    | DML-Tampa
    0

  • Hi Our site's front page has almost 900 internal links on it (it's an ecommerce site with about 25,000 products). A lot of these are in a pretty involved dropdown menu, which is on every page. I can't really do anything to get this figure down (it's outside my remit), but one thing the developers have done is make all the menu links nofollow on the mobile version of the menu (the site is responsive) - otherwise there would be even more links! My question is whether doing this for the mobile menu is a good idea in terms of SEO?

    | abisti2
    1

  • HTML Improvements in Webmaster Tools shows many pages as having duplicate titles. As attached, they are not duplicates; we made a way to create text hyperlinks automatically when a name matches other objects on our site. How can we deal with this case so Google does not treat them as 2 different URLs when they are really one? The ones with ?alinks are just hyperlink URLs. Say we have a name such as "James" and he has a biography on our site. Say "Gerald" has a bio as well, and we mention "James" in Gerald's bio: the word "James" gets a hyperlink automatically, so when anyone clicks "James" it goes to his bio.

    | ArchieChilds
    0

  • Recently a site moved from blog.site.com to site.com/blog with an instruction like this one: /etc/httpd/conf.d/site_com.conf:94: ProxyPass /blog http://blog.site.com
    /etc/httpd/conf.d/site_com.conf:95: ProxyPassReverse /blog http://blog.site.com It's a WordPress.org blog that was set up as a subdomain and is now being proxied to look like a directory. That said, the robots.txt file seems to be ignored by Googlebot. There is a Disallow: /tag/ in that file to avoid "duplicate content" on the site. I have tried this before with other WordPress subdomains and it works like a charm, except for this time, in which the blog is rendered as a subdirectory. Any ideas why? Thanks!

    | rodelmo4
    0

  • Hey guys, I was thinking about creating subdomains for one of my websites. I want to divide my website in different subdomains (blog.[site].com / directory.[site].com / etc.) but I'm afraid that this will negatively impact my rankings. My blog for example has a lot of supporting content for my products and services that are primarily hosted on the homepage. Have you guys ever created subdomains at a later stage of your website's existence? What kind of impact did you notice? Would you recommend it? Thanks a million!

    | Nizar.
    1

  • What is Google's minimum desktop responsive webpage width? Fetch as Google for desktop is showing a skinnier version of our responsive page.

    | Desiree-CP
    0

  • I'll try to keep this as clear and high level as possible. Thank you in advance for any and all help! We're managing a healthcare practice which specializes in neurosurgical treatments. As the practice is rather large, the doctors have several "specialties" they focus on, i.e. back surgery, facial surgery, brain surgery, etc. They have a main website (examplepractice.com) which holds ALL of their content on each condition and treatment that they deal with. So, if someone enters their main homepage they will see conditions and treatments for all the specialties categorized together. However, linked within the main site are "mini-sites" for each specialty (same domain, same site) (examplepractice.com/brain-surgery), but with a different navigation menu to give the illusion of a "separate website". These mini-sites are then tailored from a creative, content and UX perspective to THAT specific group of treatments and conditions. Now, anyone who enters this minisite will find information pertaining to only that specialty. The mini-sites are NOT set up as folders, but rather just a system of URLs that we have mapped out to each page. We set up the pages this way to maintain an exclusive feel for the site. Instead of someone drilling into a specific condition and having the menu change, we created the copies. But, because of how this is set up, we now have duplicate content for each treatment and condition child page (one on the main site, one on the minisite). My question (finally) is: will this cause a problem in the future? Are we essentially splitting the "juice" between these two pages? Are we making it easier for our competitors to outrank us? We know this layout makes sense from the perspective of a user, but we're unclear how to move forward from a search perspective. Any tips?

    | frankmassanova
    1

  • The people in my client's office get different results when they search for their company name in Google.  For example one person ALWAYS gets the right rail knowledge panel with full details about the company while her boss NEVER sees it. They are both on desktop search. Rosemary

    | RosemaryB
    0

  • question answered

    | EPPD
    0

  • Hi all, this is my first post so be kind 🙂 - I have a one page WordPress site that has the Yoast plugin installed. Unfortunately, when I first submitted the site's XML sitemap to the Google Search Console, I didn't check the Yoast settings and it submitted some example files from a theme demo I was using. These got indexed, which is a pain, so now I am trying to remove them. Originally I did a bunch of 301s, but that didn't remove them from the index (at least not after about a month) - so now I have set up 410s. These also seem not to be working, and I am wondering if it is because I re-submitted the sitemap with only the index page on it (as it is just a single page site) - could that have now stopped Google crawling the original pages to actually see the 410s?
    Thanks in advance for any suggestions.
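A sketch of the 410 approach in .htaccess (assuming Apache; the paths below are placeholders for the indexed demo pages). Note that Google only registers a 410 when it recrawls the URL, so dropping the URLs from the sitemap does not block this, but it may slow recrawling; requesting removal in Search Console is the faster route:

```apache
# Serve "410 Gone" for the stray theme-demo pages so Google drops
# them from the index faster than it would a 404 or a redirect.
Redirect gone /demo-page-one
Redirect gone /demo-page-two
```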

    | Jettynz
    0

  • Hiya, As you know, you can specify the date of the last change of a document in various places, for example the sitemap, the http-header, ETag and also add an "expected" change, for example Cache-Duration via header/htaccess (or even the changefreq in the sitemap). Is it advisable or rather detrimental to use multiple variants that essentially tell browser/search engines the same thing? I.e. should I send a lastmod header AND ETag AND maybe something else? Should I send a cache duration at all if I send a lastmod? (Assume that I can keep them correct and consistent as the data for each will come from the very same place.) Also: Are there any clear recommendations on what change-indicating method should be used? Thanks for your answers! Nico
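As an illustration of keeping the signals consistent rather than multiplied, one common arrangement (a sketch, assuming Apache with mod_expires; the one-hour lifetime is an arbitrary example) is to let the server emit Last-Modified and ETag automatically and add only a single explicit cache lifetime on top:

```apache
<IfModule mod_expires.c>
  # One explicit freshness lifetime per content type; Last-Modified
  # and ETag are emitted automatically and need no duplication here.
  ExpiresActive On
  ExpiresByType text/html "access plus 1 hour"
</IfModule>
```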

    | netzkern_AG
    0

  • delete

    | poke1
    0

  • So this is a very old problem, but I'm finally getting around to trying to figure it out. My site experienced a dramatic organic traffic drop from Google (-40%) on May 17th 2014. It then dropped another 30% on May 19th 2014. See graphs. According to Moz, these two dates correlate with Payday Loan 2.0 and Panda 4.0. Panda makes complete sense, as this site (www.ausedcar.com) has a large amount of content that is syndicated across other sites (used car inventory is essentially the same everywhere on the Internet). Payday Loan, on the other hand, which seems to be the primary traffic drop, doesn't make any sense at all. Is it possible I started getting hit by Panda on the 17th and then it completed on the 19th? I know the dates for algorithm changes are not perfect. Next, assuming it is Panda, what are some things you guys have done to help with this? As I mentioned, this content is duplicated all over the Internet, so it seems like Google arbitrarily picks winners and losers (my site is twenty years old!). I know I need unique content, but I'm not sure how exactly to do that besides rewriting words so it doesn't appear duplicate.

    | Catbelly
    0

  • Hi, The service we provide has not so many searches per month. A long tail keyword that describes the service well has at most 400 searches per month. We wrote a post for this keyword and we ranked number 1 for many months. Now we're on page 2, and the truth is we stopped writing blog posts because we were ranking well for our best keywords. I added a few new posts and lost ranking on my top keywords, so I gave up, deleted them and recovered the rankings for the keywords I wanted the most. The problem is that I have now lost these positions, and I know we're supposed to be updating the blog regularly. What would you suggest? Should we keep writing about the same thing and use rel canonical? There aren't that many keywords related to what we offer. I appreciate any ideas.

    | Naix
    0

  • I am a Weebly user and have a site (crmi-online.com) which has an RSS feed of my current blog entries on my home page. I am struggling with the following issues... Duplicate content - Moz shows duplicate content for my blog articles, which are also housed here (http://www.crmi-online.com/news-and-press-releases). Could having the feed on my site be causing this issue? Traffic - My Weebly stats show that the RSS feed is my 2nd highest ranking page at 713 views this month. However, when I compare it to the data in Moz, neither the news feed nor the actual blog page is one of my top ranking pages. Can someone help to explain the difference? Also, it appears that my site gets a lot of spam traffic, and I wonder if removing the feed would help and give more accurate numbers. Is it necessary? - Is it necessary to have both the feed and the blog page? I want to move to one of the new Weebly responsive themes after having trouble with a current custom/non-supported template. If I do that, the feed will have to go, since it jumbles the content in mobile view. I am not sure of the need except for the aesthetics on the home page, since I don't see any correlation to traffic to the actual article/blog itself. Thanks in advance for your help!

    | CRMI
    0

  • We're using a lot of videos on our new website (www.4com.co.uk), but our immediate discovery has been that this has a negative impact on load times. We use a third party (Vidyard) to host our videos but we also tried YouTube and didn't see any difference. I was wondering if there's a way of using multiple videos without seeing this load speed issue or whether we just need to go with a different approach. Thanks all, appreciate any guidance! Matt
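One common mitigation is to defer the player iframes so they don't compete with the initial render; a sketch below (the embed URL is a placeholder for the Vidyard/YouTube embed). In browsers without native lazy-loading support, a click-to-load thumbnail facade that only injects the iframe on demand achieves the same effect:

```html
<!-- The iframe is only fetched as it approaches the viewport,
     keeping the player's scripts out of the initial page load. -->
<iframe src="https://play.vidyard.com/EXAMPLE-VIDEO-ID" loading="lazy"
        width="640" height="360" title="Product video"></iframe>
```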

    | MattWatts
    1

  • I had a problem with a lot of crawl errors (on Google Search Console) a while back, due to the removal of a shopping cart. I thought I'd dealt with this & Google seemed to agree (see attached pic), but now they're all back with a vengeance! The crawl errors are all the old shop pages that I thought I'd made clear weren't there anymore. The sitemaps (using Yoast on Wordpress to generate these) all updated 16 Aug but the increase didn't happen till 18-20. How do I make it clear to Google that these pages are gone forever? Screen-Shot-2016-08-22-at-10.19.05.png

    | abisti2
    0

  • Just checking in - i'm working on a site with tons of broken backlinks from high authority sites. For instance, I've discovered that some 90% of their backlinks are broken, and these are from highly recognizable, name brand magazines, newspapers, blogs and the like. Right now, the site has a Domain Authority of 48 (better than most in the industry from what i am learning) yet as the site has been around for years and has gone through 5 redesigns, there is an absolute ton of solid inbound backlinks that are getting 404's. Using Screaming Frog (list mode) I've also learned there are a ton of 301's that turn out to be redirecting to 404 pages so that also starts to add up. I always knew this was a problem / opportunity and I've always considered it a high priority to fix (301) broken links of this sort to improve ranking (you know, using htaccess or WordPress Redirection tools) -- and to avoid multiple redirects wherever possible. In fact, I consider it a basic all-win, no-lose strategy. I always assumed this was the case and I also assume this will continue to be so. However, as a professional, I always want to double check my assumptions every now and then... Is this still considered solid strategy? Are there any issues that one should look out for?

    | seo_plus
    0

  • One of my clients' sites now ranks organically for zero keywords.

    | Beachflower
    0

  • We have a lot of duplicate pages (600 URLs) on our site (800 total URLs), built on the Magento e-commerce platform. We have the same products in a number of different categories to make it easy for people to choose which product suits their needs. If we enable the canonical fix in Magento, will it dramatically reduce the number of pages that are indexed? Surely with more pages indexed (even though they are duplicates) we get more search results visibility. I'm new to this particular SEO issue. What does the SEO community have to say on this matter? Do we go ahead with the canonical fix or leave it?

    | PeterDavies
    0

  • Does Google index images, or ALT text only?

    | ArchieChilds
    0

  • Hello MOZ Community! I have been scouring the web to try and assess what may be responsible for our site's traffic decline, which fell off quite suddenly between March and April of this year. I have heard of two specific things that may have impacted us: the Mobile-Friendly update (April 2016) and the Phantom update (May 2016). I would like to share what we have tried to uncover ourselves, and would welcome your thoughts/feedback on what may have happened, thoughts on repairing the rank damage, and in general. In May we noticed a large cooling off of our organic searches from Google; the largest drop, of more than 30%, occurred in June, but on average we're seeing a trend of 23% fewer visitors from organic search than only months prior. Paired with this was a decrease in overall visibility, as if a number of pages were struck from indexing. The number of URLs ranked in the top 100 listings dropped by ~32% at the beginning of April. We review our visibility with SISTRIX's Visibility Index tool, which showed a visibility plummet of almost 50% between mid-March and mid-April. Traffic remained low until the end of July/early August, and has only really returned to normal levels in the past few days. All that said, we're puzzled by what would cause such a sharp decline in visibility and organic search. Regarding the known/unknown Google updates: while our primary site is responsive, we do have a number of old help/support pages that are not responsive - so that is one theory, but I'd appreciate any thoughts or insights you have, or to know if anyone else experienced a drop within the same timeframe.

    | kernewein
    0

  • My website isn't using responsive design or a separate domain for mobile; instead we use dynamic serving for the mobile version (read more about dynamic serving here). So our website has a different design for each version. What would happen in terms of SEO if our website doesn't show the same content on mobile as on desktop but still aligns with the main content - such as the desktop version having longer content than the mobile version, or the desktop having a long H1 while the mobile one is shorter? What should we do in this case, and how do we tell Googlebot?
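One concrete step for the dynamic-serving setup described above: Google's guidance for dynamic serving is to send a Vary: User-Agent response header, so crawlers and caches know the HTML differs by device. A sketch, assuming Apache with mod_headers enabled:

```apache
# Signal that the response body depends on the requesting user agent.
Header append Vary User-Agent
```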

    | ASKHANUMANTHAILAND
    0

  • Hi MOZ community, I'm hoping you guys can help me with this. Recently our site switched our landing pages to include a 180 item and 60 item version of each category page. They are creating duplicate content problems with the two examples below showing up as the two duplicates of the original page. http://www.uncommongoods.com/fun/wine-dine/beer-gifts?view=all&n=180&p=1 http://www.uncommongoods.com/fun/wine-dine/beer-gifts?view=all&n=60&p=1 The original page is http://www.uncommongoods.com/fun/wine-dine/beer-gifts I was just going to do a rel=canonical for these two 180 item and 60 item pages to the original landing page but then I remembered that some of these landing pages have page 1, page 2, page 3 ect. I told our tech department to use rel=next and rel=prev for those pages. Is there anything else I need to be aware of when I apply the canonical tag for the two duplicate versions if they also have page 2 and page 3 with rel=next and rel=prev? Thanks
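Putting the two signals from the question together: the item-count variants get a canonical to the default view, while the paginated default pages keep their next/prev chain. A sketch using the URLs from the question (the page numbers and parameter values are illustrative):

```html
<!-- On the ?n=180 and ?n=60 variants: -->
<link rel="canonical" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts" />

<!-- On page 2 of the default view: -->
<link rel="prev" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts?p=1" />
<link rel="next" href="http://www.uncommongoods.com/fun/wine-dine/beer-gifts?p=3" />
```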

    | znotes
    0

  • I’ve been trying a couple of new site auditor services this week and they have both flagged the fact that I have some nofollow links to internal pages. I see this subject has popped up from time to time in this community. I also found a 2013 Matt Cutts video on the subject: https://searchenginewatch.com/sew/news/2298312/matt-cutts-you-dont-have-to-nofollow-internal-links At a couple of SEO conferences I’ve attended this year, I was advised that nofollow on internal links can be useful so as not to squander link juice on secondary (but necessary) pages. I suspect many websites have a lot of internal links in their footers and are sharing the love with pages which don’t really need to be boosted. These pages can still be indexed but not given a helping hand to rank by strong pages. This “equity sculpting” (I made that up) seems to make sense to me, but am I missing something? Examples of these secondary pages include login pages, site maps (human readable), policies – arguably even the general contact page. Thoughts? Regards,
    Warren

    | Warren_Vick
    1

  • Hi, This may or may not be an issue, but I would like some SEO advice from someone who has a deeper understanding. I'm currently working on a client's site that has a bespoke CMS built by another development agency. The website currently has a sitemap with one link style - e.g. www.example.com/category/page. This is obviously the page that is indexed in search engines. However, the website structure uses www.example.com/page; this isn't indexed in search engines, as the links are canonical. The client is also using the second URL structure in all its offline and online advertising and internal links, and it's also been picked up by referral sites. I suspect this is not good practice... however, I'd like to understand whether there are any negative SEO effects from this structure. Does Google look at both pages with regard to visits, pageviews, bounce rate, etc. and combine the data, OR just use the indexed version? www.example.com/category/page - 63.5% of total pageviews
    www.example.com/page - 34.31% of total pageviews Thanks
    Mike

    | MikeSutcliffe
    0
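A sketch of the setup being described, using the hypothetical URLs from the question; the canonical on the navigation version explains why it drops out of the index in favour of the other:

```html
<!-- On www.example.com/page: signals are consolidated onto the
     /category/ version, so only that version gets indexed -->
<link rel="canonical" href="https://www.example.com/category/page" />
```

One relevant distinction: canonicals consolidate *indexing* signals, but Google Analytics records pageviews against whichever URL the visitor actually loaded, which is why the two URLs report separate traffic figures.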

  • Hi community, let's say I have 2 e-commerce sites selling the same English books in different currencies: one site serves the UK market (users can purchase in sterling) while the other serves European markets (users can purchase in euros). The sites are identical. SEO-wise, while the "European" site's homepage has a good ranking across major search engines in Europe, its product pages do not rank well at all. Since that site is a .com too, it's hard to push it in local search engines. I would therefore like to push one of the sites across all search engines, tackling duplicate content etc.; geotargeting would do the rest. I would like to add canonical tags pointing at the UK version across all EU site product pages, while leaving the EU homepage to rank. I have 2 doubts though: is it OK to have canonical tags pointing at an external site? And is it OK to have canonical tags on part of a site, while other parts are left ranking?

    | Mrlocicero
    0
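On the first doubt: cross-domain canonicals are explicitly supported by Google, so the setup described is technically possible. A sketch for one EU product page, with invented placeholder URLs:

```html
<!-- On the EU site's product page, pointing at the UK equivalent
     on the other domain (cross-domain canonical) -->
<link rel="canonical" href="https://www.example-uk.com/books/some-title" />
```

On the second doubt: canonicals are evaluated page by page, so annotating only the product pages while leaving the EU homepage without a cross-domain canonical is a coherent configuration.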

  • We have multiple domains on the same C Block IP Address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), forum site, articles site and corporate site. They are all on the same server and hosted by the same web-hosting company. They all have unique and different content. Speaking strictly from a technical standpoint, could this be hurting us? Can you please make a recommendation for the best practices when it comes to multiple domains like these and having separate or the same IP Addresses? Thank you!

    | Motivators
    0

  • I've read in many articles that pages can "pass" rank to other pages internally. Is anyone aware of any well done internal linking case studies which confirm this? If my homepage has the strongest Page Authority, would linking to another page deeper into my website from my homepage boost my rank for the deeper page in Google (more so than linking to the deep page from a page with lower page authority)?

    | poke1
    0

  • Hi, I am banging my head against the wall over a customer's website. Under "duplicate title tags" in GSC, I can see that Google is indexing parameterised versions of many of the URLs on the site. When I check the rel=canonical tag, everything seems correct. My customer is the biggest sports retailer in Norway. Their webshop has approximately 20,000 products, yet they have more than 400,000 pages indexed by Google. So why is Google indexing pages like this? What is missing in this canonical? https://www.gsport.no/herre/klaer/bukse-shorts?type-bukser-334=regnbukser&order=price&dir=desc Why isn't Google just cutting off the ?type-bukser-334=regnbukser&order=price&dir=desc part of the URL? Could it be the canonical tag itself, or could the problem be somewhere in the CMS? Looking forward to your answers. Sigurd

    | Inevo
    0
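For comparison, a correct tag on the filtered URL from the question would look like this (the href strips all parameters back to the clean category URL):

```html
<!-- Expected in the <head> of the parameterised page -->
<link rel="canonical"
      href="https://www.gsport.no/herre/klaer/bukse-shorts" />
```

Two caveats: rel=canonical is a hint, not a directive, so Google can ignore it; and a common CMS bug is echoing the query string back into the canonical href itself, which would make every parameter variant self-canonical and explain the inflated index count.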

  • Hi everyone, I stumbled upon my Google Analytics screen resolution report and saw this: http://imgur.com/a/rwobq These appear to be the same screen resolution, so why are they reported as separate entries with their own sessions in Google Analytics? What's the reason behind this? 😞

    | Francis.Magos
    0

  • Let's say you have a page on your website which displays the current discounts available for iPhones. The page is a list of deals with buttons to reveal a promo code. Would adding contextual content to these pages improve rankings? If the main keywords are already on the page (such as "Save 20% on iPhone 5 with this great iPhone coupon code", where "iPhone coupon code" is the target keyword), does it still make sense to put 500+ words of contextual content on that page, even when the content isn't really something the viewer cares about? I've noticed websites doing this and ranking well, and I wanted to know if this is a significant ranking factor or just a coincidence.

    | poke1
    0

  • We have several pages on our site, like this one, http://www.spectralink.com/solutions, which redirect to a deeper page, http://www.spectralink.com/solutions/work-smarter-not-harder. Both URLs are listed in the sitemap and both pages are being indexed. Should we remove those redirecting pages from the sitemap? Should we prevent the redirecting URL from being indexed? If so, what's the best way to do that?

    | HeroDesignStudio
    0
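If the /solutions URL is meant to remain a redirect, one convention is for the sitemap to list only URLs that return 200, so the trimmed entry would keep just the destination (a sketch based on the URLs in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the final destination page; the redirecting
       /solutions entry has been removed -->
  <url>
    <loc>http://www.spectralink.com/solutions/work-smarter-not-harder</loc>
  </url>
</urlset>
```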

  • Hi Moz fans, our website, https://finance.rabbit.co.th/, offers financial services, so our main keyword is "insurance" in Thai. Today I have an issue regarding canonical tags. We have URLs such as https://finance.rabbit.co.th/car-insurance?showForm=1&brand_id=9&model_id=18&car_submodel_id=30&ci_source_id=rabbit.co.th&car_year=2014, with the canonical set to https://finance.rabbit.co.th/car-insurance, across roughly 5,000 items. Even so, our site audit tool reports a "Duplicate Page Title (Canonical)" warning. Could this hurt our rankings? What should we do: set noindex, nofollow for all URLs that contain ?, or leave them as they are?

    | ASKHANUMANTHAILAND
    0
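The two options mentioned in the question look like this in markup; note that Google advises against combining a noindex with a canonical that points at a different URL, since the two signals conflict, so these are alternatives rather than a pair:

```html
<!-- Option A: keep the existing canonical on the parameterised URLs -->
<link rel="canonical" href="https://finance.rabbit.co.th/car-insurance" />

<!-- Option B (instead of A, not alongside it): block indexing of the
     parameter variants while still letting crawlers follow their links -->
<meta name="robots" content="noindex, follow" />
```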

  • I have added rich snippets to my website and everything is working fine, except for showing the total number of products listed in a particular category. See the screenshot below.

    | promodirect
    0
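Assuming the product-count display is driven by structured data, the schema.org property involved is offerCount on an AggregateOffer; a minimal sketch with placeholder values, not the poster's actual markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "AggregateOffer",
    "offerCount": "120",
    "lowPrice": "9.99",
    "highPrice": "24.99",
    "priceCurrency": "USD"
  }
}
</script>
```

If offerCount is missing or unparsable, the count will not appear even when the rest of the snippet renders.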

  • Hello, I am facing the problem below and would like your help. I will give an example so you can understand. The issue with the Greek language is that most words carry a tone (accent mark) above a letter. Here is the problem: the correct way to write "dissertations" is "πτυχιακές εργασίες", and this is the most important service we offer. But people also search using the same keyword without the tones, like "πτυχιακες εργασιες". If you look at monthly searches in Keyword Planner, the difference in volume is huge. Also, in rankings I am in place 2 for the first phrase but place 3 for the second. The problem is that I cannot put the grammatically incorrect phrase in my page title or in my content, so what should I do? The same problem exists for almost all my keywords: people search without a tone, or search the same phrase using Latin characters, like "ptyxiakes ergasies". I have fully analysed the problem. Please lend a hand. Thanks in advance

    | anavasis
    0

  • So, I'm trying to get a big e-commerce site to work on their page-loading issues. Their question left me without an answer: how fast does a site have to be to get a green light in Google's PageSpeed test? Is there a number in seconds? Do we know that?

    | ziiiva123
    0
