
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Are encoded URLs bad for internal linking? Does it affect SEO?

    On-Page Optimization | | patmatt
    0

  • A page on our WordPress-powered website has had an error message thrown up in GSC saying it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. The page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using the Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC, but I'm concerned about why Google thinks this page was asked to be noindexed (a quick way to double-check the live directives is sketched below). Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?

    Technical SEO | | d.bird
    0
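
    A minimal sketch of one way to run that check (not part of the original post; it assumes Python with the requests library and uses the URL quoted above): print both the X-Robots-Tag HTTP header and any robots/googlebot meta tags, since a stray noindex sent as a header is easy to miss when the HTML itself looks fine.

    ```python
    # Minimal sketch: surface every robots directive Google could be seeing on the live URL.
    import re
    import requests

    url = "https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/"
    resp = requests.get(url, timeout=30)

    # 1. Directives sent as an HTTP header (not visible in the page source).
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "not set"))

    # 2. Directives in the HTML itself (meta robots / googlebot tags).
    for tag in re.findall(r'<meta[^>]+name=["\'](?:robots|googlebot)["\'][^>]*>', resp.text, re.I):
        print("Meta tag:", tag)
    ```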

  • Hi there, We have a variation on the subdomain/sub-directory question... Our business has two monetising areas, a clinic and a shop. To market them, we do recipes, blogs and social media, rather than relying primarily on SEO, but we do want to up our SEO game.
    Our primary site is www.example.co.uk. This is WordPress, is where we market the clinic and host the recipes and blogs, and is our main email domain. Our second site is WooCommerce, at www.example.shop. Our shop market is primarily in the UK, but we seem to pick up a fair amount of international business, partly because the clinic does virtual consultations to many countries. The shop is online only. We have physical clinics across the UK. Both sites cross-link extensively, e.g. with blogs advertising products in the shop.
    The branding is intentionally related yet different, because they have very distinct functions, and e.g. I don't want to clutter the interface or distract people with the blog or clinic once we have funnelled them to the shop checkout. I would also like to separate the blog and recipe elements from the clinic, using a slightly different theme with different functions. We use a lot of plugins, and the more we aggregate functions on the same WordPress instance, the more likely something is to go wrong.
    I like the new TLDs because they are more "human", and they identify where you are and what you are doing more clearly. We do email footers with links to example.clinic (redirected to www.example.co.uk) and example.shop. They are simple and explain what is going on. Conversely, shop.example.co.uk is not so easy to write or read out, and www.example.co.uk/shop looks like an afterthought rather than a shop in its own right with its own home page. So there would have to be a really good SEO reason for me to merge the shop into the main site with a reverse proxy or multisite. Do you think that there is such a good reason? If not, by the same token, would it make sense to separate out example.blog or even example.recipes from example.clinic and use .co.uk as a single-page portal to the three separate sites?
    My instinct, for what it is worth, is that Google is smart enough to have started thinking that domains linked by topic TLDs can be equivalent to subdomains, and to recognise that we are not trying to build links from spammy unrelated sites.
    My last area is about human behaviour... Are people as happy to click on or type in a new TLD like .clinic as a local .co.uk one? ...when (a) it is not a discredited TLD like .biz, and (b) it gives them more insight into what they will get when they arrive. And since we have the .uk domain, should we switch to this shorter version at the same time? I already use it for custom shortcodes (e.g. example.uk/fte6 for people to type in from printed material or Instagram). I can't help feeling .uk has been unsuccessful, and its use now looks bad, even if it is shorter. Many thanks in advance.

    Local SEO | | MizRabble
    0

  • Hi all, as a newbie getting my holiday home rental site up & running, I just cannot find a clear answer to this after many hours of research. Moz & everyone else advise I need to optimise by ensuring my keywords are in my H1s, and that H1s need to be on every page, more than 20 characters, but still unique, relevant, a statement of the content, appealing to users & not keyword stuffed.
    How can I include my keywords ("holiday home tasmania" & "tasmania holiday rentals") in the heading on EVERY page & still make every heading unique, relevant & not keyword stuffed? I only have 10 pages - Home / The Space / Amenities / Location / About / Guide / House Rules / Reviews / Contact - & they by nature need to be information based, not designed like a more creative blog (which I will add later). E.g. my Amenities page, which is a quick-reference list so people can easily see inclusions & find if we have features they need/want.
    It seems really awkward & not in keeping with the chic, designer image I am trying to project to have "Amenities At Your Holiday Home Tasmania", "Three Beaches Tasmania Holiday Rental Location", "About This Beach Holiday Home Tasmania", "Your Guide To The Best Of Your Holiday Accommodation's Local Area", "House Rules Of Your Chic Holiday Home Tasmania", "It's Easy To Contact Your Next Tasmania Holiday Rental".
    Much of the information out there, including Moz's, seems to be oriented towards blogs where there is a lot more creative freedom for an expressive H1, rather than a service business in a competitive space where people need to access facts & features quickly in order to make a buying decision & are very quickly going to notice & be irritated by the use of similar-sounding phrases in every headline AND sprinkled throughout the page content. Many thanks, Cherie - Australia

    On-Page Optimization | | Luminatrix
    1

  • Hello, Thank you to anyone who takes the time to share their thoughts on this. I will preface this by saying that I am very new to the community and have lots to learn, so please forgive any obvious errors on my part. That having been said, I am very happy to receive constructive criticism and feedback 🙂
    Quick background: We are a high-end mobile wellness business based in Toronto, Canada, offering in-home/office services including yoga, pilates, nutrition, meditation, chiropractors, etc... As we are expanding, we are transitioning from new leads coming from business partners and word of mouth to driving new business online. As such we have a new Squarespace site (which is the first site I ever built, so any feedback is welcome) and are venturing into social media, SEO, local citations etc... for the first time. We have a significant content catalogue, originally for client and instructor education, that we are now repurposing for this new digital adventure but have not yet deployed. While currently focused on Toronto, we have plans to expand to several other countries in the next two years. As the site is quite new and we have little content or incoming links, I was thinking now is the time to switch to .com from .ca before we roll out. Website: www.anahana.ca
    Risk/reward & other issues? Both domains are currently verified with Squarespace, and it seems easy enough to switch. What could blow up by making this switch which I might not be aware of? Our emails and business cards use the .ca, but I don't think this would matter too much 6-12 months out... is there something else I might be missing on this?
    .com with subfolders or subdomains, as opposed to country-specific TLDs? This is something I am still working on understanding, but from what I have learned thus far, if we are going to progressively roll out a large content library, is it not better from an SEO standpoint to have this all on one domain? Also, local SEO and legal considerations for TLDs when operating local service area businesses.
    I am sure there are many other angles here that I am missing and am not really looking for any hard answer on much of this, but any general advice, suggested resources, and experienced insights would be extremely helpful. Thanks so much, cj

    Local SEO | | CJ777
    0

  • If a company is Safe-Tec but the domain name is Safetechelmets (no dash), the Twitter account is Safe-Tec Smart and the Facebook is Safe-Tech Safety Gear, how damaging is this for SEO, and is there a way to prevent the damage without changing the Twitter account, Facebook account or domain name? Thank you so much in advance.

    Branding | | BirdIsTheWord
    2

  • Hi All,
    We have two websites:
    example.info - a working site in Russian (hreflang="ru")
    example.com - a new site. We want to start with the US. For the US we will have a local address and phone, currency in $, and fully translated content.
    In the future we want to expand the business (e.g. en-GB, en-CA, de-DE, fr-CA, fr-FR). For each country, the regional dialect, currency, address and telephone number will be used. I need to choose the right URL structure so that there won't be problems in the future (a sketch of the resulting hreflang annotations is shown below).
    1. When configuring geotargeting (e.g. fr-CA and en-CA), which URL structure should I use?
    • http://example.com/ca/ - hreflang="en-CA" - can use Search Console geotargeting
    • http://example.com/ca/fr/ - hreflang="fr-CA"
    or
    • http://example.com/en-ca/ - hreflang="en-CA" - can I still use Search Console geotargeting?
    • http://example.com/fr-ca/ - hreflang="fr-CA"
    or
    • http://example.com/ca-en/ - hreflang="en-CA" - can I still use Search Console geotargeting?
    • http://example.com/ca-fr/ - hreflang="fr-CA"
    Quote: "To geotarget your site on Google: Page or site level: use locale-specific URLs for your site or page."
    2. If I target "en-CA", "fr-CA" and "fr-FR", can I use the page http://example.com/fr/ (configured as hreflang="fr-FR") as the catchall for French speakers worldwide (hreflang="fr")? Quote: "If you have several alternate URLs targeted at users with the same language but in different locales, it's a good idea also to provide a catchall URL for geographically unspecified users of that language. For example, you may have specific URLs for English speakers in Ireland (en-ie), Canada (en-ca), and Australia (en-au), but should also provide a generic English (en) page for searchers in, say, the US, UK, and all other English-speaking locations. It can be one of the specific pages, if you choose."
    3. Where is it better to place the language and country selector on the page: header, footer, pop-up window...?
    The page http://example.com will be used for hreflang="en". In my case, do I need x-default? Can I use the page configured with hreflang="en" as the x-default version as well? Is that right?

    Local Website Optimization | | SergeyFufaev
    0
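
    As referenced above, a minimal sketch of the hreflang annotations that would result (the locale-to-URL mapping is illustrative only, not a recommendation of one directory scheme over another): every page in the cluster outputs the full set of alternates, including catchall language pages, and the x-default may legitimately reuse the generic page.

    ```python
    # Minimal sketch: generate the hreflang <link> tags for one cluster of alternates.
    alternates = {
        "en-ca": "http://example.com/en-ca/",
        "fr-ca": "http://example.com/fr-ca/",
        "fr-fr": "http://example.com/fr-fr/",
        "fr":    "http://example.com/fr/",   # catchall for French speakers everywhere else
        "en":    "http://example.com/",      # catchall for English speakers everywhere else
        "x-default": "http://example.com/",  # x-default can reuse the generic page
    }

    # Every URL in the set should carry the same complete block of annotations.
    for lang, href in alternates.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{href}" />')
    ```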

  • I'm working with a content team on a job search guide for 2019. We already have a job search guide for 2018. Should we just edit the content of the 2018 guide to make it current for 2019 (which means the 2018 guide would not exist anymore), or should we keep the 2018 guide and create a new web page for the 2019 guide so that both exist? We currently rank very well for the 2018 job search guide.

    Content Development | | Olivia954
    1

  • Hi Everyone, We just did a site migration (URL structure change, site redesign, CMS change). During migration, the dev team messed up badly on a few things, including SEO. The old site had pages canonicalized and self-canonicalized; the new site doesn't have anything (a CMS dev error), so we are working retroactively to add a canonicalization mechanism. The legacy site had URLs ending with a trailing slash "/"; on the new site these got redirected to a set of URLs without "/". New site actions: all robots are allowed, and a new sitemap has been submitted to Google Search Console.
    So here is my problem (it's been a long 24-hour night for me 🙂 ):
    1. When I look at the GSC homepage URL, it says the old page is self-canonicalized and currently in the index (the old page with a trailing slash at the end of the URL).
    2. When I try to perform a live URL test, I get the message "No: 'noindex' detected in 'robots' meta tag", so indexation can't be done. I have no idea where the noindex is coming from (a quick way to trace the redirect and robots signals is sketched below).
    3. Robots.txt in Search Console is still showing the old file (no noindex there). I tried to submit the new file but the old one still comes up. When I click on "See live robots.txt" I get the current robots.
    4. I see that the old page is still canonicalized, and attempting to index the redirected old page might be confusing Google.
    Hope someone can help to get the new page indexed! I really need it 🙂 Please ping me if you need more clarification. Thank you!

    Intermediate & Advanced SEO | | bgvsiteadmin
    1
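
    A minimal sketch of the trace mentioned above (the URL is a placeholder; it assumes Python with requests): follow the redirect from the legacy trailing-slash URL and report the robots and canonical signals on the final destination, since a stray noindex can arrive via an HTTP header as well as a meta tag.

    ```python
    # Minimal sketch: follow the trailing-slash redirect and print robots/canonical signals.
    import re
    import requests

    old_url = "https://www.example.com/some-page/"   # placeholder: legacy URL with trailing slash
    resp = requests.get(old_url, timeout=30, allow_redirects=True)

    # Print each hop in the redirect chain, then the final response.
    for hop in resp.history:
        print(f"{hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
    print(f"Final: {resp.status_code} {resp.url}")

    # Robots signals can live in a header or in the HTML head.
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
    print("Meta robots:", re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I))
    print("Canonical:", re.findall(r'<link[^>]+rel=["\']canonical["\'][^>]*>', resp.text, re.I))
    ```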

  • Hi, A follow-up question from another one I had a couple of months ago: it has been almost 2 months now since my hreflangs went in place. Google recognises them well and GSC is clean (no hreflang errors). Though I've seen some positive changes, I'm quite far from sorting that duplicate content issue completely, and some entire sub-folders remain hidden from the SERP.
    I believe it happens for two reasons: 1. Fully mirrored content - as per the link to my previous question above, some parts of the site I'm working on are 100% similar. Quite a "gravity issue" here, as there is nothing I can do to fix the site architecture nor to get bespoke content in place. 2. Sub-folder "authority". I'm guessing that Google prefers some sub-folders over others due to their legacy traffic/history, meaning that even with hreflangs in place, the older sub-folder ranks over the right one because Google believes it provides better results to its users. Two questions from these reasons:
    1. Is the latter correct? Am I guessing correctly re "sub-folder" authority (if such a thing exists), or am I simply wrong? 2. Can I solve this using canonical tags (see the sketch below on how canonicals and hreflang interact)?
    Instead of trying to fix and "promote" hidden sub-folders, I'm thinking of actually reinforcing the results I'm getting from stronger sub-folders.
    I.e.: if a user based in Belgium is Googling something relating to my site, the site.com/fr/ subfolder shows up instead of the site.com/be/fr/ sub-sub-folder.
    Or if someone is based in Belgium using Dutch, they would get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder. Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for that second one. I'd prefer traffic coming to the right part of the site for tracking and analytics reasons. However, instead of trying to move mountains by changing Google's behaviour (if ever I could do this?), I'm thinking of encouraging the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what). That second question is the main reason why I'm looking out for Moz's community advice: am I going to damage the site badly by using canonical tags that way? Thank you so much!
    G

    Intermediate & Advanced SEO | | GhillC
    0
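
    On the canonical question above, a minimal sketch of the conventional setup for those two French URLs (site.com is a placeholder): Google's guidance is that hreflang alternates should each remain self-canonical, so canonicalising /be/fr/ to /fr/ would effectively ask for /be/fr/ to be dropped from the index rather than geotargeted.

    ```python
    # Minimal sketch: each locale page stays self-canonical and the hreflang pair points both ways.
    pages = {
        "https://site.com/fr/":    "fr",     # French speakers everywhere (catchall)
        "https://site.com/be/fr/": "fr-be",  # French speakers in Belgium
    }

    for url in pages:
        print(f"<!-- head of {url} -->")
        print(f'<link rel="canonical" href="{url}" />')  # self-referencing canonical
        for alt_url, lang in pages.items():
            print(f'<link rel="alternate" hreflang="{lang}" href="{alt_url}" />')
        print()
    ```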

  • Hi all! Currently we are implementing the hreflang tag. I'm not really sure how to solve this: We sell our products in the Netherlands and Belgium. For the Netherlands we have 1 category page for pebbles (stones), which contains both rounded and non-rounded pebbles. In the Netherlands there is not really a difference between them (people search for pebbles and that's it). The URL: https://www.website.com/nl/pebbles. In Belgium there is a difference (people specifically search for rounded/non-rounded pebbles). Therefore, in Belgium we have 2 pages (we don't have an overall page): https://www.website.com/be/pebbles-rounded.
    https://www.website.com/be/pebbles-non-rounded. My question now is, what to do with the hreflang tags on these pages? Thanks in advance! Best, Remco

    Technical SEO | | AMAGARD
    0

  • I want to understand what Page Authority and Domain Authority are, and how we can improve them for SEO. Suggestions highly appreciated.

    Getting Started | | SathishFirecompass
    0

  • I am working with a client who owns a driving school. There are 32 franchises in the same region of the state. These are independently owned and run. Each franchise has its own service area, phone number, offering, and contact information. The tricky thing is, they teach the classes either online or at local high schools, so it's service-based and there is no physical address to publicly display. They are trying to straighten out their Google Listings. They have 15 or so GMBs already either claimed or unclaimed. Some have the incorrect street addresses, some have service areas, etc. I am planning to manually create or clean up each one, marking them as serving the city they are in. I'm trying to put together an estimate on how much time it will take, and I'm wondering if there is any way to do this from a central ownership standpoint, instead of tackling each one independently. The same user will be the owner of each listing... Is there any way to speed up the process of claiming & optimizing (or any recommended approach in general)?

    Local Listings | | triveraseo
    1

  • Hope everyone is doing well! Long story short, we have been getting a lot of customer testimonials. We started putting them on Facebook and then decided to start adding more details in the articles section of the site. The problem is many of our customers do not go to the articles section or the FB page. We purchased a plugin to integrate the most recent Facebook posts into our website. We were going to use it on our homepage as well as our top category pages. I always run content changes through the Moz scoring utility. After integration into the home page it dropped my score from a 99 to a 91. The first issue is consistent... too many links... for the second issue we got Rand's "WTH are you doing" face with a warning that we are stuffing keywords... Not our intent, but the FB posts do have keywords in them. So for now we are testing the integration on just our garage floor tile category page, but I am wondering the following:
    1. In general, does this type of utility improve UX?
    2. Does Google make some concession for 'keyword stuffing' when it is because of social media posts being displayed on the page?
    3. Given that we post daily, does the 'fresh content' outweigh the increased keyword density?
    We are ranking top 10 for several major national keywords and we don't want to screw that up. But we want to continue to improve our layout and UX.

    Social Media | | GFLLCCO
    2

  • I am trying to figure out how much traffic goes to a particular page on a website. How would I go about finding this?

    Moz Bar | | jbcorcoran
    6

  • Hi All, I have a branded product that's always ranked 1st. It's a popular product which attracts the majority of our website's traffic. Now it's suddenly dropped from 1st to 20th. Can anyone advise me why this has happened? I've made no radical changes in the last month. Keyword: Wattbike Atom. URL: https://wattbike.com/gb/product/atom

    On-Page Optimization | | WattbikeSEO
    0

  • Hi Guys, I've got a site I am working with on the Wix platform. However, site audit tools such as Screaming Frog, Ryte and even Moz's on-page crawler show the pages as having no content, despite them having 200+ words. Fetching the site as Google clearly shows the rendered page with content; however, when I look at the Google cached pages, they also show just blank pages. I have had issues with nofollow/noindex on here, but it shows the meta tags correctly, just 0 content. What would you look at to diagnose this? I am guessing some rogue JS, but why wasn't this picked up on the "Fetch as Google"? (A quick raw-HTML check is sketched below.)

    Technical SEO | | nezona
    0
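
    A minimal sketch of the raw-HTML check mentioned above (URL and phrase are placeholders; assumes Python with requests): if a sentence you can see in the browser is missing from the unrendered source, the copy is being injected by JavaScript, and any crawler that does not execute JS will report an empty page.

    ```python
    # Minimal sketch: compare the unrendered HTML against copy visible in the browser.
    import requests

    url = "https://www.example.com/some-wix-page/"          # placeholder page to audit
    phrase = "a sentence copied from the rendered page"      # placeholder visible copy

    raw_html = requests.get(url, timeout=30).text
    print("Raw HTML length:", len(raw_html))
    print("Phrase present in raw HTML:", phrase.lower() in raw_html.lower())
    ```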

  • I am redesigning my website with an updated theme.
    One of the theme plugins takes my URL: https://example-site.com/1st-level-page/2nd-level-page/page-content and changes it to https%3A%2F%2Fexample-site.com%21st-level-page%22nd-level-page/page-content%2F Does this affect how a web crawler sees my internal links (see the decoding sketch below)?
    Can this have a negative effect on my SEO?
    Should I edit the plugin to force it to render the URL exactly as written?

    Link Building | | patmatt
    0
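
    A minimal sketch of the decoding check referenced above (assumes Python; the encoded string is copied from the post and the intended URL is the example given): percent-decode what the plugin emits and compare it with the URL you meant to link. If they don't match, crawlers are left guessing at the destination.

    ```python
    # Minimal sketch: decode the plugin's output and compare it with the intended URL.
    from urllib.parse import unquote

    intended = "https://example-site.com/1st-level-page/2nd-level-page/page-content"
    emitted = "https%3A%2F%2Fexample-site.com%21st-level-page%22nd-level-page/page-content%2F"

    decoded = unquote(emitted)
    print("Decoded href:", decoded)
    print("Matches intended URL:", decoded == intended)
    ```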

  • Hello! I'm having an issue with my website Rooms Index; the website is in Hebrew so I'll provide examples in English for better understanding. When I search "Rooms by Hour in Haifa", Google doesn't show the intended category page, which is this; instead it shows my homepage in the results. This happens only for certain areas, while other areas such as Tel Aviv are working well. For example, if I searched "day use in Las Vegas" it'd show me the Las Vegas page dayuse.com/las-vegas, but searching for Brooklyn I'd only see dayuse.com. The pages are indexed and I can find them if I search site:roomsindex.co.il. What could cause such a problem?

    Local Website Optimization | | AviramAdar
    0

  • I analysed my birthday page (https://www.giftalove.com/birthday) with Compare Link Profiles and found a total of 47,234 internal links. How did my internal links suddenly increase? Please provide details about my internal links.

    Technical SEO | | Packersmove
    0

  • Hi All, I know that the sitemaps.xml URL must be findable but what about the sitemaps/pageinstructions.xml URL? Can we safely noindex the sitemaps/pageinstructions.xml URL? Thanks! Yael

    Intermediate & Advanced SEO | | yaelslater
    0

  • On a category page the products are listed via/in connection with the search function on the site. Page source and front-end match as they should. However, when viewing a browser-rendered version of a Google cached page, the URL for the product has changed from, as an example, https://www.example.com/products/some-product to https://www.example.com/search/products/some-product The source is a relative URL in the correct format, so the /search/ is added at browser rendering. The developer insists that this is OK, as the query string in the Google cache page result URL is triggering the behaviour and confusing the search function - all locally. I can see this, but I just wanted feedback: will Google internally only ever see the true source, or could its internal rendering mechanism possibly trigger similar behaviour? (A short sketch of how relative URLs resolve is below.)

    Intermediate & Advanced SEO | | MickEdwards
    1
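
    A minimal sketch of how relative URLs resolve, as mentioned above (example.com mirrors the URLs quoted in the question): an href without a leading slash resolves against the path the document is served from, which reproduces the /search/ prefix seen on the cached copy. Whether Google's own renderer ever processes the page from such a path is the point to confirm.

    ```python
    # Minimal sketch: the same relative href resolves differently depending on the serving URL.
    from urllib.parse import urljoin

    href = "products/some-product"  # relative link as it appears in the source

    print(urljoin("https://www.example.com/", href))
    print(urljoin("https://www.example.com/search/", href))
    # https://www.example.com/products/some-product
    # https://www.example.com/search/products/some-product
    ```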

  • Hi guys, I own a London removal company - Mega Removals - and want to achieve 1st-page rankings on Google UK for keywords like "removals London", "removal company London" and "house removals London", but have had no success so far. I need professional advice on how to do it. Should I hire an SEO, or should I focus on content? I will be very grateful for your help.

    Local Website Optimization | | nanton
    1

  • Since September, 13 sites have linked to our real estate website. The Moz domain authority of these sites ranges from 45-83; Moz page authority ranges from 25-36. Six of these links were created in the last 3 weeks. The remainder have been live since late September/October. Our domain authority has only increased from 18 to 19 since September. Our site has only 805 linking domains. Why has the increase in domain authority been so little? I have read that domain authority is algorithmic and that improving up to 25 is not too difficult. A few notes about the incoming links:
    - 11 of the 13 do not rank for any keywords, one ranks for 20 keywords and one ranks for 1 keyword.
    - One has 12 inbound links, three have 1 inbound link each and the remaining 11 have no inbound links.
    - One has 4 linking domains, three have one linking domain each and the remaining 11 have no linking domains.
    - 7 of the 13 are listed in Google Search Console. The remainder are not.
    - Only 3 of the 13 are shown as Moz linking domains to our website.
    - Website traffic has increased somewhat since link building began.
    - There was initially some improvement in ranking, but that has dropped off.
    From the metrics above, does anything sound deficient about these links? Is the fact that they are not being picked up by Moz a negative sign, or is it just an indication that Moz's link index is rather incomplete? How about the fact that the links are not indicated in Google Search Console? Any insight would be greatly appreciated! Thanks,
    Alan

    Moz Pro | | Kingalan1
    0

  • We currently only have one option for implementing our schema. It is populated in JSON-LD, which is rendered by JavaScript on the CLIENT side. I've heard tons of mixed reviews about whether this will work or not. So, does anyone know for sure if this will or will not work? Also, how can I build a test to see if it does or does not work? (A starting point for such a test is sketched below.)

    Intermediate & Advanced SEO | | MJTrevens
    0
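
    A starting point for the test mentioned above, as a minimal sketch (the URL is a placeholder; assumes Python with requests): if no application/ld+json block appears in the raw HTML, the structured data only exists after JavaScript runs, and the question becomes whether Google's rendering-based tools (Rich Results Test, URL Inspection) report it.

    ```python
    # Minimal sketch: count JSON-LD blocks in the unrendered HTML.
    import re
    import requests

    url = "https://www.example.com/some-page/"  # placeholder page carrying the schema
    raw_html = requests.get(url, timeout=30).text

    blocks = re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        raw_html, re.S | re.I,
    )
    print(f"JSON-LD blocks in raw (unrendered) HTML: {len(blocks)}")
    ```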

  • Looking at one of our competitors in Moz today, I noticed that their linking domains report showed a bit.ly link with a DA of 97. According to Moz this is a followed link as well. Should I be using bit.ly links to take advantage of their high domain authority? I was under the impression that bit.ly links were not considered backlinks.

    Intermediate & Advanced SEO | | misterfla
    0

  • I want to know about link building strategy: does the number of times someone clicks a link to your site from an external site affect your domain authority?

    Link Building | | SimplifyValetStorage
    0

  • I have been in SEO for years but have never come across this problem. I have a client whose web page was hacked, and hundreds of links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs. See picture. What is the best way to remove them? Use the Google disavow tool, or just redirect them to some page? The web page is new, but it ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.

    White Hat / Black Hat SEO | | AndrisZigurs
    0

  • Any ideas how to increase the Yandex Site Quality Index via onpage changes?

    International SEO | | lcourse
    1

  • Hi, I tried to add an Analytics script. Google Tag Assistant recognizes the script that was added properly, but I don't receive any data in Analytics. I have also tried to implement the Analytics script with the MonsterInsights plugin; the code is well recognized by Google Tag Assistant, but I still don't receive any data in Analytics. What is going wrong here? (A quick check of the tracking ID actually shipped on the page is sketched below.) Website: https://www.dakwerken-vandriessche.be Thanks for your advice!

    Reporting & Analytics | | conversal
    1

  • Hey Folks, So I have a 1,000-word article talking about, say, Dubai holidays. Is it okay to have 4-5 instances of "Dubai Holiday" as anchor text linked to the same page, or should it only be used once?

    Intermediate & Advanced SEO | | SAISEO
    0

  • Has anyone noticed that the Domain Authority (DA) as reported in Moz Pro has changed only within the last 1-2 months? We have screenshots of DA-vs-competitor line graphs: the plot from 2 months ago starts in Nov 2017, while the one from today starts in Jan 2018, and comparing the two shows DA values up to 50% different!
    The change is seen both in the Links Overview and under the Spam Score sections still marked "NEW". Can Moz confirm that the NEW DA numbers in Moz Pro have only been retroactively updated within the last 2 months, even though the new Link Explorer has been publicly out since Apr 30 (https://moz.com/community/q/moz-s-new-link-explorer-including-our-revamped-index-and-da-pa-scores-is-now-open-to-everyone)? Look at the top green line starting ~12 months ago on both graphs, with the old one below 40 and the new one above 50. We've seen even greater differences for other tracked domains. Thanks!

    Link Explorer | | Amplitude_Digital
    0

  • Hi, We have a current category that is starting to rank OK, but we are going through a site rebuild and this category URL will now better describe a new category of products. My dilemma is that if I 301 redirect the current URL to my new category, I won't be able to use the URL for the new one. But if I don't redirect it, will the pages that have already ranked under this URL then confuse customers and search engines? For example - products and sub-categories under the URL /personalised-toys will now become /personalised-toys-for-boys, but I want to use the /personalised-toys URL for a different set of sub-categories and products. Any assistance, ideas, or even a "definitely don't do it that particular way" would be greatly appreciated.

    Intermediate & Advanced SEO | | neil_stickergizmo
    0

  • Hi there, I recently realised that the citations and directories I was building used the same content as my website. I know this is not best practice. I will make sure it doesn't happen in the future, but I am afraid of the ones I built in the past. How much do you think this would affect my rankings, and do you think it is a priority to go through my citations and directories to modify them?

    Link Building | | H.M.N.
    0

  • Hi, I have an SEO question that I just can't seem to get an answer for. Right now I have an international ecommerce shop selling to several European countries.
    The setup is like this:
    www.domain.com -> 301 -> www.domain.com/nl (Dutch)
    www.domain.com/uk (English)
    www.domain.com/fr (French)
    www.domain.com/de (German) I also have the domain names for the countries I sell to:
    www.domain.nl / www.domain.fr / www.domain.co.uk / www.domain.de It's quite easy to switch to the latter because I already own the domains and can technically change this within an hour in our Magento 2 multistore. I think the customers would trust their local TLD more than the .com, and my question is: is it also better for SEO?
    Or will the splitting of the backlinks hurt me? Thanks for your insights.

    Search Behavior | | Internet360
    0

  • I'm not sure what logic Moz is using for its reporting of Site Crawl issues, but it appears to be pretty flawed (unless I'm missing something, which is possible). I've got a client site that has been in Moz for about 6 months now. Every time the crawler runs, the same number of pages are reported as having been crawled. However, I'm consistently getting "New Issues" reported that should have been reported during previous crawls. Example: a redirect chain was reported several months ago. The referring URL was the homepage of the website, and we tracked it down to an old link in the header. This was fixed, marked as resolved, and the issue was not shown on the next crawl. Several weeks later, the same issue was reported for a different page on the website - a page which has existed since 2014 and was already crawled many times. Again, we fixed it. Fast-forward to the report that just ran on 12/1 and we have the same issue reported, for a different page, which has also existed for years and has been previously crawled. It's very hard to explain to a client "this item you are seeing has been resolved", only to have it continually crop back up in future reports. Note this is not limited to redirect chains - that's just an example. I'm seeing this for other items such as missing canonicals, duplicate titles, etc.

    Moz Bar | | RucksackDigital
    0

  • Hi, how do you find WHOIS-type ownership details etc. for a .co domain? Are they on WHOIS or on a different system? (See the sketch below.) All the best, Dan

    Technical SEO | | Dan-Lawrence
    0

  • Hi, I have a question concerning multilingual SEO for webshops. This is the case: the root domain is example.be, which has several subdomains. One of these subdomains is shop.example.be, which is used for two webshops (Dutch and French), being shop.example.be/nl and shop.example.be/fr.
    The other root domain is example.lu (for Luxembourg), which is only used for the subdomain for the Luxembourg webshop in French, being shop.example.lu/fr.
    The content on the .lu/fr webshop is a small part of the content on the .be/fr webshop, and the product descriptions are the same and are both of course in French. The webshops will be redesigned and restructured, and the question is what to do with the .lu/fr webshop. There are two possibilities: 1. Integrate the Luxembourg webshop into the existing .be webshop, since most of the products are the same and the .lu webshop doesn't get a lot of visitors because Luxembourg is a small country. The only thing to do then would be setting up a 301 from the .lu webshop to the .be/fr version to transfer link value.
    People in Luxembourg already sometimes get pages from the .be/fr webshop in the SERP anyway, because these already have a bigger authority than the .lu/fr pages. 2. Keep the .lu/fr webshop and use hreflang tags so the correct pages with similar content are shown in the correct country. I know that when using different TLDs this normally isn't an issue anyway, so implementing hreflang tags isn't really necessary. Please feel free to share your thoughts about what would be the best approach. Thanks!

    International SEO | | Mat_C
    1

  • Hello, I have a large number of product pages on my site that are relatively short-lived: probably in the region of a million+ pages that are created and then removed within a 24 hour period. Previously these pages were being indexed by Google and did receive landings, but in recent times I've been applying a NoIndex tag to them. I've been doing that as a way of managing our crawl budget but also because the 410 pages that we serve when one of these product pages is gone are quite weak and deliver a relatively poor user experience. We're working to address the quality of those 410 pages but my question is should I be no-indexing these product pages in the first place? Any thoughts or comments would be welcome. Thanks.

    Intermediate & Advanced SEO | | PhilipHGray
    0

  • Hi community members, I am looking after SEO at our company and there are lots of changes happening to our website, especially technical changes. It's hard for me to review every deployment to the website, such as a change of server location, etc. We generally agree that every change related to the website must be flagged to SEO so we can understand ranking fluctuations and how search engines respond to them. I just wonder which technical deployments I could confidently ignore to save time, giving the technical team a go-ahead without them having to wait for my approval. Thanks

    Web Design | | vtmoz
    1

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



