
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • We are in a bit of a tricky situation: a key top-level page with lots of external links has been selected as a duplicate by Google. We do not have any canonical tag in place. This is fine if Google passes the link juice to the page it has selected as canonical (an identical top-level page), but does anyone know whether that happens? For various reasons, we can't put a canonical tag in place ourselves at the moment. So my question is: does a Google-selected canonical work the same way and pass link juice like a user-selected canonical? Thanks!

    | Lewald1
    0
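
    For reference, a user-selected canonical is just a link element in the head of the duplicate page; a minimal sketch, with example.com standing in for the real pages:

      <!-- placed in the <head> of the duplicate top-level page -->
      <link rel="canonical" href="https://www.example.com/preferred-page/">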

  • disavow spam

    I received a huge amount of spammy links (most of them have a spam score of 100). My disavow file is currently around 85,000 lines, but I have at least 100,000 more domains that I should add. All the entries are domains; I don't have any individual backlink URLs in the file. My problem is that Google doesn't accept a disavow file of more than 2MB and shows this message: "File too big: Maximum file size is 100,000 lines and 2MB". What should I do now?

    | sforoughi
    0
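
    For context, the disavow file Google accepts is a plain-text list with one entry per line, where a domain: entry covers every URL on that host; a minimal sketch with made-up domains:

      # lines starting with # are ignored
      domain:spammy-network-1.example
      domain:spammy-network-2.example
      domain:spammy-network-3.example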

  • reviews ecommerce schema product page

    Hi, I'm looking to introduce historical customer reviews onto our product pages, and I'd like an opinion on an indexed product page jumping from 0 reviews to possibly 30+, and what problems, if any, could arise from this. For a bit of background, we've been collecting customer reviews/ratings since 2015 in our internal system; I'm only looking to use feedback from 2020 onwards. The current setup is that the product page displays the latest 30 reviews, and on the same page there is a link that takes the user to another page where they can read all the customer feedback. I'm using schema markup to ensure the reviews are, firstly, understood by Google and, secondly, displayed correctly. So back to my original question: will an indexed e-commerce product page that currently has no customer reviews be seen differently when, on the next crawl, it's found to have, say, 30+ reviews? Are there any implications either way? What's your experience? I look forward to reading your feedback.
    Thanks

    | Train4Academy.co.uk
    0
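
    A minimal sketch of review markup in JSON-LD with placeholder names and values, assuming the marked-up reviews are the same ones shown on the page:

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Course",
        "aggregateRating": {
          "@type": "AggregateRating",
          "ratingValue": "4.7",
          "reviewCount": "34"
        },
        "review": [{
          "@type": "Review",
          "author": { "@type": "Person", "name": "A. Customer" },
          "reviewRating": { "@type": "Rating", "ratingValue": "5" },
          "reviewBody": "Great course, clear materials."
        }]
      }
      </script>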

  • Recently my company started consulting for a SaaS company. They're clearly the best-known, most trusted company in their area of work: they have the strongest brand, the best product and therefore more users than any competitor by a big margin. Still, 99% of their traffic comes from branded searches, despite the site having 3x more referring domains, better performance scores and more content than competitors. Even comparing user satisfaction metrics in tools such as SimilarWeb, they seem to have lower bounce rates and more pages per visit. Yet they rank for almost nothing non-branded on Google (they rank extremely well for almost everything on Bing and DuckDuckGo). They don't have any obvious issues with crawling or indexation; we've gone to great depths to rule out anything that could be affecting this. My conclusion is that it's either a penalty or a bug, but GSC is not flagging any manual actions. These are the things we've identified:
    All the content was moved from domain1.com to domain2.com at the end of 2017. 301s were put in place and the migration was confirmed in GSC. Everything was done with great care and we couldn't identify any issues with it.
    Some subdomains of the site, especially support, rank extremely well for all sorts of keywords, even very competitive ones, but the www subdomain ranks for almost nothing on Google. The www subdomain has thousands of domains pointing to it while support has only a few hundred.
    Google is performing delayed rendering attempts on old pages, JS and CSS, particularly versions of assets that were live before the 2017 migration, including the old homepage. Again, the redirects have been in place for 3 years.
    Search Console frequently shows old HTML (at least a year old) in the cache despite a recent crawl date and a current 301.
    Search Console frequently processes old HTML (at least a year old) when reporting on schema.
    Search Console sometimes selects pages from the old domain as the canonical for a URL that exists on the current domain, despite a long-standing 301 and canonicals that have been correctly configured for 3 years now.
    Has anyone experienced anything similar in the past? We've analyzed old SEO practices, the link profile, disavows... nothing points to black-hat practices, and at this point we're wondering if it's just Google doing a terrible job with this particular domain.

    | oline123
    0

  • We have two variations of our domain: www and non-www.
    Both can be seen by users and both have been linked to in press releases, but only the www version has data in Google Search Console.
    In this case, what is the best practice for us?

    | aaaannieee
    0
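
    A common approach is to pick one host (here the www version, since that is the one with Search Console data) and permanently redirect the other to it; a sketch assuming an Apache server, with example.com as a placeholder:

      # .htaccess sketch: send non-www requests to the www host
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
      RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]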

  • Hi all. Should you use rel sponsored on internal links? Here is the scenario: a company accepts money from one of their partners to place a prominent link on their home page. That link goes to an internal page on the company's website that contains information about that partner's service. If this was an external link that the partner was paying for, then you would obviously use rel="sponsored" but since this is a link that goes from awebsite.com to awebsite.com/some-page/, it seems odd to qualify that link in this way. Does this change if the link contains a "sponsored" label in the text (not in the rel qualifier)? Does this change if this link looks more like an ad (i.e. a banner image) vs. regular text (i.e. a link in a paragraph)? Thanks for any and all guidance or examples you can share!

    | Matthew_Edgar
    0
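
    For illustration, the attribute is written the same way on an internal link as on an external one; a sketch using the placeholder site from the question:

      <!-- paid placement on the home page pointing at an internal partner page -->
      <a href="https://awebsite.com/some-page/" rel="sponsored">Partner service</a>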

  • The crawl requests for my company site, https://www.dhgate.com/, have dropped by nearly 95%, from about 6,463,599 requests per day to 476,493, starting at 12:00 am on 9 Oct (GMT+8). This dramatic drop shows up not only in our GSC crawl stats report but also in our company's own log reports. We have no idea what's going on. We want to know whether there has been a Google update related to crawling, or whether this is an issue with our own site. If something is wrong with our site, in what aspects would you recommend we check, analyze and optimize accordingly?

    | DHgate_2014
    0

  • Hi All, I have a client who operates in multiple countries with a subdirectory structure. In AU, for their main brand name, the .com site still ranks in first position, but /au ranks for most of the other terms. Currently we have a 301 redirect in place on the .com home page that sends anyone accessing the site from AU to /au. This is only for the home page, as other .com pages don't rank in Australia. Just wondering what implications this can have for our SEO campaign. Cheers
    Thank you for your expertise and insights in advance.

    | SSP21
    1

  • I use Yoast as the SEO plugin for my new WordPress website, https://www.satisfiedshoes.com/. However, I couldn't get the sitemap with Yoast as it was giving me a 404 error, and regardless of what I tried, it wasn't working. So I then installed All in One SEO while still having Yoast installed, easily generated the AIO sitemaps, and submitted them successfully to Google Search Console. My question is: now that Google has the sitemaps, since I'd rather use Yoast, if I delete AIO will the sitemaps submitted to Google become invalid? There is no point keeping both SEO plugins active, right? Thank you

    | iamzain16
    0

  • Hey guys, recently (approx. 1 month ago) we did a migration from the .co.uk version of our site to .com/en. We've been doing a migration every few months to get everything under our .com. Previous migrations haven't had any problems at all, and hreflang tags were detected correctly. For this new UK migration (done 1 month ago), Google is saying that it doesn't detect any hreflang tags. We place our hreflang tags in our sitemap and so far we haven't had any problems with it. Here's the sitemap: https://camaloon.com/en/web-sitemap.xml Any thoughts on what could be happening? I really appreciate your input and help 🙌

    | mooj
    0
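
    For reference, sitemap-based hreflang needs the xhtml namespace declared on the urlset, each URL to list itself plus its alternates, and the alternates to link back to each other; a minimal sketch in which the /es/ URL is only a made-up example:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:xhtml="http://www.w3.org/1999/xhtml">
        <url>
          <loc>https://camaloon.com/en/</loc>
          <xhtml:link rel="alternate" hreflang="en" href="https://camaloon.com/en/"/>
          <xhtml:link rel="alternate" hreflang="es" href="https://camaloon.com/es/"/>
        </url>
      </urlset>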

  • We are not sure whether page speed matters for Google rankings. I have been working on the keyword "flower delivery in Bangalore" for the last few months, and I have seen some websites on Google's first page that have low page speed but still rank, so I'm worried about my page, which also has low page speed. Will my Bangalore page rank on Google's first page if its speed is low? Please also suggest more tips on the ranking factors that really work in 2020. And one more thing: does domain authority really matter this year? I have also seen websites with low domain authority ranking on Google's first page. Home page: Flowerportal. Bangalore page: https://flowerportal.in/flower-delivery/bangalore/. Focus keywords: flower delivery in Bangalore, send flowers to Bangalore

    | vidi3423
    1

  • I have a problem and a question about it. Many important keywords and long-tail keywords are ranking with the home page URL.
    How can I enrich the home page content without harming those rankings, and should I try to get Google to rank more specific pages for those terms instead?

    | mazzamz
    0

  • I have a section (a number of web pages with content) on my site with display ads. The site is mainly for UK visitors. I want to show ads to UK visitors but not to US visitors. The rest of the content will be the same for the UK and the US, and there will be just one page with the same URL for both, so no hreflang tags are being used.
    Is there any correlation between display ads and SEO? Would not showing ads in the US cause any issues for bots, or do bots treat display ads and SEO as two completely different aspects? Asking because Googlebot crawls from the US.

    | Kohliharleen
    1

  • Hey all! Someone I work with recently redirected one of their site pages via Squarespace. They used Squarespace-provided code to create a 301 redirect. Following this, the primary keywords for the page that was redirected to have dropped pretty significantly. Any Squarespace pros out there who can help me figure out what's going on?

    | kelseyworsham
    0

  • Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation: Option 1: More backlinks including a lot of spammy links Option 2: Fewer backlinks but only reliable, non-spam links I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other. For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA. Thank you!

    | LianaLewis
    1

  • We're seeing a couple of temporary redirects. One for http pointing to https. Another for /checkout pointing to /checkout/cart. We don't have an internal dev, so I'm not sure how to remove these. Would anyone know? I've set up the 301s but they're not overriding, and I'm still seeing the issues in the crawl. Thanks in advance for your help!

    | LASClients
    0
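
    A sketch of what permanent versions of those two redirects can look like, assuming an Apache server with .htaccess access (an e-commerce platform may expose this through its own redirect settings instead):

      # force HTTPS with a 301 rather than a temporary redirect
      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

      # permanently redirect /checkout (anchored so /checkout/cart itself is not matched)
      RedirectMatch 301 ^/checkout/?$ /checkout/cart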

  • magento 302

    Hi all, I've been assigned a site in Magento. After the first crawl, we found almost 15k 302 redirects. A sample URL ends with this: /stores/store/switch/?SID=qdq9mf1u6afgodo1vtvk0ucdpb&___from_store=default&___store=german&uenc=aHR0cHM6Ly9qdWljeWZsdXRlcy5jb20vP19fX3N0b3JlPWdlcm1hbg%2C%2C These URLs currently 302 redirect to the homepage as well as other main pages, and it seems also to product pages. Some of them point to account pages where customers log in; it's probably best for me to de-index those, so no issues there. But I'm worried about the 302 redirects to public pages. The extension we have installed is SEO Suite Ultimate by MageWorx. Does anyone here have experience with this specifically, and how did you fix it? Thanks, JC

    | LASClients
    0
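
    One common mitigation, if the store-switcher URLs should not be crawled at all, is to exclude the path in robots.txt; this is only a sketch, since whether it is appropriate depends on how the store switcher and the MageWorx extension are configured:

      User-agent: *
      # keep crawlers out of the session-based store-switcher URLs
      Disallow: /stores/store/switch/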

  • This one has sort of been asked already, but I cannot find an answer. When we evaluate a new SEO client, with Majestic we would previously compare the root domain vs. the www subdomain to see which had the higher Trust Flow and Citation Flow, and if there was a major difference, point Google at indexing the higher-performing version. Is there a way to do this with Moz? Domain Authority and subdomain authority always return the same DA for me. Thanks in advance.

    | practiceedge1
    0

  • Hi. We recently created a Christmas category page on our eCommerce website (christowhome.co.uk). Earlier today, I Googled 'Christow Christmas Silhouette Lights' (Christow being the name of our website and Christmas silhouette lights being one of the sub-categories we recently created). I was curious to see how the page appeared in search. Bizarrely, the page appeared multiple times in the results (if you click on the link above, it should show you the search results). As you can see, multiple meta titles and descriptions have been created for the same page. This is affecting a number of our Christmas category pages, and I don't quite understand why it has happened. We recently added filters to the category; could the filters be responsible? Any idea how I can prevent this from happening and how I can stop Google indexing these weird replica pages? Many thanks, Dave

    | Davden
    0

  • crawl errors 4xx error

    I have a client who sells highly technical products and has lots and lots (a couple of hundred) of PDF datasheets that can be downloaded from their website. But in order to download a datasheet, a user has to register on the site. Once they are registered, they can download whatever they want (I know this isn't a good idea, but it wasn't set up by us and is historical). A Moz crawl of the site came up with a couple of hundred 401 errors. When I investigated, they were all pages with a button to click through to get one of these downloads. The Moz error report calls the error "Bot verification". My questions are:
    Are these really errors?
    If so, what can I do to fix them?
    If not, can I just tell Moz to ignore them or will this cause bigger problems?

    | mfrgolfgti
    0

  • Hi, I'm trying to mark up products for a site that does not show prices. Is there any way to mark up a product price when the business model is: 1. the customer calls or contacts the shop; 2. the shop gives a price quote based on the level of detail and finish on the product; 3. there is no base or top price. Thanks in advance!

    | plahpoy
    0
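
    In schema.org itself the offers/price properties on Product are optional, so one option is to mark up the product with no price at all; a minimal JSON-LD sketch with placeholder values (note that Google's product rich results generally expect a price, review or rating to be present):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Custom finished oak table",
        "description": "Made to order; pricing is quoted per project after contacting the shop.",
        "brand": { "@type": "Brand", "name": "Example Workshop" }
      }
      </script>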

  • I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
      User-agent: *
      Disallow:
      Sitemap: https://www.siteA.org/sitemap
      Sitemap: https://www.siteB.com/sitemap
    Is this ok? I thought you needed one robots.txt per website which provides the URL for the sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?

    | ciehmoz
    0
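
    For comparison, the more conventional setup is one robots.txt served at the root of each host, each pointing at its own sitemap; a sketch using the question's placeholder domains:

      # served at https://www.siteA.org/robots.txt
      User-agent: *
      Disallow:
      Sitemap: https://www.siteA.org/sitemap

      # served at https://www.siteB.com/robots.txt
      User-agent: *
      Disallow:
      Sitemap: https://www.siteB.com/sitemap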

  • Hey guys, can anyone help me find broken outbound links on my website using Moz? Does Moz have this function?

    | rogerdam
    0

  • I have two sites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both sites, like this:

    | eulabrant
    0

  • How do I fix a 404 redirect chain? I can't seem to find the answer, and I'm worried about it affecting my SEO. Any help would be great!

    | sammecooper
    0

  • I have tons of links that I added redirects to after creating my company's new website. Is it bad to have all these 301s? How do I permanently redirect those links? Also, Google Search Console is telling me I have 1,000+ excluded URLs. Is this bad? Will it negatively affect me? Is this something to do with my sitemap? Any help would be greatly appreciated 🙂

    | sammecooper
    0

  • Our website for my olansi company in London, China has hundreds of pages dedicated to every service we provide to local areas in China. The total number of pages is approximately 100. Google caters pretty well to long-tail searches when it indexes all of these pages, so we usually get a fair amount of traffic when that happens. However, Google occasionally drops most of our indexed pages from search results for a few days or weeks at a time; for example, Google is currently indexing 60 pages, while last week it was back at 100. Can you tell me why this happens? When these pages don't appear, we lose a lot of organic traffic. What are we doing wrong? Site URL: https://www.olanside.com

    | sesahoda
    0

  • Curious if anyone else is having this problem. I have, for example, a page that is listed in Search Console as having a CLS of .44 - it is listed as a "CLS issue." The same page rendered in LightHouse shows 0 for field data CLS and 0.02 for lab data (both in the "green"). It has been over a month since I made updates to the page to improve CLS. I tried to submit a validation in Search Console, but "validation failed." I'm not sure what else to fix on the page when LightHouse data shows it as in the green! I have the same issue with other pages as well.

    | LivDetrick
    0

  • seo 4xx error error fix

    Hello! I have a new blog that is only one month old, and I already have over 3,000 4xx errors, which I've never had on my previous blogs. I ran a crawl on my site and it's showing my social media links being indexed as pages. For example, my blog post link is:
    https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/
    My site is then creating a link like the one below:
    https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/twitter.com/aliciajthomps0n
    But these are not real pages and I have no idea how they got created. I then paid someone to index the links because I was advised to by Moz, but it's still not working. All the errors are the same: it's indexing my Twitter account and my Pinterest. Can someone please help? I'm really at a loss with it.

    | thebloggersi
    0
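
    The error URLs in the question (the post URL with twitter.com/... appended) are the pattern a crawler produces when an href omits the scheme and is therefore resolved relative to the current page; a sketch of the difference, assuming the social links live in the theme or a share widget:

      <!-- scheme-less href: resolved relative to the post URL, producing the 4xx pages -->
      <a href="twitter.com/aliciajthomps0n">Twitter</a>

      <!-- absolute URL with the scheme included -->
      <a href="https://twitter.com/aliciajthomps0n">Twitter</a>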

  • serp favicon

    Hi, I have a website where the favicon is not showing in the Google mobile SERPs; the default icon (world icon) appears instead. This is the tag I have placed in the head section of the website: <link rel="shortcut icon" href="/favicon.ico" /> The favicon is 48x48 and it appears correctly in the browser tab. I've checked that the Google robot can crawl it, and in the server logs I can see requests from the "Google Favicon" user-agent. Has anyone had this same problem? Any advice?

    | dMaLasp
    0

  • MOZ Community, I am trying to gauge both the potential upside and downside of buying a few (relatively long) URLs that encompass some new keywords surfacing in our industry and creating permanent redirects from them to our branded website. [This wasn't my idea!] These URLs haven't previously had any content or owners, so their domain authority is low. Will Google still ding us for this behavior? I hope not, but I worry there might be some penalty for having a bunch of redirects pointing at our site. I have read that Google will penalize you for buying content-rich sites with high DA and redirecting those URLs to your site, but I am unclear about this other approach. It seems like a fairly mundane (and fruitless) play. I tried to explain that we won't reap any SEO rewards for owning these URLs (if there is no content), but that wasn't really heard. Thanks for any resources or information you can share!

    | ColleenHeadLight
    0

  • I added a script for star snippets to my website, but it doesn't work on my posts. You can see it at this URL: https://dlandroid.com/lucky-patcher/ When I search Google for my target keyword "Lucky patcher apk", my competitor appears with star snippets in the SERP, but my site doesn't show snippet stars.

    | hongloanj
    1
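
    A minimal sketch of rating markup for an app page in JSON-LD, with placeholder values; whether stars actually appear remains at Google's discretion, and the markup must match ratings that are visible on the page:

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "SoftwareApplication",
        "name": "Lucky Patcher",
        "operatingSystem": "Android",
        "applicationCategory": "UtilitiesApplication",
        "aggregateRating": {
          "@type": "AggregateRating",
          "ratingValue": "4.5",
          "ratingCount": "120"
        }
      }
      </script>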

  • I used HTTP status code 410 for some low-quality pages on my site in order to redirect them to the homepage. Is this useful for improving my homepage's authority?
    My website is: Nitamoshaver.com

    | ghorbanimahan
    0
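
    For clarity, a URL returns either a 410 (gone, nothing is passed anywhere) or a 301 redirect to the homepage; it cannot do both. A sketch of the two options, assuming an Apache server and a placeholder path:

      # Option A: tell crawlers the low-quality page is permanently gone (no redirect)
      Redirect gone /old-low-quality-page/

      # Option B: permanently redirect it to the homepage instead
      Redirect 301 /old-low-quality-page/ https://nitamoshaver.com/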

  • For our product page, we want to be able to show the pricing in the local currency of the visitor. I discussed this with our web developer and he said that we can create country-specific pages, so one for UK, Australia, etc. I am afraid that this solution might hurt our SEO as Google might see this as duplicated content. What are your thoughts about this? The website runs on WordPress.

    | Maggie.Casas
    0

  • OK, been trying to piece together what is best practice for someone I'm working with, so here goes; Website was redesigned, changed urls from url a to url b. 301's put in place. However, the new url structure is not optimal. It's an e-commerce store, and all products are put in the root folder now: www.website.com/product-name A better, more organized url structure would be: www.website.com/category/product-name I think we can all agree on that. However, I'm torn on whether it's worth changing everything again, and how to handle things in terms of redirects. The way I see things, it would result in a redirect chain, which is not great and would reduce link equity. Keeping the products in the root moving forward with a poor structure doesn't feel great either. What to do? Any thoughts on this would be much appreciated!

    | Tomasvdw
    0
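
    If the structure is changed again, the usual way to avoid a chain is to update the existing rules so every old URL points straight at the final destination in one hop; a sketch with the question's placeholder URLs, assuming an Apache-style redirect configuration:

      # chain to avoid: /old-url -> /product-name -> /category/product-name
      # point both generations of old URLs directly at the final URL:
      Redirect 301 /old-url https://www.website.com/category/product-name
      Redirect 301 /product-name https://www.website.com/category/product-name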

  • I have been trying to filter this traffic out of my Google Analytics data, since it all seems to be related to spam traffic. I have had multiple instances of using this filter:
    Custom Filter - Exclude - Browser Size - ^\(not set\)$
    (The backslashes before the parentheses were being stripped in the original message preview; the pattern is "(not set)" with both parentheses escaped, anchored at start and end.) Traffic seems to be filtered out appropriately at first, but then the filter stops working. When looking at a new site with Browser Size = (not set) traffic, the filter preview doesn't appear to work either. Am I implementing the filter incorrectly? How do I successfully filter this traffic out of the GA data? If I use the exact same method with RegEx in Google Data Studio, the filter works perfectly.

    | fuelmedical
    1

  • I have a client with a HUGE website with thousands of product pages. We don't currently have a sitemap.xml because generating one would take so much processing power. I have thought about creating a sitemap for just the key pages on the website, but I don't want to hurt the SEO of the thousands of product pages. If you have a sitemap.xml that only includes some of the pages on your site, will it negatively impact the other pages that Google has indexed but that are not listed in the sitemap.xml?

    | jerrico1
    0

  • Hello, and sorry in advance: English isn't my first language.
    I have a website with 13 years of history and activity. Five months ago we received a warning from our domain provider that our domain would be seized because of sanctions (I live in Iran), and they have seized many Iranian domains since then. I therefore decided to quickly move my website to another address so I could save as much of it as possible before they took the domain away.
    I moved the website to a new domain successfully and did everything necessary for a proper domain move (301 redirects for all links, template changes, and so on). I also used the Change of Address tool in Google Search Console so Google knows the new domain address and can update all of my links.
    Unfortunately, 90% of my traffic comes from Google, so we depend heavily on organic traffic.
    Since I changed the domain, traffic has been declining, and I now get only about 30% of the Google traffic I had on the old domain 5 months ago. (I also had some recent SEO troubles that could have made the decline worse.)
    Fortunately, the old domain wasn't seized by the domain provider, and I recently transferred it successfully to another provider, so the old domain is no longer at risk.
    My question is: should I move the website back to the old domain (cancel the change of address and use the tool again to move the new domain back to the old one)? The old domain has more than 13 years of history and many backlinks accumulated over that time. So far I cannot get good rankings for new posts on the new domain; sometimes Google doesn't even index new articles for several days, while the old domain still ranks well (I tested a new article on the old domain to see how it performs; it wasn't great, but I think it still ranks better than the new domain).
    My top pages and categories were redirected successfully, still rank well on Google at the new address, and haven't been affected negatively. My main problem is new posts that don't rank well or don't get indexed for several days.
    I don't know what to do now. Are 5 months not enough for Google to completely transfer all the signals from my old domain to the new one? Will all of the old domain's signals eventually transfer? What about the many backlinks pointing to the old domain (90% of which I cannot change or ask to be changed to the new address)? Will the value of those backlinks transfer to the new domain? On the other hand, I'm afraid to move the site back to the old domain because I don't know how Google would react; would all my rankings come back after moving back? Also, as far as I know, a Change of Address can only be cancelled within 6 months of using the tool, so I have roughly one month left to decide.
    Please, if anyone could help or guide me on what to do, it would be life-saving for me, because my whole income and my family depend on my website. 😞

    | Milad25
    0


  • Hi, I'm hoping someone can provide some insight. I Google-searched "citizenpath" recently and found that all of our sitelinks have identical text. The text seems to come from the site footer. It isn't using the meta descriptions (which we definitely have) or even a Google-chosen snippet from the page. I understand we don't have "control" over this. It's also worth mentioning that if you search for a specific page, like "contact us citizenpath", you'll get a more appropriate excerpt. Can you help us understand what is happening? This isn't helpful for Google users or for CitizenPath. Did the Google algorithm go awry, or is there a technical error on our site? We use up-to-date versions of WordPress and Yoast SEO. Thanks!

    | 123Russ
    0

  • Hi, the service area pages created on my Shopify website have not been indexed by Google for a long time. I tried requesting indexing manually and also submitted the sitemap, but the pages still don't seem to get indexed.
    Thanks in advance.

    | Bhisshaun
    0

  • Hi there, we upgraded our webshop last weekend and our Moz crawl on Monday found a lot of errors we are trying to fix. I'm having some communication problems with our webmaster, so I need a little help. We have extremely long category page URLs; does anyone have a guess as to what kind of mistake our webmaster could have made?
    https://site-name.pl/category-name?page=3?resultsPerPage=53?resultsPerPage=53 .... It keeps repeating the string ?resultsPerPage=53 exactly 451 times, as if there were some kind of loop. Thanks in advance for any kind of hint 🙂
    Kind regards,
    Isabelle

    | isabelledylag
    0

  • As of June 1, doctor pages on our website that say "No ratings are available yet" are being flagged as soft 404s in Google Search Console. We suspect the issue is that wording, based on this post: https://www.contentkingapp.com/academy/index-coverage/faq/submitted-soft-404/ Just wondering if anyone with more expertise in 404s or local SEO can confirm that this is likely the issue. Some examples:
    https://www.nebraskamed.com/doctors/neil-s-kalsi
    https://www.nebraskamed.com/doctors/leslie-a-eiland
    https://www.nebraskamed.com/doctors/david-d-ingvoldstad

    | Patrick_at_Nebraska_Medicine
    0

  • Our company implemented Google Shopping for our site for multiple countries, currencies and languages. Every combination of language and country is accessible via a URL path, and for all site pages, not just the pages with products for sale. I was not part of the project. We support 18 languages and 14 shop countries. When the project was finished, we had a total of 240 language/country combinations listed in our rel alternate hreflang tags for every page, 240 language/country combinations in our XML sitemap for each page, and unique canonicals for every one of these pages. My concern is duplicate content. I can also see that odd language/country URL combinations (like a country paired with a language spoken by a very low percentage of people in that country) are being crawled, indexed and appearing in SERPs. This uses up my crawl budget on pages I don't care about. I don't think it is wise to disallow URLs in robots.txt that we are simultaneously listing in the XML sitemap. Is it true that Google Shopping requires an XML sitemap and rel alternate hreflang entries for every language/country combination?

    | awilliams_kingston
    0


