
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi All, Just launched a new Magento store and set the category URL suffix to blank rather than .html or /. So the desired URL is https://www.example.com/category-1. But I am seeing a 301 redirect being implemented: https://www.example.com/category-1/ redirects to https://www.example.com/category-1. I can't see this in the list of 301 redirects within the redirect panel in Magento, but Moz and another redirect checker are picking it up. Am I missing a setting or something? Many Thanks,
    Pat

    | PaddyM556
    0
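A note for anyone hitting this: the trailing-slash redirect in a Magento store usually comes from the web server rather than Magento's redirect panel, which is why it doesn't show up there. A rule along these lines (a sketch only; the actual file and conditions vary by install) is the typical source in the store's .htaccess:

```apacheconf
# Strip the trailing slash from non-directory URLs with a 301
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

Since /category-1 is the desired URL here, a redirect from /category-1/ to /category-1 is generally the behavior you want, not a problem to fix.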

  • Hi Moz Community, I've recently created several new pages on my site using much of the same copy from blog posts on the same topics (we did this for design flexibility and a few other reasons). The blogs and pages aren't exactly identical, as the new pages have much more content, but I don't think there's a point to having both and I don't want to have duplicate content, so we've used 301 redirects from the old blog posts to the new pages of the same topic. My question is: can I go ahead and delete the old blog posts? (Or would there be any reasons I shouldn't delete them?) I'm guessing with the 301 redirects, all will be well in the world and I can just delete the old posts, but I wanted to triple check to make sure. Thanks so much for your feedback, I really appreciate it!

    | TaraLP
    1

  • Hi All, I am fairly new to the technical side of SEO and was hoping y'all could help me better understand the purpose of dynamic rendering with index.html pages and any implications they might hold for SEO. I work to support an eComm site that includes a subdomain for its product pages: products.examplesite.com. I recently learned from one of our developers that there are actually two sets of product pages - a set of pages that he terms "reactive,"  that are present on our site, that only display content when a user clicks through to them and are not retrievable by search engines. And then a second set of static pages that were created just for search engines and end in .index.html. So, for example: https://products.examplesite.com/product-1/ AND https://products.examplesite.com/product-1/index.html I am confused as to what specifically the index.html pages are doing to support indexation, as they do not show up in Google Site searches, but the regular pages do. Is there something obvious I am missing here?

    | Lauren_Brick
    0

  • Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has URL parameters disallowed in robots.txt, is it redundant to edit settings in Search Console's URL Parameters tool to anything other than "Let Googlebot Decide"?

    | LivDetrick
    0
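To make the comparison concrete: a robots.txt disallow stops the parameter URLs from being crawled at all, while the Search Console URL Parameters setting only influences how Googlebot crawls URLs it is otherwise allowed to fetch. A hypothetical robots.txt block for a sort parameter might look like:

```text
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
```

If the parameter URLs are already disallowed in robots.txt, the parameter tool has nothing left to act on for those URLs, so the settings are largely redundant for them. One caveat: disallowed URLs can still be indexed from links, just without their content being read.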

  • EX: https://www.STORENAME.com/collections/all-deals/alcatel – Tagged "Alcatel". When I run audits, I come across these URLs, which give me duplicate content and a missing H1. This is the canonical: https://www.STORENAME.com/collections/all-deals/alcatel Any advice on how to tackle these? I have about 4k in my store! Thank you

    | Sscha003
    0

  • I have a client that only wants UK users to be able to purchase from the UK site. Currently, there are customers from the US and other countries purchasing from the UK site. They want to have a single webpage that is displayed to users trying to access the UK site that are outside the UK. This is fine but what impact would this have on Google bots trying to crawl the UK website? I have scoured the web for an answer but can't find one. Any help will be greatly appreciated. Thanks 🙂

    | lbagley
    0

  • Hello, If we publish a blog post with a URL which accidentally contains a number at the end (blog.companyname.com/subject-title-0), is it best practice to update the URL (e.g. to blog.companyname.com/subject-title) and put in a 301 redirect from the old to the new one, or should it simply be left as is? I've read that 301s lose link equity and relevance, so is it really worth redirecting for the sake of a cleaner URL? Thanks for your input! John

    | SEOCT
    1
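If the redirect route is chosen, it is a one-line server rule. A sketch for an Apache .htaccess, using the placeholder slugs from the question:

```apacheconf
# Permanently redirect the accidental slug to the clean one
Redirect 301 /subject-title-0 https://blog.companyname.com/subject-title
```

A single 301 passes most link equity, so for a post with few external links the cost of redirecting for a cleaner URL is usually negligible either way.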

  • We have 2 separate sites for desktop (www.example.com) and mobile (m.example.com). As per the guideline, we have added a rel=alternate tag on www.example.com pointing to the mobile URL (m.example.com) and a rel=canonical tag on m.example.com pointing to the desktop site (www.example.com). However, I didn't find any guideline on what canonical tag we should add in this case: for the desktop site, www.example.com/PageA has a canonical tag to www.example.com/PageB, and on this page we have a rel=alternate tag to m.example.com/PageA. What canonical should we add for the mobile version of Page A? Should the canonical tag on m.example.com/PageA point to www.example.com/PageA or www.example.com/PageB? Kalpesh

    | kguard
    0
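For reference, the separate-URLs setup pairs each mobile page with its own desktop equivalent, which suggests m.example.com/PageA should canonicalize to www.example.com/PageA and let the desktop-side canonical consolidate on to PageB from there. A sketch of the tags involved, using the placeholders from the question:

```html
<!-- On www.example.com/PageA (desktop) -->
<link rel="canonical" href="https://www.example.com/PageB">
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/PageA">

<!-- On m.example.com/PageA (mobile) -->
<link rel="canonical" href="https://www.example.com/PageA">
```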

  • I have site issues on certain subpages which I am not focusing on for SEO, e.g. "News". I was wondering, do the issues on these subpages affect my overall SEO performance, i.e. the pages which I would like to rank, e.g. the "Products" page?

    | erica.lee
    0

  • Bing has removed my site from its index.

    | GSTPVT33
    0

  • Hi, I'm working for the first time on a Magento webshop. But I run into a problem where crawlers find tens of thousands of pages while there are only a few hundred products. I expect it has something to do with filters that generate dynamic URLs. I can't find any setting in Magento to prevent this, and I think this will hurt SEO performance because of duplicate content and the high number of pages that need to be crawled while the site has no authority. What would my approach be to solve this? Do I need to add certain tags to the pages, or are these settings in my robots file?

    | J05B
    0

  • Hello, I have a domain with GoDaddy, and the current site is hosted there as well. I want to leave my domain with GoDaddy and build a brand new site on HostGator. The current website was designed to get us started. It has no significant traffic, backlinks, or SEO. The domain is not really what I want. There are 80 pages, including some that are no longer in service. The keywords are not as relevant today. The current site domain is whiterocktech.net. The new site will be very different, with SEO leading the way. We have designed it but have not yet opened an account with HostGator. In addition, we have found a shorter, more appropriate domain name. Not ideal, but easy to type, though it has a dash. This site is wr-crm.com. Questions: Does it make sense to "cut bait" from the current site given the lack of use? Does it make sense to build the site and still set redirects from the old domain pages to the new one? Given so little traffic, is there really an effect on SEO if we sunset the old domain? Could I strip out the old domain website and just post a message on one page telling visitors to come to our new site until the old domain expires? I appreciate any insights on helping me with this decision. Mike

    | mmcgibbony
    0

  • The site in question uses Wordpress. They have a Resources section that is broken into two categories (A or B). Underneath each of these categories is 5 or 6 subcategories. The structure looks like this: /p/main-category-a/subcategory/blog-post-name /p/main-category-b/subcategory/blog-post-name All posts have a main category, but other posts often have multiple subcategories while some posts also fall into both main categories. What would be the easiest or most effective way to auto-populate the breadcrumb based on from where the person reached the blog post? So for example, a way to set Home -> Main Category -> Subcategory 1 as the breadcrumb if they reach it from the Subcategory 1 landing page. Or is this not possible and we should just set the breadcrumb manually based on where we feel it best lives? Thanks.

    | Alces
    0

  • All of our URLs in Google My Business are tagged with ?utm_source=gmb. This way, when people click on one within a Google Maps listing, knowledge graph, etc., we know it came from there. I'm assuming using a canonical on all ?utm_source pages (we have others, including some in the index) won't cause any problems with this, correct? Since they're not technically traditional organic SERPs? Dumb question I know, but better safe than sorry. Thanks.

    | Alces
    1
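For the record, the tag in question would simply point each tagged URL at its clean equivalent. Since the canonical only consolidates indexing signals and does not strip the parameter from the clicked link, the UTM data in analytics is unaffected. A sketch with a placeholder URL:

```html
<!-- On https://www.example.com/contact?utm_source=gmb -->
<link rel="canonical" href="https://www.example.com/contact">
```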

  • Hi Everyone, A client of ours has a Belgian website in 2 different languages, French and Dutch.
    We make use of hreflang tags, so each user gets to see the website in their preferred language. The landing page of the website, however, let's say www.example.be, has a 302 redirect to the French version (www.example.be/fr/), and Dutch users get to see the Dutch version (www.example.be/nl/) when they browse to www.example.be. Now, I want to get rid of the 302 redirect. Should I replace it with a 301 redirect? Should I just remove the redirect, without sending every user automatically to one of the 2 versions? Or should I just leave it this way? I would love to hear your thoughts on this. Jens

    | WeAreDigital_BE
    0

  • Hello all, I've spent the past day scouring guides and walkthroughs and advice and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowd source some advice in case I might be way off base. I'll start by saying that Technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague): PROBLEM I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a little bit of index bloat. THINGS TO CONSIDER A spot check revealed that traffic for each Event occurs for about a two-to-four week period then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page. So the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event. QUESTIONS I'M ASKING How do we address all these old events that provide no real value to the user? What should a future process look like to prevent this from happening? MY SOLUTION Step 1: Add a noindex to each of the currently-expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users. Step 2: Remove all of the expired Event pages from the Sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are not internal links pointing to them. 
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired. Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index. Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go. Thanks. Eager to hear all your thoughts.

    | Alces
    0

  • There are some conflicting beliefs here and I want to know what you think. If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary? Keep in mind, if it helps even in the slightest to improve rankings, I'm for it!

    | Colemckeon
    1

  • I work on a news site and we updated our Schema setup last week. Since then, valid Logo items are dropping like flies in Search Console. Both the URL inspector and the Rich Results test cannot seem to detect Logo on articles. Is this a bug, or can Googlebot really not see schema nested within other schema? Previously, we had both Organization and Article schema, separately, on all article pages (with Organization repeated inside the publisher attribute). We removed the separate Organization, and now just have Article with Organization inside the publisher attribute. The code is valid in the Structured Data Testing tool, but URL inspection etc. cannot detect it. Example: https://bit.ly/2TY9Bct By comparison, we also have Organization schema (un-nested) on our homepage. Interestingly enough, the tools can detect that no problem. That's leading me to believe that either nested schema is unreadable by Googlebot OR that this is not an accurate representation of Googlebot and it's only unreadable by the testing tools. The NEW schema setup has the same Article schema as before, but the separate script for Organization has been removed. We made the change to embed our schema for a couple of reasons: first, because Google's best practices say that if multiple schemas are used, Google will choose the best one, so it's better to just have one script; second, Google's codelabs tutorial for schema uses a nested structure to indicate hierarchy of relevancy to the page. My question is: does nesting schemas like this make it impossible for Googlebot to detect a schema type that's 2 or more levels deep? Or is this just a bug with the testing tools?

    | ValnetInc
    0
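For context, the nested shape described is the same one Google's article markup examples use, so nesting by itself should be readable. A minimal sketch of the structure in question (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2019-06-01",
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {
      "@type": "ImageObject",
      "url": "https://www.example.com/logo.png"
    }
  }
}
```

One plausible explanation is reporting rather than readability: the standalone Logo rich-result report may only count un-nested Organization markup, while the nested copy still does its job inside Article.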

  • Hi there I'd really appreciate any help you can give me. I want to redirect our old domain (example.com) to (somethingelse.com). They are both hosted separately. The old domain has a domain authority of 20 and never ranked well. We can't be sure Google simply doesn't like the old domain. I'll explore the links again to check. Another question is: do we even want to pass the old authority to the new website? Thank you.

    | kettlebellswing
    0

  • Hello, first we had this structure. Category: https://www.stoneart-design.de/armaturen/ Subcategory: https://www.stoneart-design.de/armaturen/waschtischarmaturen/ Often I see this: https://www.xxxxxxxx.de/badewelt/badmoebel/ I have heard it has something to do with layers so Google can index it better; is that true? Is "Badewelt" an extra layer? So I thought maybe we can better change this to: https://www.stoneart-design.de/badewelt/armaturen/ https://www.stoneart-design.de/badewelt/armaturen/waschtischarmaturen/ And after seeing that, I thought we could also do it like this, so the keyword is on the left, and instead of "badewelt" just use a "c" and put it at the back: https://www.stoneart-design.de/armaturen/c/ https://www.stoneart-design.de/armaturen/waschtischarmaturen/c/ I don't understand anymore which is the best one; to me it seems to be the last one. The reason was that this looks to me like keyword stuffing (see attached picture). Google did not index the same URL consistently, so I thought with this we can solve it. Also, we can use the word "whirlpools" only in the main category, and in the subs only the type without "whirlpools" in the text. Thanks. Regards, Marcel

    | HolgerL
    0

  • If I search for my client's phone number on Google, without gaps, ie 02036315541,  another company comes up at the top of the list. This company has a similar name to ours, but it is in a different town and it does different things. My company name is Energy Contract Renewals https://www.energycontractrenewals.co.uk/ and their company is https://energyrenewals.co.uk. As far as I can see, the other company does not mention our phone number anywhere on their site or on their GMB page so I don't know why they are coming up.  We do not come up at all for this search. However,  if I put our phone number in like this: 020 3631 5541, our company does come up and the other company does not. Anyone know how I can correct this or if it is even possible to do something about it?

    | mfrgolfgti
    1

  • Hey folks, have searched around and haven't been able to find an answer to this question. I've got a client who has very different search results when including his middle initial. His bio page on his company's website has the slug /people/john-smith; I'm wondering if we set up a duplicate bio page with his middle initial (e.g. /people/john-b-smith) and then 301 redirect it to the existent bio page, whether the latter page would get indexed by google and show in search results for queries that use the middle initial (e.g. "john b smith"). I've already got the metadata based on the middle initial version but I know the slug is a ranking signal and since it's a direct match to one of his higher volume branded queries I thought it might help to get his bio page ranking more highly. Would that work or does the 301'd page effectively cease to exist in Google's eyes?

    | Greentarget
    0

  • Hi, I bought my website this month and built it myself. Now I want to grow its SEO on Google. I tried Google Ads, but they always block my account and I don't know why. So could somebody help me, please? Is any company or individual willing to work on it for me? Even a paid company is no problem.

    | planetdocs
    1

  • Hello there, I have some low-quality pages on my site. Can I 301 redirect them to my homepage? My website is: https://idanito.com

    | dannybaldwin
    0

  • I do not have a sku, global identifier, rating or offer for my product. Nonetheless it is my product. The price is variable (as it's insurance) so it would be inappropriate to provide a high or low price. Therefore, these items were not included in my product schema. SD Testing tool showed 2 warnings, for missing sku and global identifier. Google Search Console gave me an error today that said:  'offers, review, or aggregateRating should be specified' I don't want to be dishonest in supplying any of these, but I also don't want to have my page deprecated in the search results. BUT I DO want my item to show up as a product. Should I forget the product schema? Advice/suggestions? Thanks in advance.

    | RoxBrock
    1
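A minimal sketch of Product markup with the disputed fields honestly omitted (all names are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Insurance Plan",
  "description": "Cover with pricing that varies by applicant.",
  "brand": { "@type": "Brand", "name": "Example Co" }
}
```

Markup like this remains valid schema.org, but Google's product rich result generally requires at least one of offers, review, or aggregateRating, so the likely outcome is a plain listing rather than a penalized page.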

  • Hi guys, We have this new online shop with over 1000 products (very technical products), synchronised with the SAP system of the company. So basically the page URLs are generated based on the following structure: Domain Name / Language / Product Category / Subcategory-1 / Subcategory-2 / Subcategory-3 / Product Name and Model Sometimes the URLs are over 130 characters length. Would this harm the shop's ranking, so should we really fix this, or it's something that can be ignored, having in mind the technical products in the shop? I would really appreciate your advice! Thanks!

    | Andreea-M
    0

  • Hi all, I have a client that has a unique search setup. They have Region pages (/state/city). We want these indexed and are using self-referential canonicals. They also have a search function that emulates the look of the Region pages. When you search for, say, Los Angeles, the URL changes to /search/los+angeles and looks exactly like /ca/los-angeles. These search URLs can also have parameters (/search/los+angeles?age=over-2&time[]=part-time), which we obviously don't want indexed. Right now my concern is how best to ensure the /search pages don't get indexed and we don't get hit with duplicate content penalties. The options are: 1. Self-referential canonicals for the Region pages, and disallow everything after the second slash in /search/ (so the main search page is indexed). 2. Self-referential canonicals for the Region pages, and write a rule that automatically canonicalizes all other search pages to /search. Potential concern: /search/ URLs are created even with misspellings. Thanks!

    | Alces
    1
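Option 1 can be expressed in a few lines of robots.txt; a sketch using Google's wildcard support, with the paths taken from the question:

```text
User-agent: *
# Keep the main search page crawlable
Allow: /search$
# Block individual search results and their parameter variants
Disallow: /search/
```

One caveat worth weighing: URLs blocked by robots.txt can't have their canonical tags read, so pick either the disallow approach or the canonicalize-to-/search approach rather than combining both on the same URLs.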

  • Hi all! My client is in healthcare in the US and for HIPAA reasons, blocks traffic from most international sources. a. I don't think this is good for SEO b. The site won't allow Moz bot or Screaming Frog bot to crawl it.  It's so frustrating. We can't figure out what mechanism they are utilizing to execute this.  Any help as we start down the rabbit hole to remedy is much appreciated. thank you!

    | SimpleSearch
    0

  • Hello there, I have a strange problem: if you search "probike" (Romanian google.ro), I get an OLD homepage, https://www.probike.ro, while my new homepage is https://probike.ro. The real problem is that the title/meta description are not available (see attachment). 301 redirect: OK
    Robots.txt: OK
    Sitemap: OK Help, it is annoying 🙂 With respect, Andrei

    | Shanaki
    0

  • Google is not showing the meta description for my website's keyword rankings in the SERPs. All of my ranking keywords are coming up with just two fields: 1. Title tag and 2. Page URL. The description tag is missing. Here is proof. Kindly advise, please.

    | seobac
    1

  • I'm working on a site which is really not indexing as it should. I created a sitemap.xml, which I thought would fix the issue, but it hasn't. What seems to be happening is that Google is making the www pages canonical for some of the site and the non-www pages for the rest. The site should be without www; see the images attached for a visual explanation.
    When adding pages in Google Search Console without www, some pages cannot be indexed because Google thinks the www version is canonical, and I have no idea why; there is no canonical set up at all. What I would do if I could is add canonical tags to each page pointing to the non-www version, but the CMS does not allow for canonicals. I'm not quite sure how to proceed, or how to tell Google that the non-www version is in fact correct. I don't have any idea why it's assuming www is canonical either.

    | Donsimong
    0

  • Hi, my website also opens via its IP address. I think it's duplicate content for Google. Only the home page opens via the IP, no other pages. How can I fix it? I might be able to do it using .htaccess, but I don't know the proper code for this. The website is on the WordPress platform. Thanks Ramesh

    | unibiz
    0
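Since the question asks for the .htaccess approach: a common pattern is to 301 any request that arrives by raw IP to the canonical hostname. A sketch (the IP and domain below are placeholders):

```apacheconf
# Redirect requests made to the server's bare IP to the real domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```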

  • I just found I'm having a redirect chain issue for http://ifixappliancesla.com (301 Moved Permanently). According to Moz, "Your page is redirecting to a page that is redirecting to a page that is redirecting to a page... and so on". These are the pages involved:
    http://ifixappliancesla.com
    https://ifixappliancesla.com
    https://www.ifixappliancesla.com/
    This is what Yoast support told me: "The redirect adds the https and then the www, ending at: https://www.ifixappliancesla.com/. You want all variants of your site's domain to end up at: https://www.ifixappliancesla.com/" - which is totally true. But I would also like not to have the redirect chain issue! Could you please give me advice on how to properly redirect my pages so I don't have that issue anymore?

    | VELV
    0
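The usual fix is a single rule that catches every non-canonical variant and sends it to the final destination in one hop instead of chaining https first and www second. A sketch for Apache, using the domain from the question:

```apacheconf
# Any request that is not already https + www gets one 301 to the final URL
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.ifixappliancesla.com/$1 [R=301,L]
```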

  • Our main website xyz.com has only 10K external links, while the geo-targeted site xyz.com/ca (which is for Canada) has 100K external links in Google Search Console. Most of the links showing as external links for xyz.com/ca are from xyz.com itself.
    Ex.:
    External links for xyz.com/ca:
    xyz.com/contact-us
    xyz.com/product
    xyz.com/category
    Is this possible? Or have we missed some technique to implement on geo-targeted sites?

    | Rajesh.Prajapati
    0

  • I have a redirect chain being flagged in Moz. This is the first time I have come across this, and I'm convinced it's causing an issue; the site is performing very badly. I have attached screen grabs of the redirect chain. The site is Cotswold Flat Roofing. Is this redirect chain causing me an issue, and what is the best practice for getting it fixed or removed? The site currently does not even show up in Google, even when searching for its own brand. I removed an x-robots noindex a week ago, and now the only issue that I can see is this redirect chain.

    | Donsimong
    0

  • Hi there, A number of our URLs are being de-indexed by Google. When looking into this using Google Search Console, the same message is appearing on multiple pages across our sites: 'Duplicate, submitted URL not selected as canonical'. Indexing allowed? Yes. User-declared canonical: https://www.mrisoftware.com/ie/products/real-estate-financial-software/ Google-selected canonical: https://www.mrisoftware.com/uk/products/real-estate-financial-software/ Has anyone else experienced this problem? How can I get Google to select the correct, user-declared canonical? Thanks.

    | nfrank
    0

  • My domain was indexed with HTTPS://WWW. Now that we redirected it, the certificate has been removed, and if you try to visit the old site with https it throws an obvious error that the site's not secure, and the 301 does not happen. My question is: will Google's bot have this issue? Right now the domain has been in redirection status to the new domain for a couple of months, and the old site is still indexed, while the new one is not ranking well for half its terms. If that is not causing the problem, can anyone tell me why the 301 would take such a long time? I've double and quadruple checked the 301s and all settings to ensure it's being redirected properly, yet it still hasn't fully redirected. Something is wrong, and my client's ready to ditch the old domain we worked on for a good amount of time. Background: About 30 days ago we found some redirect loops... well, not loops, but it was redirecting from the old domain to the new domain several times without error. I removed the plugins causing the multiple redirects, and now we have just one redirect from any page on the old domain to the new https version. Any suggestions? This is really frustrating me and I just can't figure it out. My only answer at this point is to wait it out, because others have had this issue where it takes up to 2 months to redirect the domain. My only issue is that this is the first domain redirect, out of many, that has ever taken more than a week or three.

    | waqid
    0

  • Hi, we have recently changed our brand name after 7 years and have changed our root domain to match (33shake.com since 2012, now 33fuel.com). The site is the same (no migration to a new one), as there were no other business changes apart from the name/domain. 301 redirects are looking after all former 33shake.com links, which are now being redirected to their new 33fuel.com equivalents (slugs are the same in 99% of cases). My question is: we have a lot of backlinks to our old domain (33shake.com) in our own content, via our YouTube channel (100+ videos) and also our podcast (64 episodes in, broadcast on 10 platforms). For maximum SEO benefit as we continue to restore domain authority, etc. to 33fuel.com, are we best to leave these historical backlinks pointed at the old domain and let the redirects pick them up when people click? Or are we better off swapping all of these old historical backlinks so they point directly to the new domain? Any advice would be greatly appreciated; this is quite a maze we are now picking our way through! Warren

    | WP33
    1

  • I have an interesting problem with a site which has an x-robots tag blocking the site from being indexed. The site is in WordPress; there are no issues with the robots.txt or at the page level, and I can't find the noindex anywhere. I removed the SEO plug-in which was there and installed Yoast, but it made no difference. This is the URL: https://www.cotswoldflatroofing.com/ It's coming up with an HTTP error: x-robots tag noindex, nofollow, noarchive

    | Donsimong
    0
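Because the X-Robots-Tag is an HTTP header, it is usually set by the server or hosting layer rather than by WordPress itself; running `curl -I https://www.cotswoldflatroofing.com/` will confirm where it appears. As a stopgap, .htaccess can strip it (a sketch, assuming mod_headers is enabled):

```apacheconf
# Remove a noindex X-Robots-Tag set further up the stack
Header always unset X-Robots-Tag
```

The real fix is finding what sets it: common culprits are a hosting "discourage search engines" toggle, a CDN setting, or a staging-environment server config that was carried to production.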

  • Hi, I know it's best practice to redirect a website from http to https, instead of having many entry points to your website. When a website has been running for a long time on both http and https, what are the SEO pros and cons of implementing a redirect from http to https?

    | FreddyKgapza
    1

  • A new page template was created. The plan is to publish the new page (which has the same URL as before) to the web and delete the old page that has that URL. Will that have any SEO implications?

    | lina_digital
    1

  • If some websites, which provide information about apps in a particular niche, are publishing the same content which we have given in our app's description when they refer our app for that particular niche then would it lead to spamming? Our website is getting a backlink from one such website so are we at any sort of risk? What should we do about it without having to lose that backlink?

    | Reema24
    0

  • Hello, Our company is international and we are looking to gain more traffic specifically from Europe. While I am aware that translating content into local languages, targeting local keywords, and gaining more European links will improve rankings, I am curious if it is worthwhile to have a company.eu domain in addition to our company.com domain. Assuming the website's content and domain will be exactly the same, with the TLD (.eu vs .com) being the only change - will this add us benefit or will it hurt us by creating duplicate content - even if we create a separate GSC property for it with localized targeting and hreflang tags? Also - if we have multiple languages on our .eu website, can different paths have differing hreflangs? IE: company.eu/blog/german-content German hreflang and company.eu/blog/Italian-content Italian hreflang. I should note - we do not currently have an hreflang attribute set on our website as content has always been correctly served to US-based English speaking users - we do have the United States targeted in Google Search Console though. It would be ideal to target countries by subfolder rather if it is just as useful. Otherwise, we would essentially be maintaining two sites. Thanks!

    | Tom3_15
    0

  • Hello, Our site was hacked, which created a few thousand spam URLs on our domain. We fixed the issue, and all the spam URLs now return 404. The Google index shows a couple of thousand bad URLs. My question is: what's the fastest way to remove the URLs from the Google index? I created a sitemap with some of the bad URLs and submitted it to Google. I am hoping Google will crawl them as they are in the sitemap and remove them from the index, as they return 404. Any tools to get a full list of the Google index? (Search Console downloads are limited to 1,000 URLs.) A Moz site crawl gives a larger list, which includes URLs not in the Google index too. I'm looking for a tool that can download results from a site: search. Any way to remove the URLs from the index in bulk? Removing them one by one will take forever. Any help or insight would be very appreciated.

    | ajiabs
    1

  • What is the best way to handle all the different variations of a website in terms of www | non-www | http | https? In Google Search Console, I have all 4 versions and I have selected a preference. In Open Site Explorer I can see that the www and non-www versions are treated differently, with one group of links pointing to each version of the same page. This gives a different PA score. E.g.: http://mydomain.com DA 25 PA 35; http://www.mydomain.com DA 19 PA 21. Each version of the home page has its own set of links and scores. Should I try to "consolidate" all the scores into one page? Should I set up redirects to my preferred version of the website? Thanks in advance

    | I.AM.Strategist
    0

  • I'm working on some htaccess redirects for a few stray pages and have come across a few different varieties of 301s that are confusing me a bit. Most sources suggest: Redirect 301 /pageA.html http://www.site.com/pageB.html or using some combination of RewriteRule + RewriteCond + regex. I've also found examples of: RedirectPermanent /pageA.html http://www.site.com/pageB.html I'm confused because our current htaccess file has quite a few (working) redirects that look like this: Redirect permanent /pageA.html http://www.site.com/pageB.html This syntax seems to work, but I've yet to find another "Redirect permanent" in the wild, only examples of Redirect 301 or RedirectPermanent. Is there any difference between these? Would I benefit at all from replacing Redirect permanent with Redirect 301?

    | SamKlep
    1
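For what it's worth, the three mod_alias spellings in the question are equivalent: `permanent` is simply the keyword form of status 301, and `RedirectPermanent` is a convenience directive for the same thing:

```apacheconf
# All three lines below issue the same 301 permanent redirect
Redirect 301 /pageA.html http://www.site.com/pageB.html
Redirect permanent /pageA.html http://www.site.com/pageB.html
RedirectPermanent /pageA.html http://www.site.com/pageB.html
```

So replacing `Redirect permanent` with `Redirect 301` changes nothing functionally; it is purely a style choice.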

  • I have a page showing up in the Insights report as being part of a redirect chain. This page, however, does not exist as far as I can tell: it is not on my dashboard anywhere, and pointing a browser to it produces a messy page with WordPress theme error code spat out. How do I track this down to clean it up if the page does not exist within my WordPress installation? The page for reference is https://butlermobility.com/dealers/downloads. As it stands today, the dealers and downloads pages are separate; there is no downloads sub-page within the dealers section.

    | NiteSkirm
    0

  • Hi, I have an ecommerce website with more than 50k URLs and only 10% or so are getting crawled regularly by Google.
    Product listing pages represent roughly 80% of these 50k pages. Trying to improve this, I was thinking to remove altogether all (most?) of my product listing pages from search (via robots.txt) to keep only the product pages themselves and the product categories. My organic situation since Jan 2019:
    Users: 2,300,000 (of which 9% are visiting product listing pages)
    Page views:  8,000,000 (of which 5% are product listing pages). Am I about to unleash armageddon (or more like harakiri) on my website by doing so or actually get Google to crawl much more relevant resources (product pages, product categories, blog content and so on)? Thanks,
    G

    | GhillC
    0

  • I'm implementing Schema.org (JSON-LD) on an eCommerce site. Each product has a few different variations, and these variations can change the price (think T-shirts, where blue & white cost $5, red is $5.50, and yellow is $6). In my Schema.org markup, in each Product's Offer, I could either have a single Offer with a price range (minPrice: $5, maxPrice: $6), or I could add a separate Offer for each variation, each with its own correct price set. Is one of these better than the other? Why? I've been looking at the WooCommerce code and they seem to do the single offer with a price range, but that could be because it's more flexible for a system that's used by millions of people.

    | 4RS_John
    1
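For reference, the price-range pattern is normally expressed with AggregateOffer rather than a plain Offer. A sketch using the T-shirt numbers from the question (all values illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-shirt",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "5.00",
    "highPrice": "6.00",
    "offerCount": "4"
  }
}
```

Per-variation Offers carry more detail (each color's exact price and availability), while a single AggregateOffer is simpler to keep in sync; both are valid, so the choice is mostly about how much variant data you can reliably emit.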

  • In German we have declension; e.g. the keyword "sicherheitskritische Systeme" becomes "sicherheitskritischer Systeme" when declined. How does Google handle this? Do I have to rewrite all texts so that the keywords are not declined? (At least Moz Pro's ranking algorithm is sensitive and did not accept "sicherheitskritischer" for the keyword "sicherheitskritische"...) Rewriting might lead to quite awful sentences! Thanks a lot Andy

    | a2stucki
    0
