
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Is there any benefit in requesting manual removal over using the Google disavow file? Or is it just extra work for little result? I.e. is it better to just disavow straight away and not mess around? Thanks

    | seoman10
    0

  • Should the h1 tag be used for the main page content or the logo? I understand the original method was to H1 the logo with the main search term; does this still hold true, or should it be content focused?

    | seoman10
    0

  • Hi, We have product markup on our site, data-vocab rather than schema. I can't see it showing in Google SERPs, but when testing it appears to be correct. Are Google still selective with what schema they show for a site? Thanks

    | BeckyKey
    0

  • Hello, We've had the known problem of duplicate content through the gclid parameter caused by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site, so when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.

    | MyPetWarehouse
    0

  • Hi guys, I was wondering if anyone knew of free TF*IDF analysis tools on the market? I know about onpage.org and Text-tools.net, both paid. I was wondering if anyone knows of other tools? Cheers, Chris

    | jayoliverwright
    1
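
    Not a tool pick, but the TF*IDF arithmetic these tools perform is simple enough to sketch in pure Python (the sample documents below are invented for illustration):

    ```python
    import math
    from collections import Counter

    def tf_idf(docs):
        """Score each term in each document by TF*IDF.

        docs: list of tokenised documents (lists of terms).
        Returns one dict of term -> score per document.
        """
        n = len(docs)
        # Document frequency: in how many documents does each term appear?
        df = Counter(term for doc in docs for term in set(doc))
        scores = []
        for doc in docs:
            tf = Counter(doc)
            scores.append({
                # Term frequency in this doc, weighted down for common terms.
                term: (count / len(doc)) * math.log(n / df[term])
                for term, count in tf.items()
            })
        return scores

    docs = [
        "seo tools for content analysis".split(),
        "content marketing tools".split(),
        "paid seo analysis".split(),
    ]
    scores = tf_idf(docs)
    ```

    Terms unique to one document (like "for" above) score higher than terms shared across documents, which is the signal these analysis tools surface.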

  • Hi, I am trying to rank a website in a competitive niche (health related). I am looking to hire someone really good at outreach and link building. I have tried Elance, and mostly Indian companies on it which couldn't really deliver. I have also checked Moz's recommended list, and those guys are way too pricey for start-ups. Are there members on the forum who come recommended? I am happy to work out reasonable fees and possibly also discuss equity in the business. Any help would be appreciated.

    | sarmad_malik
    0

  • Hi, We have recently gone through a domain change and thought we had done this by the book. We 301'd the old site extensively and optimized our new site as best we could, all done with the help of a leading agency. However, our old site had a DA score of 49, and checking our new site's DA almost 3 months after the change, it scores 9! Furthermore, with rankings: 2-3 weeks after the change we saw small drops (1-3 positions); weeks 3-10 saw more gradual slides, but we remained on page 1 for most terms; currently some core terms are slipping back to page 2. I have the below questions on this and would like to get your opinions and views. After a well planned domain change, would it be expected that the DA score would remain in the same region, or is a drop like this normal? Is the domain change the most likely cause of this drop, or are there other factors that may cause this? What is the best approach for troubleshooting this situation?

    | SEO-SMB
    0

  • Hi, I have recently moved my WordPress blog to a new server. Previously I had a URL of website.com/blog; my blog is now running on the domain website.com. Now most of my images are in the correct folder path, wp-content/uploads. However, some of my images are pointing to a folder /blog/wp-content/uploads, and so I am getting many missing images on the front end. How do I get the /blog/wp-content/uploads paths to point to the new URL, wp-content/uploads? Thanks guys.. Taiger

    | Taiger
    0
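
    For context, the core of this kind of fix is rewriting the stale /blog/wp-content/uploads prefix wherever it appears in post content; a minimal sketch of the string rewrite (the sample HTML is invented — in practice a tool like wp-cli's `search-replace` command does this against the database and handles serialized data safely):

    ```python
    def fix_upload_paths(html,
                         old_prefix="/blog/wp-content/uploads",
                         new_prefix="/wp-content/uploads"):
        # Rewrite image paths left over from the /blog -> site-root move.
        return html.replace(old_prefix, new_prefix)

    html = '<img src="https://website.com/blog/wp-content/uploads/photo.jpg">'
    fixed = fix_upload_paths(html)
    ```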

  • Hi Moz Community! Most of the category & sub-category pages on one of our client's ecommerce site are actually filtered internal search results pages. They can configure their CMS for these filtered cat/sub-cat pages to have unique meta titles & meta descriptions, but currently they can't apply custom H1s, URLs or breadcrumbs to filtered pages. We're debating whether 2 out of 5 areas for keyword optimization is enough for Google to crawl these pages and rank them for the keywords they are being optimized for, or if we really need three or more areas covered on these pages as well to make them truly crawlable (i.e. custom H1s, URLs and/or breadcrumbs)…what do you think? Thank you for your time & support, community!

    | accpar
    0

  • We are looking for an internal duplicate content checker that is capable of crawling a site that has over 300,000 pages. We have looked over Moz's duplicate content tool and it seems like it is somewhat limited in how deep it crawls. Are there any suggestions on the best "internal" duplicate content checker that crawls deep in a site?

    | tdawson09
    1

  • We have a client who has asked us to talk to link brokers to speed up the backlinking process. Although I've been aware of them for ages, I have never openly discussed the possible use of 'buying' links or engaging in that part of the industry. Do they have a place in SEO, and if so, what are the Moz community's thoughts?

    | wearehappymedia
    0

  • I need to do a backlink audit on an e-commerce site. They have a large number of backlinks; a large proportion of them are okay, but we seem to have collected a few bad ones (not built by us) along the way. I need a recommendation of a good quality, reliable tool or set of tools with a proven track record. Thanks

    | seoman10
    0

  • Hi everyone, It seems that Panda 4.2 has hit some industries more than others. I just started working on a website that has no manual action, but the organic traffic has dropped massively in the last few months. Their external linking profile seems to be fine, but I suspect usability issues, especially the duplication, may be the reason. The website is a recruitment website in a specific industry only. However, they post jobs for their clients that can be very similar, and at the same time they can have 20 jobs with the same title and very similar job descriptions. The website currently has over 200 pages with potential duplicate content. Additionally, these jobs get posted on job portals with the same content (this happens automatically through a feed). The questions here are: How bad would this be for the website usability, and would it be the reason the traffic went down? Is this the effect of Panda 4.2 that is still rolling out? What can be done to resolve these issues? Thank you in advance.

    | iQi
    0

  • Would there be any positive effect from editing a sitemap down to a more curated list of pages that perform, or that we hope will begin to perform, in organic search? A site I work with has a sitemap with about 20,000 pages that is automatically created by a Drupal plugin. Of those pages, only about 10% really produce out of search. There are old sections of the site that are thin, obsolete, discontinued and/or noindexed that are still on the sitemap. For instance, would it focus Google's crawl budget more efficiently, or have some other effect? Your thoughts? Thanks! Best... Darcy

    | 94501
    0

  • Hi, I wondered if anyone knew of any case studies to reinforce the importance of on-page reviews for SEO & increasing traffic. I'd like to push it in my company; however, it would be great to show them some results from a case study. Thank you!

    | BeckyKey
    1

  • I have a client that owns a business that could easily be divided into two separate businesses in terms of SEO. Right now his web site covers both divisions of his business. He gets about 5500 visitors a month. The majority go to one part of his business, and around 600 each month go to the other, so about 11%. I'm considering breaking off this 11% and putting it on an entirely different domain name. I think I could rank better for this 11%. The site would only be SEO'd for this particular division of the company. The keywords would not be in competition with each other. I would of course link the two web sites and watch that I don't run into any duplicate content issues. I worry about placing the redirects from the pages that I remove to the new pages. I know Google is not a fan of redirects. Then I also worry about the eventual drop in traffic to the main site. How big a factor is traffic in rankings? Other challenges include that the business services 4 major metropolitan areas. Would you do this? Have you done this? How did it work? Any suggestions?

    | MSWD
    0

  • Hey guys: I tried to download all the crawl data from Google Search Console using the API and solutions like this one: https://github.com/eyecatchup/php-webmaster-tools-downloads but it seems that it is no longer working (or I did something wrong; I just receive a blank page when running the PHP file after some load time)... I needed to download more than 1,000 URLs a long time ago, and I haven't tried this method since then. Is there any other solution using the API to grab all the crawl errors, or is this not possible anymore? Thanks!

    | antonioaraya
    1

  • Because I want to increase site speed, SiteGround (my hosting) suggested I use Cloudflare Plus, which needs my site to have www in order to work. I'm also using cloud hosting. I'm a bit scared of doing this, and thus decided to come to the community. I've used Moz for over 6 months now and love the tool. Please help me make the best possible decisions and work out what steps to follow. It would be much appreciated. Thank you!

    | Andrew_IT
    0

  • In an interesting study by DeganSEO titled 'Negative Impact of 301 Redirects - A Case Study', a drop in rankings was observed when popular blog posts were redirected to product pages. One hypothesis is that the suppression is due to the topical difference between the redirected pages (blog posts) and the target page. The topical difference issue is an interesting one when you consider it in the context of website migrations. We always recommend that 301 redirects are done at a page level, and that if an equivalent page doesn't exist, to 301 anyway but to the most logical page. If you think about it, Google are likely to frown on this because a) it's not a good experience for the user - a 404 would be more accurate for them;
    b) it's lazy - if you have good content that has gained authority/trust, then create the same content on the new site; don't try to pass that to an entirely different page. Thoughts? Experiences?

    | QubaSEO
    0

  • Are there any tools out there to see historical positions of keywords for competitors? I haven't been tracking the keywords, just wondered if there is any cached data anywhere?

    | seoman10
    0

  • Hello All 🙂 Since launching my new website design - www.advanced-driving.co.uk - I am not convinced Google is seeing all the content on the page. I took a long extract of text and did a search on Google, and nothing was found. Also, although in the search results for "advanced driving course" I can see the new title tag, the snippet isn't showing. Is there any way I can check this? As I scroll down I can see the URL changes, i.e.: www.advanced-driving.co.uk
    then:
    http://www.advanced-driving.co.uk/#da-page_in_widget-3
    then:
    http://www.advanced-driving.co.uk/#da-page_in_widget-4
    then:
    http://www.advanced-driving.co.uk/#da-page_in_widget-5 Is this right? Thanks in advance..

    | robert78
    0


  • Have been looking at WMT and discovered that on 10 October there was a substantial average position drop, from around 14 to 20.
    Does anyone have any ideas why this may have happened? As far as I know, there's nothing we have done that could have impacted it.

    | seoman10
    0

  • Hi, I'm trying to work out if there's something wrong with our pagination. We include rel="next" and "prev" on our pages. When clicking on page 2 of a product page, the URL will show as something like /lockers#productBeginIndex:30&orderBy:5&pageView:list& However, if I search site:http://www.key.co.uk/en/key/lockers in Google, it seems to find paginated pages: http://www.key.co.uk/en/key/lockers?page=2 I have a feeling something is going wrong here, but I haven't worked much on pagination before. Can anyone help?

    | BeckyKey
    0

  • A problem has been introduced onto our sitemap whereby previously excluded URLs are no longer being correctly excluded. These are returning an HTTP 400 Bad Request server response, although they do correctly redirect for users. We have around 2300 pages of content, and around 600-800 of these previously excluded URLs. An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does correctly redirect for users). The site is currently being rebuilt and only has a life span of a few months. The cost our current developers have quoted for resolving this is quite high with this in mind. I was just wondering: How critical an issue would you view this as? Would it be sufficient (bearing in mind this is an interim measure) to change these pages so that they had a canonical or a redirect? They would, however, remain on the sitemap. Thanks
    Kate

    | KateWaite
    0

  • Hi All, My website is 10 years old and has decent rankings. The domain is www.advanced-driving.co.uk. I have recently had a major overhaul of the site; before, it was very outdated, with lots of duplicated content. My main keywords are "advanced driving course" and "advanced driving courses", both of which I am on page 1 for. However, since I have been live with the new site (5 days) I am not ranking for some easy win keywords. I have submitted new content through Webmaster Tools, and whilst some content is ranking, other content is not. The content not ranking is fresh and unique (I have used Copyscape on all new pages). For example, my homepage is on page 1 for "advanced driving courses london" - around rank 6. So I hand made some content titled advanced driving courses london to provide more of an exact match, outlining our courses in London and the routes we take - http://www.advanced-driving.co.uk/defensive-advanced-driving-courses-london/ However, this page, which is unique, does not rank at all... I have done this with another website and it worked well, but Google is not understanding this at all. Also, I am now on page 1 for "advanced driving course" but not for "advanced driving courses" - well, I am, but the page ranking for the plural keyword is a page not really related - surely Google's semantic search should realise course and courses are the same! I suspect that Google is still getting used to my new website? No errors or anything in Webmaster Tools... Can anyone confirm this - or tell me if I have done something awful..!! Thanks Rob

    | robert78
    0

  • Hi guys, We are in the process of launching an international ecommerce site (Magento CMS) for two different countries (Australia and the US), then later on expanding to other countries like the UK, Canada, etc. The plan is for each country to have its own sub-folder, e.g. www.domain.com/us, www.domain.com/au, www.domain.com/uk. A lot of the content between these English based countries is the same, e.g. the same product descriptions.
    So in order to prevent duplication, from what I've read we will need to add hreflang tags to every single page on the site - one set for the Australian pages and one for the United States pages. Just wanted to make sure this is the correct strategy (will hreflang prevent duplicate content issues?) and ask if there is anything else I should be considering? Thank you, Chris

    | jayoliverwright
    0
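
    For context, hreflang markup lists every language/region version of a page, each version including itself and all the others; a sketch of generating the tags (the locale-to-URL map below is hypothetical, modelled on the sub-folder plan in the question):

    ```python
    # Hypothetical locale -> URL map for one product page.
    ALTERNATES = {
        "en-au": "https://www.domain.com/au/product",
        "en-us": "https://www.domain.com/us/product",
        "x-default": "https://www.domain.com/product",
    }

    def hreflang_tags(alternates):
        # Every version of the page carries the full set of link elements,
        # including a self-reference; the set must be identical across versions.
        return "\n".join(
            f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in alternates.items()
        )

    tags = hreflang_tags(ALTERNATES)
    ```

    Note hreflang signals equivalence between regional versions; it is not a canonical tag, so it addresses the "same English content in two folders" situation rather than duplication in general.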

  • Hi, Does anyone know how to properly add facet URLs to robots.txt? E.g. one of our facet URLs - http://www.key.co.uk/en/key/platform-trolleys-trucks#facet:-10028265807368&productBeginIndex:0&orderBy:5&pageView:list& Everything after the # would need to be blocked on all pages with a facet. Thank you

    | BeckyKey
    0
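
    One detail worth checking before writing any robots.txt rules here: everything after `#` is a URL fragment, which browsers never send to the server, so crawl directives can only ever match the path and query string. That split is easy to confirm:

    ```python
    from urllib.parse import urlparse

    # The facet URL from the question.
    url = ("http://www.key.co.uk/en/key/platform-trolleys-trucks"
           "#facet:-10028265807368&productBeginIndex:0&orderBy:5&pageView:list&")
    parts = urlparse(url)
    # parts.path is all a server-side rule (robots.txt included) can match;
    # parts.fragment stays in the browser and never reaches the crawler.
    ```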

  • Hi, We have 3 pages we won't use anymore on our website. Let's call them URL A, URL B and URL C. To keep their SEO strength on our domain, I've thought about redirecting all of them to URL D. From what I understand, when 301 redirecting, about 85-90% of the link SEO juice is passed. So if I redirect 3 URLs to the same page... does URL D receive the link SEO juice of all the redirected URLs added up? (approximately)
    e.g. future URL D juice = 100% current URL D juice + 85% URL A juice + 85% URL B juice + 85% URL C juice Is this the best practice, or is there a better way? Cheers,

    | viatrading1
    0
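
    The arithmetic in the question can be written out directly; a sketch (the 85% retention figure is the question's own assumption, not a confirmed Google number, and the page values are invented):

    ```python
    RETENTION = 0.85  # assumed share of value passed through a 301 (from the question)

    def combined_value(target_value, redirected_values, retention=RETENTION):
        # Value of page D after 301-redirecting pages A, B and C into it:
        # its own value plus the retained share of each redirected page.
        return target_value + retention * sum(redirected_values)

    d = combined_value(100, [40, 30, 20])  # hypothetical values for D, A, B, C
    ```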

  • Is it worthwhile going after a good score on Google PageSpeed? I've had prices, but it's a LOT of money, and I don't know if it's worth it or not. Also, to add to the complication, it is a new site. Does anyone have any experience of whether it helps rankings? Thanks

    | seoman10
    0

  • Hi We have a number of sister companies and link to them via a drop down in the footer - are these links as dangerous as anchor text links?

    | BeckyKey
    0

  • Hi Mozzers, I've noticed that I have some 302 redirects on my website which have been there for some time. They should really be 301s, but I am wondering if 302s pass link juice or not, as from what I've read they don't. I just wanted to check if anyone knew for sure. Thanks, Pete

    | PeteC12
    0

  • Dear Mozzers, I have an issue with my UK website (short URL - http://goo.gl/dJ7IgD) whereby when I type my company name into google.co.uk search, the .com version appears in search as opposed to the .co.uk, and from looking at Open Site Explorer the page rank of the .com is higher than the .co.uk. In fact, I can't even see the .co.uk homepage version, just other pages from my site. The .com version is also 301'd to the .co.uk. From looking at Open Site Explorer, I have noticed that we have more links pointing to the .com as opposed to the .co.uk. A lot of these are from our own separate microsites, which we closed down last year, and I have noticed the IT company who closed them down for some reason 301'd them to the .com version of our site as opposed to the .co.uk. But if I use http://httpstatus.io/ (an HTTP status checker tool) to check one of these microsites, it shows 301 - 302 - 200 status codes, which to me looks wrong. I am wondering what it should read... e.g. should it just be a 301 to a 200 status code? My website short URL is http://goo.gl/dJ7IgD, and examples of some of the 10 microsites we closed down last year, which seem to be redirected to the .com, are http://goo.gl/BkcIjy and http://goo.gl/kogJ02 As these were redirected almost a year ago, is it okay if I now get them redirected to the .co.uk version of my site, or what should I do? They currently redirect to the home page, but given that each of the microsites is based on an individual category of my main site, would it be better to 301 them to the relevant category on my site? My only concern is that this may cause too much internal linking and therefore I won't have enough links on my homepage? How would you suggest I go about building up my .co.uk authority so it ranks better than the .com? I am guessing this is obviously affecting my rankings and I am losing link juice with all this. Any advice greatly appreciated. Thanks Pete

    | PeteC12
    0

  • Hi, I have a couple of questions regarding the best way to optimise SKU pages on a large ecommerce site. At the moment we have 2 landing pages per product - one is the primary landing page with no SKU, the other includes the SKU in the URL so our sales people & customers can find it when using the search facility on the site. The SKU landing page has a canonical pointing to the primary page, as they're duplicates. Is this the best way? Or is it better to have one page with the SKU in the URL? Also, we have loads of products with very similar product descriptions. I am working on trying to include a unique paragraph or a few sentences on these to improve the content - how dangerous is duplicate content within your own site? I know it's best to have totally unique content, but it won't be possible on a site with thousands of products and a small team. At the moment I am trying to prioritise the products to update. Thank you 🙂

    | BeckyKey
    0

  • Hi there, my company is a national theater news publisher. Quick question about a particular use case. When an editor publishes a story they can assign several discrete locations, allowing it to appear on each of those locations within our website. This article (http://www.theatermania.com/denver-theater/news/full-casting-if-then-tour-idina-menzel_74354.html), for example, appears in the Los Angeles, San Francisco, and Denver section. We force the author to choose a primary location from that list, which controls the location displayed in the URL. Is this a bad practice? I'm wondering if the fact that having 'Denver' in the URL is misleading and hurts SEO value, particularly since that article features several other cities.

    | TheaterMania
    0

  • I run this website: http://knowledgeweighsnothing.com/ It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter curation style posts. Basically I would find excellent sources of information, do a short post highlighting the information and link to the original source (and then post to FB and hey presto, 1000s of visitors going through my website). Traffic was so amazing from FB at the time that, really stupidly, these posts were written with no regard for search engine rankings. When Facebook reach etc. dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc., but there's still lots to improve. I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality/backlinks and PA. This will probably run into 100s of posts. Is it detrimental to delete so many weak posts from a website? Any and all advice on how to proceed would be gratefully received.

    | xpers
    1

  • Hello, We have a site that manages pagination through parameters in URLs, this way: friendly-url.html
    friendly-url.html?p=2
    friendly-url.html?p=3
    ... We've recently added the canonical tag pointing to friendly-url.html for all paginated results. In Search Console, we have the "p" parameter identified by Google.
    Now that the canonical has been added, should we still configure the parameter in Search Console and tell Google that it is being used for pagination? Thank you!

    | teconsite
    0

  • Hey Dudes, Quick question about international sitemaps. Basically we have a mix of subfolders, subdomains, and ccTLDs for our different international/language sites. With this in mind, how do you recommend we set up the sitemaps? I'm thinking the best solution would be to move the subfolders and subdomains onto an index and put the ccTLD sitemaps on their own roots only: domain.ca/sitemap (this would only contain the Canada pages); domain.com, fr.domain.com, domain.com/eu/ (these would all have an index on domain.com/sitemap that points to each language/nation's sitemap). OR should each site have a sitemap under its own area: domain.com/sitemap, fr.domain.com/sitemap, domain.com/eu/sitemap, domain.ca/sitemap? I'm very new to international SEO. I know that our current structure probably isn't ideal... but it's what I've inherited. I just want to make sure I get a good foundation going here. So any tips are much appreciated!

    | blake.runyon
    0
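
    For reference, a sitemap index is just an XML file listing child sitemaps; a sketch of generating one (the hostnames are the question's own placeholders — note that a sitemap index referencing other hosts, like a separate ccTLD, generally requires those hosts to be verified for cross-site submission in Search Console):

    ```python
    SITEMAPS = [
        "https://domain.com/sitemap-en.xml",
        "https://fr.domain.com/sitemap.xml",
        "https://domain.com/eu/sitemap.xml",
    ]

    def sitemap_index(urls):
        # Build a minimal <sitemapindex> document per the sitemaps.org protocol.
        entries = "\n".join(
            f"  <sitemap><loc>{u}</loc></sitemap>" for u in urls
        )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</sitemapindex>"
        )

    xml = sitemap_index(SITEMAPS)
    ```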

  • To obtain an SEO benefit from an SSL certificate, is there any particular type or brand which is recommended or has a track history? It seems you can pay anything between $20 and $???? (for that matter, whatever you want to pay!). Any experience gratefully accepted! Thanks

    | seoman10
    0

  • Also suggesting a WBF topic. I've read and researched with no luck here... would love a Moz staff reply too! Is it better to blog repeatedly on the same topic (writing multiple blogs around the topic of "content marketing", for example, in the hope Google sees you as an authority on the topic over time), OR is this keyword cannibalization? Is it better to have one powerful and comprehensive page on a topic, if it makes sense? Thanks!

    | RickyShockley
    0

  • Hi Guys, I have a crawl rate drop in Webmaster Tools and can't figure out why. In the last month I removed a lot of duplicate pages that we don't need anymore; there were at least 1.5 million pages. Could this be the reason?

    | Silviu
    0

  • I have a client that does website contract work for about 50 governmental county websites.  The client has the ability to add a link back in the footer of each of these websites.  I am wanting my client to get backlink juice for a different key phrase from each of the 50 agencies (basically just my keyphrase with the different county name in it).  I also want a different landing page to rank for each term.  The 50 different landing pages would be a bit like location pages for local search.  Each one targets a different county.  However, I do not have a lot of unique content for each page.  Basically each page would follow the same format (but reference a different county name, and 10 different links from each county website). Is this a good SEO back link strategy?  Do I need more unique content for each landing page in order to prevent duplicate content flags?

    | shauna7084
    0

  • Hi guys, Just want to check on this site migration strategy. Basically we have an Australian based ecommerce site which is going to launch globally. The company has two sites: one is http://www.domainUS.com (for the US market) and one is Australian based (http://www.domain.com.au). Basically the plan is to have one single global .com site (like ASOS.com) on a new domain, which would be domain.com, and put both the current http://www.domainUS.com (US version) and http://www.domain.com.au (Australian version) on the new domain: domain.com (global). To make it even more complicated, the new global domain (domain.com) is in the process of being purchased (someone else has the domain) and won't be available till January 2016, but the company wants to execute the new global setup in November 2015, temporarily on the .com.au version. The current migration plan is to create two different sub-folders, one for the US, e.g. http://www.domain.com.au/us, and one for AU, http://www.domain.com.au/au, on the current Australian domain.com.au for the global launch in November 2015. Then once domain.com is ready in January 2016, migrate to domain.com with the countries as sub-folders (as shown in stage 3 of the screenshot). I was wondering if you guys think this would be an ideal migration strategy given the circumstances. Link to screenshot of current migration strategy: http://c714091.r91.cf2.rackcdn.com/4c2aae21dcbd548f27d96840227b81bc6b8b00c592.png Any advice would be very much appreciated! Cheers, Chris

    | jayoliverwright
    0

  • I've been looking at a local business in London and they have multi-sites, each with multiple versions of the same address in their NAP - they're using the correct addresses, with variations in terms of order of address elements (with some missing out London, and some including London) For example, one listing puts the postcode after the city district - another before. Sometimes London is included in the address, though often not (the postal service doesn't include London in their "official version" of the addresses). So the addresses are never wrong - it's just the elements in the address are mixed up a little, and some include London, and some do not. Should I be concerned about this lack of address consistency, or should I try to exact match the various versions?

    | McTaggart
    0

  • Hi guys, Does anyone have any experience of having (trying to rank) two separate blogs existing on one domain? For instance: www.companysite.com/service1/blog and www.companysite.com/service2/blog. These 2 sections (service 1 and service 2) offer completely different services (and rank for different keywords) - for example, a company that provides 2 separate services: an SEO service and an IT service. Do you think it is good/bad/confusing search engine practice to have separate blogs for each service, or do you think there should be only one blog that contains content for both services? Bearing in mind that there is an already existing subdomain for a non-profit part of the business that ranks for different keywords: non-profit.companysite.com, and it will potentially have another blog, so the URL would look like: non-profit.companysite.com/blog Any ideas would be appreciated! Thanks

    | kellys.marketing
    0

  • Hey Mozzers, I've been working at this for a while now, and I can't figure out why the rich snippet data is not getting pulled for our reviews and product rating. I've included a sample URL where we have reduced the schema.org markup: http://www.tripcentral.ca/vacations-packages_00_03_JN_gran-bahia-principe-coba.html Any thoughts? I was told not to list multiple reviews, so I took them out. But it's still not being picked up in the SERPs, and we would really like the star rating data to appear. Any useful advice would be appreciated!

    | tripcentral
    0

  • Howdy, I'm working on a larger ecommerce site that 302s you based on your location. With that in mind, should I canonicalize the final pages? domain.com => 302 => domain.com/us/, domain.com/fr/, etc... Should these all have a canonical pointing to the root domain.com?

    | blake.runyon
    0

  • I have over 1000 keywords to sort out, and I need a tool that picks up on variations and synonyms. Does anyone know of a very good tool that is available? If not, I'll do it the long way round with Excel 😞

    | seoman10
    0
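
    Short of a dedicated tool, simple normalisation catches many of the variations (plurals, word order) before Excel ever gets involved; a rough sketch (the stemming here is deliberately crude, and the keywords are invented):

    ```python
    from collections import defaultdict

    def normalise(keyword):
        # Crude key: lowercase, strip a trailing 's' per word so singular and
        # plural variants collapse, and sort the words so order is ignored.
        tokens = sorted(w.rstrip("s") for w in keyword.lower().split())
        return " ".join(tokens)

    def group_keywords(keywords):
        # Bucket keywords that normalise to the same key.
        groups = defaultdict(list)
        for kw in keywords:
            groups[normalise(kw)].append(kw)
        return dict(groups)

    groups = group_keywords([
        "advanced driving course",
        "advanced driving courses",
        "Driving Course Advanced",
    ])
    ```

    True synonyms ("cheap" vs "affordable") still need a thesaurus or a tool, but this kind of pass typically shrinks the manual workload considerably.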

  • Hello Mozzers. We historically had location specific landing pages on our eCommerce site, for example: site.co.uk/cleaning-equipment-london site.co.uk/cleaning-equipment-manchester These all had unique content (600 words approx) and ranked in the top 10 for many cities. I understand these would have been classed as doorway pages, so we got rid of them (301'd back to the category pages), and now our rankings for these terms have tanked. We also have specific branch pages, but we have kept these, like many other companies with multiple branches do. It feels like by doing a good thing and tidying everything up, we are actually making our site worse. Everything else seems to be in place: loads of new regular content, clean profile, mobile friendly, lots of citations etc. Any idea what could be going on here? Here's a link to our site - http://goo.gl/0yjSd8 Thanks Pete

    | PeteC12
    0

  • So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kinda format, but I only want to index the URLs that have a par1 of ABC (that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?

    | Ria_
    0
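
    The partial match described is straightforward as a regex, though note the URL Parameters tool itself takes exact values, not patterns; a sketch for checking which URLs would qualify (the pattern and URLs come from the question's own example format):

    ```python
    import re

    # Keep only URLs whose par1 value starts with "ABC" (ABC123, ABC456, ...).
    KEEP = re.compile(r"[?&]par1=ABC[^&]*")

    urls = [
        "http://www.example.com/page.php?par1=ABC123=&par2=DEF456=",
        "http://www.example.com/page.php?par1=ABC456=",
        "http://www.example.com/page.php?par1=XYZ999=&par2=DEF456=",
    ]
    kept = [u for u in urls if KEEP.search(u)]
    ```

    On the robots.txt side, rules cannot express a negative ("everything without par1=ABC"), but Google supports `*` wildcards and `Allow` rules, so a broad `Disallow` on page.php paired with an `Allow` for the par1=ABC pattern is the usual shape of that approach.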


