
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Folks, I have been checking how many pages our competitors have indexed in Google compared to our website, and I noticed that one of our main competitors has over 2 million indexed pages. I have figured out that it is because they have language/country specific pages for every page on their website. That being said, these pages contain all of the same content and the language doesn't actually change; it remains in English. Now my question is this: will this not in fact hurt their rankings, in terms of duplicate content? Or am I missing something here? The URLs essentially do something like www.competitor.com/fr/ for France, for example, but as I say the content is in English and duplicates their main website. Seems odd to me but would love your opinions on this. Thanks. Gaz

    | PurpleGriffon
    0

  • We are going through a website redesign that involves changing URLs for the pages on our site. Currently all our pages are in the format domain.com/example.html and we are moving to strip off the .html file extension so it would just be domain.com/example. We have thousands of pages as the site deals with news, so building a redirect for each individual page isn't really feasible. My plan is to have a generic rewrite rule that redirects any page that ends in .html to the stripped-off version of it (see the sketch below). A problem I can see with this is that it will also redirect pages that don't exist. So for example, domain.com/non-existant-page.html would 301 to domain.com/non-existant-page, which would then return a 404 status. What would the SEO repercussions be for this? Obviously if a page doesn't exist already then it shouldn't show up in the search engine indexes and shouldn't be a problem, but I'm a bit worried about how old pages that currently legitimately 404 will be treated when they start to 301 redirect to a 404 instead. Not sure if there are any other potential issues from this that I've missed either? Thanks!

    | sbb024
    0
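
    A generic rewrite rule like the one described might look like this in Apache .htaccess (a minimal sketch, assuming Apache with mod_rewrite; it fires whether or not the target exists, which is exactly the behavior being asked about):

        RewriteEngine On
        # 301 any URL ending in .html to the extensionless version,
        # but leave real directories alone
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.+)\.html$ /$1 [R=301,L]

    For what it's worth, when a 301 lands on a 404, search engines generally go by the final status of the chain, so the old URL ends up handled much like a direct 404.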

  • Hi all, we provide online services, and as part of this we provide our clients with a javascript embeddable 'widget' to place on their website. This is fairly popular (100s-1000s of inserts on websites). The main workings of this are javascript (it spits an html iframe onto the page), but we also include both a <noscript> portion (which is purely customer focused; it deep-links into a relevant page on our website for the user to follow) and also a plain <p><a href=''></a></p> at the bottom, under the JS (the overall structure is sketched below). This is all generated and inserted by the website owner. Therefore, after insertion we can dynamically update whatever the javascript renders out, but the <noscript> and <a> at the bottom are there forever. Previously, this last plain link has been used for optimisation, with it randomly selecting 1 out of a bank of 3 different link anchor texts when the widget html is first generated. We've also recently split our website into B2B and B2C portions, so this will be linking to a newer domain with much less established backlinks than the existing domain. I think we could get away with optimised keyword links on the old domain, but on the newer domain they will be more obvious. In light of recent G updates, we're afraid this may look spammy. We obviously want to utilise the link as best as possible, as it is used by hundreds of our clients, but don't want it to cause any issues. So my question: would you just focus on using brand name anchor text for this? Or could we mix it up with a few keyword optimised links also? If so, what sort of ratio would you suggest? Many thanks

    | benseb
    0
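
    The embed described might look roughly like this (a hypothetical sketch; the domain, widget ID, and anchor text are invented for illustration):

        <!-- JS renders the iframe; the noscript deep link and the plain
             link below it are frozen at generation time -->
        <script src="https://widgets.example-service.com/widget.js?id=abc123" async></script>
        <noscript>
          <a href="https://www.example-service.com/tools/abc123">View this tool on Example Service</a>
        </noscript>
        <p><a href="https://www.example-service.com/">Example Service</a></p>

    In the sketch the plain link uses brand-name anchor text, which is the conservative option the question is weighing.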

  • Hello, I have a website made with asp.net, ranking quite well for a number of competitive keywords, e.g. in Google's top 10 results for more than a dozen competitive keywords. Recently, in order to improve user experience, I am having it developed so it is fully responsive for all screen resolutions. Basically all the design elements / site text will remain the same, including color scheme / layout etc. outwardly, but internally this will change everything: all the css / page html (tables converted to divs) etc. Now my questions are: 1. Will this be considered by bots a complete site overhaul, and will ranking take a hit even if I stay with the current platform, i.e. asp.net? 2. While making the design responsive I can also develop a WordPress theme, which will make it easier to work with the website as the site does not require any programming. So if I also change the platform, like from MS IIS/asp to Apache/php, how will search engine bots take this? 3. If the above does in fact result in a ranking drop, how much time will it take for the rankings to get back to normal? Note that I use extensionless urls, so the urls will remain the same even if we convert from asp to php. Sorry for the long details, but the question has been bugging me for weeks.

    | hpk
    0

  • Greetings MOZ Community: In Google Analytics under "Behavior" > "Site Speed", our home page has a page speed score of 91, which I assume is pretty fast. However, the "Average Page Load Time" varies between 5 and 8 seconds, which seems very slow. My developers have made major efforts to optimize the home page URL (www.nyc-officespace-leader.com) for speed. The page has a carousel which I assume may be slowing it down. Is the download speed of this page detrimental to SEO? Or is the favorable Page Speed score good enough? I am particularly concerned because the most competitive phrases are ranked on the home page. As it stands I am having a lot of difficulty ranking in the top ten for these phrases. My concern is that the slow download speed of the home page could be holding back the ranking of these terms. If necessary I can always redesign the home page and remove the carousel, or reduce the number of listings in the carousel to speed it up. Is this worth investing effort in, or is the speed good enough? Thanks, Alan

    | Kingalan1
    0

  • On my site, there are two different category bases leading to the exact same page. My developer claims that this is a common, and natural, occurrence when using WordPress, and that there's no duplicate content issue to worry about. Is this true? Here's an example of the correct URL... and here's an example of the exact same content, but using a different URL. Notice that one is coming from the /topics base and the other from the /authors base. My understanding is that this is bad. Am I wrong?

    | JasonMOZ
    1
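
    If the duplication is real, the usual WordPress-side fix is a canonical tag on both versions pointing at the preferred base (a sketch with hypothetical paths; Yoast and similar plugins can emit this automatically):

        <!-- On both /topics/some-post/ and /authors/some-post/ -->
        <link rel="canonical" href="http://example.com/topics/some-post/" />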

  • It doesn't make sense to me that a 404 causes a loss in link juice, although that is what I've read. What if you have a page that is legitimate -- think of a merchant-oriented page where you sell an item for a given merchant -- and then the merchant closes his doors? It makes little sense 5 years later to still have their merchant page, so why would removing it from your site in any way hurt your site? I could redirect forever, but that makes little sense. What makes sense to me is keeping the page for a while with an explanation and options for 'similar' products, and then eventually putting in a 404. I would think the eventual dropping out of the index actually REDUCES the overall link juice (i.e. fewer pages), so there is no harm in using a 404 in this way. It is also a way to avoid the site just getting bigger and bigger and having more and more 'bad' user experiences over time. Am I looking at it wrong? PS: I've included this in 'link building' because it is related in a sense -- link 'paring'.

    | friendoffood
    0

  • I have an ecommerce site in English and a blog that is in the Malay language. We started the blog 3 weeks ago with about 20-30 articles written. The ecommerce site uses the Magento CMS and the blog is WordPress. URL structure: Ecommerce: www.example.com Blog: www.example.com/blog Blog category: www.example.com/blog/category/ However, Google is indexing all pages including the blog categories, but not the individual posts that are in the Malay language. What could be the issue here? PLEASE help me!

    | WayneRooney
    0

  • Hi, I want to block the Web Archive / Wayback Machine from indexing my site and creating a record of it in their database. Any ideas on how to do this? Cheers,
    Superpak

    | Mikey008
    2
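
    For reference, the Internet Archive's crawler has historically identified itself as ia_archiver and honored a robots.txt block like the one below. Treat this as a sketch rather than a guarantee, since archive.org's handling of robots.txt has varied over time:

        # robots.txt at the site root
        User-agent: ia_archiver
        Disallow: /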

  • We're migrating our site to https and I have the following question: we have some old URLs that we are 301ing to new ones. If we switch over to https then we will be forced to do a double redirect for these URLs. Will this have a negative SEO impact? If so, is there anything that we can do about it?

    | YairSpolter
    0
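
    The usual way to collapse the chain is to update the legacy rules so each old URL 301s straight to its final https destination, with the generic http-to-https hop only as a fallback. A minimal .htaccess sketch with hypothetical paths, assuming Apache with mod_rewrite:

        RewriteEngine On
        # Specific legacy URL goes directly to its final https home
        RewriteRule ^old-page$ https://www.example.com/new-page [R=301,L]
        # Everything else gets the generic http -> https hop
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

    Order matters here: the specific rule sits above the generic one so it wins, and each visitor sees a single 301.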

  • Hi all, a query has recently been raised internally with regard to the use of canonical links. Due to CMS limitations with a client whose CMS is managed by a third-party agency, canonical links are currently output with the port number attributed, e.g. example.com/page:80 ...as opposed to the correct absolute URL: example.com/page. Note that port numbers are not attributed to the actual page URLs. We have been advised that this canonical link functionality cannot be amended at present. My personal interpretation of canonical link requirements is that such a link should exactly match the absolute URL of the intended destination page; my query is whether this extends to the attribution of port numbers to URLs. Is the impact of the inclusion of such potentially incorrect URLs likely to be the same as purely incorrect canonical links? Thanks

    | 26ryan
    0
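
    For comparison, the safe form is an absolute URL with no explicit port, since port 80 is implicit for http (a sketch with a hypothetical page):

        <!-- Correct: absolute URL, no port attributed -->
        <link rel="canonical" href="http://example.com/page" />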

  • Hi guys! I currently have a dev version of my site (dev.website.com); once everything is done I will move the dev version to the public domain (website.com). But since it is a total duplicate of my real site, would it affect the SEO? To guard against that, I tried setting the reading privacy in WordPress so Google would not index it, but I'm afraid that when I take it live in the future and revert the setting back to normal, it would affect the site's SEO. Any opinion or suggestion on this?

    | andrewwatson92
    0
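
    A common pattern here, sketched under the assumption that the dev subdomain can serve its own robots.txt: block dev.website.com entirely while it's in development, then drop the block at launch. WordPress's 'discourage search engines' reading setting produces a similar effect:

        # robots.txt served only on dev.website.com
        User-agent: *
        Disallow: /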

  • Let's say B is a duplicate page of A (the main page). I understand I have to put a canonical tag on B pointing to A. Do I also put a canonical tag on the main page A? Is it necessary? I understand that A would then tell Google that it is the preferred version of itself? Is this a correct understanding?

    | andypatalak
    0
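
    A sketch of the usual arrangement, with hypothetical URLs; the self-referencing canonical on A is optional but widely recommended:

        <!-- On page B (the duplicate) -->
        <link rel="canonical" href="http://example.com/page-a" />

        <!-- On page A (the main page): self-referencing -->
        <link rel="canonical" href="http://example.com/page-a" />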

  • Howdy! I have a site whose pagination is being flagged as "duplicate content pages", even though the rel next/prev is in place and done correctly. Roger Bot and Google are showing duplicated content and duplicate page titles & metas respectively. The only thing I can think of is that we have a canonical pointing back at the URL you are on (e.g. /collections/all?page=15). We do not have a view-all option right now and would not feel comfortable recommending one, given the speed implications and the size of their catalog. Any experience or recommendations here? Something to be worried about?

    | paul-bold
    0
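
    For reference, a self-referencing canonical is the documented companion to rel prev/next, so the setup described should be compatible. On page 15 of the series (using the path from the question), the head would carry something like:

        <link rel="prev" href="/collections/all?page=14" />
        <link rel="next" href="/collections/all?page=16" />
        <link rel="canonical" href="/collections/all?page=15" />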

  • Hi guys, I have 2 questions about my website of coupon codes: 1. Should I redirect people from Google to my website? E.g. someone is looking for coupons for Sony or LG and arrives at the Sony brand page on my website, but when he clicks on an offer I send him to the Sony offers on Amazon. Is that correct? 2. Footer image links: I saw many sites that put their logos in the footer of online stores to get authority. Should I do that? Thank you so much.

    | pompero99
    0

  • Hi, the pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / "next" to the hrefs (as sketched below)? As of now the pagination links at the bottom of the page are just http://offonhols.com/default.aspx?dp=1
    http://offonhols.com/default.aspx?dp=2 
    http://offonhols.com/default.aspx?dp=3
    etc

    | offonhols
    0
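
    Using the URLs from the question, page 2 of the series would carry head tags like these (a sketch):

        <!-- In the <head> of http://offonhols.com/default.aspx?dp=2 -->
        <link rel="prev" href="http://offonhols.com/default.aspx?dp=1" />
        <link rel="next" href="http://offonhols.com/default.aspx?dp=3" />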

  • We are a large catalog company with thousands of products across 2 different domains. Google clearly knows that the sites are connected. Both domains are fairly well known brands - thousands of branded searches for each site per month. Roughly half of our products overlap - they appear on both sites. We have a known duplicate content issue - both sites having exactly the same product descriptions, and we are working on it. We've seen that when a product has different content on the 2 sites, frequently, both pages get to page 2 of the SERPs, but that's as far as it goes, despite aggressive white hat link building tactics. 1. Is it possible to get the same product pages on page 1 of the SERPs for both sites? (I think I know the answer...) 2. Should we be canonicalizing (is that a word?) products across the sites? This would get tricky - both sites have roughly the same domain authority, but in different niches. Certain products and keywords naturally rank better on 1 site or the other depending on the niche.

    | AMHC
    0

  • We are running an A/B split test (started on 12/12), and a few days after we started the test we fell from position 9/10 (about 3 weeks on page 1) to position 11/13, and we've been there ever since. We are still running the test. We disallowed the test page in robots.txt, but that's all: no canonical, no noindex, no Google Experiments code. Theoretically, Google could crawl the site and find the page, but then the page is disallowed. The alternate page is not indexed. Could this explain the rankings drop?

    | AMHC
    1
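
    The setup described amounts to a robots.txt rule along these lines (the test path is hypothetical):

        User-agent: *
        Disallow: /test-variant-page

    Worth noting as context: Google's published testing guidance generally prefers a canonical from the variant back to the original over a robots.txt block, since a disallowed URL can still be indexed from links pointing at it.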

  • I didn't find an answer on a search on this, so maybe someone here has faced this before. I am loading 20 images that are in the viewport and a bit below.  The next 80 images I want to 'lazy-load'.  They therefore are seen by the bot as a blank.gif file.  However, I would like to get some credit for them by giving a description in the alt tag.  Is that a no-no?  If not, do they all have to be the same alt description since the src name is the same?  I don't want to mess things up with Google by being too aggressive, but at the same time those are valid images once they are lazy loaded, so would like to get some credit for them. Thanks!  Ted

    | friendoffood
    0
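
    A common lazy-load pattern keeps each image's real URL and its own alt text in the markup, so the shared blank.gif src doesn't force a shared description (a sketch; the filenames are hypothetical, and the data-src swap is done by whatever lazy-load script is in use):

        <!-- Placeholder src is shared; data-src and alt stay unique per image -->
        <img src="/img/blank.gif"
             data-src="/img/red-guitar.jpg"
             alt="Red electric guitar with maple neck" />
        <noscript>
          <img src="/img/red-guitar.jpg" alt="Red electric guitar with maple neck" />
        </noscript>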

  • Dear SEO Expert, we run the website www.guitarmonk.com. Moz has flagged some errors at our website, especially duplicate content, among others. Whatever you suggest in addition should be good for the website for certain relatively important keywords. Regards

    | Guitarmonk
    0

  • Can anyone help? I am looking for an SEO.

    | Veebs
    0

  • We have been doing a good amount of competitive research lately and have noticed sites that change their TLD quite often to escape manual penalties / DMCA filings. An example evolution: brandterm.com -> brandterm.bz -> brandterm.me These competitors are able to quickly rank for money keywords in the top 3 soon after another domain switch. What we have noticed is that while it's obvious they received Google penalties, they continue to 301 redirect the old domains to the new ones. We have experienced first hand that penalties travel along domains with 301 redirects. Does anyone have an explanation of how these companies are able to quickly achieve a high volume of organic search traffic while 301 redirecting from burnt domains? The only option I see is to disavow all previous domains in GWT, to be able to employ 301 redirects without risking carrying over the penalty. Are there other theories people can think of? T

    | petersocapro
    0

  • Hello~ Does anyone have any positive traffic results to share since implementing this? Thanks! MS

    | MargaritaS
    0

  • I have recently set up a men's style blog. The site is made up of articles pulled in from a CMS, and I am wanting to keep the design as clean as possible, so no text other than the articles. This makes it hard to get an H1 tag onto the page. Are there any solutions/alternatives that would be good for SEO? The site is http://www.iamtheconnoisseur.com/ Thanks

    | SWD.Advertising
    0
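
    One low-impact option, sketched with a hypothetical class name: make the article's own headline the H1 and style it to match the clean design, so no extra visible text is needed:

        <!-- The CMS article title doubles as the page's H1 -->
        <h1 class="article-title">Article headline pulled from the CMS</h1>
        <style>
          .article-title { font: inherit; margin: 0; }
        </style>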

  • Hi, does anyone know of a good SEO company that will get results, i.e., fix site issues and get the site improving in the SERPs?

    | Taiger
    0

  • I've pulled out all the stops and so far this seems like a very technical issue with either Googlebot or our servers. I highly encourage and appreciate responses from those with knowledge of technical SEO/website problems. First some background info: three websites, http://www.americanmuscle.com, m.americanmuscle.com and http://www.extremeterrain.com, as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only. Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below. Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK When you click the cached version of these supposed pages, you see an americanmuscle page (some desktop, some mobile, none of which exist on extremeterrain.com): http://screencast.com/t/FkUgz8NGfFe All of these links give you a 404 when clicked... Many of these pages I've checked have been cached multiple times while still being a 404 link, so Googlebot has apparently re-crawled them many times and this is not a one-time fluke. The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but answers on different ports. services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me. The mobile americanmuscle website is set to only respond on a different port than services. and only responds to AM mobile sub-domains, not Googlebot or any other user-agent. Any ideas? As one could imagine this is not an ideal scenario for either website.

    | andrewv
    0
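
    Symptoms like this often trace back to a catch-all virtual host answering for hostnames it shouldn't, so a crawl request for one hostname gets another site's content. A hedged Apache sketch of the usual defense (the hostnames are from the question; the config itself is hypothetical and the real setup may differ):

        # Default catch-all vhost: refuse hostnames we don't explicitly serve
        <VirtualHost *:80>
            ServerName default.invalid
            Redirect 404 /
        </VirtualHost>

        <VirtualHost *:80>
            ServerName services.extremeterrain.com
            # ... only ExtremeTerrain services content here ...
        </VirtualHost>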

  • I have developed a few websites before where the homepage contains the content for the keywords I was targeting. This has been reasonably successful, as I have found it easy enough to get links to the homepage. I am considering a new site in a totally different industry that I am thinking about structuring like this: mybrand.com (not necessarily targeting any keywords), mybrand.com/important-keyword-1/ (definitely want to target), mybrand.com/important-keyword-2 (equally important as the 1st keyword). There will be several (30-ish) other pages targeting keywords, but they are not as significant as the two mentioned above; they are more about publishing informative content. The two important keywords are quite different but industry-related. My questions are: should I be careful targeting keywords away from the homepage when the homepage gets the most links? Would I be better off building 2 different websites where the keyword content is captured in the homepage? Thanks,

    | BGu
    0

  • I have a website for desktop that does a lot of things, and I have converted part of it to show pages in a mobile-friendly format based on the user's device. Not responsive design, but actually different code with different formatting for mobile vs desktop, where each version still shares the same page URL. Google allows this approach. The mobile-friendly part of the site is not as extensive as desktop, so there are pages that apply to the desktop but not to mobile. The functionality is therefore somewhat limited for mobile devices, and some pages should only be indexed for desktop users. How should such a page be handled for Google's crawlers? If it is given a 404 Not Found for their mobile bot, will Google still properly crawl it for the desktop, or will Google see that the URL was flagged as 'not found' and not crawl it for the desktop? I asked a similar question yesterday, but it was not stated clearly. Thanks, Ted

    | friendoffood
    0
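
    One piece of this that is well documented: for dynamic serving (same URL, different HTML by user agent), Google asks that the server send the Vary header so its crawlers know to fetch the page as both device types. On Apache with mod_headers that is roughly:

        # Tell caches and crawlers that the response differs by user agent
        Header append Vary User-Agent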

  • We're rebuilding our entire website in AngularJS. We've got it rendering fine in WMT, but does that mean that its content is detectable? I've looked into prerender.io, and that seems like a great solution to the problem of not serving any static HTML, but is it really necessary? I'm looking into this as I'm currently having this argument with my devs, and they're all certain that Google renders AngularJS fine.

    | localdirectories
    0
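
    For context on how prerender-style setups worked at the time: under Google's AJAX crawling scheme, a page without hashbang URLs adds the fragment meta tag below, and middleware then serves crawlers a pre-rendered snapshot via the ?_escaped_fragment_= URL. A sketch, and only relevant if testing shows Googlebot isn't rendering the app adequately on its own:

        <!-- Signals crawlers (under the AJAX crawling scheme) to request
             ?_escaped_fragment_= and receive a pre-rendered snapshot -->
        <meta name="fragment" content="!">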

  • I'm using a WordPress site with the WordPress SEO plugin by Yoast. Why won't Google display the right page title for my CATEGORY pages? This is an example category page I'm having a problem with: http://bit.ly/1DReQPP 
    In the source code of this category page you can see the title is:
    <title>Yoga Übungen - Mit visuellen Guides, Videos & viel Inspiration</title> Now when I check the title in the SERPs it only gives me the category name 'Yoga Übungen'. See screenshot here: http://awesomescreenshot.com/05942buz60 
    This happens with ALL the category pages on my site. Google uses the category name instead of the title provided in the source code. I found an article from Yoast dealing with this issue: https://yoast.com/google-page-title/ It's correct that Google sometimes chooses a different page title, but this article doesn't address the category-only problem. For 'normal' pages or posts, Google always shows the title which I've set up in Yoast (and which is in the source code). I don't understand what goes wrong for category pages. Do any of you have a similar problem or experience with this?

    | soralsokal
    0

  • We have the strangest problem. The blog for our website ranks very poorly: www.lifeionizers.com/blog = average position in SERPs = 200. The site itself has an average position in SERPs of 12. The blog has a few terms it ranks #1 for such as branded terms and: is mineral water alkaline = 1.3 kangen water vs alkaline water = 2.6 kangen water pyramid = 1.2 ph of redbull = 1.1 (Used by Google as answer in knowledge graph) But the blog ranks terribly for most search terms. This blog has about 440 pages of in-depth, well-written authoritative content. Readers are well engaged, the blog has a bounce rate of ~3.5% with average time on page of over 6 minutes. The problem can't be the quality of the content. Does Google levy penalties against specific subdirectories? Or is this a configuration problem? Bad links have been disavowed.

    | karasd
    0

  • One of my clients has about 40 franchisees who are going to have their own sites on a WordPress multisite installation. The question they're wondering: should these be on subdirectories (thesite.com/this-franchise) or on subdomains (this-franchise.thesite.com)? Which is best for SEO? Thanks!

    | ideasandpixels
    0

  • I know there has been some mention of .uk.com on Moz Q&A, but not for at least 3 years. So I wanted to see if any Mozzers out there know whether having a .uk.com domain would hinder our SEO long-term. Our company is finally now taking SEO seriously and we're planning some great stuff for the year ahead, but I have a feeling that our .uk.com domain may prevent us from out-ranking some of the bigger companies out there. Does anyone have any thoughts on this? Thanks 🙂

    | JamesPearce
    0

  • I have a large website with rental listings in 14 markets; listings are added and taken off weekly if not daily. There are hundreds of listings in each market, and all have their own landing page with a few pages associated. What is the best process here? I could run one sitemap and give each market's landing page a .8 priority in the sitemap, or make 14 sitemaps, one per market, and then have one sitemap for the general and static pages (see the sketch below). From there, what would be the better way to structure it? Should I keep all the big main landing pages in the general static sitemap, or have them be at the top of the market-segmented sitemaps? Also, I have over 5,000 URLs; what is the best way to generate a sitemap over 500 URLs? Is it necessary?

    | Dom441
    0
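
    The split-by-market approach usually means a sitemap index file pointing at the per-market sitemaps, which search engines fetch as one logical set (a sketch with hypothetical filenames; the sitemaps.org format allows up to 50,000 URLs per individual sitemap file):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>https://example.com/sitemap-static.xml</loc></sitemap>
          <sitemap><loc>https://example.com/sitemap-market-boston.xml</loc></sitemap>
          <sitemap><loc>https://example.com/sitemap-market-denver.xml</loc></sitemap>
        </sitemapindex>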

  • I'm working on a site that has some non-productive pages without much upside potential, but which are linked-to externally. The site also has some productive pages, light on external links, in a somewhat related topic. What do you think of 301ing the non-productive pages that have links to the productive pages that lack them, in order to give the latter more external link love? Would it make much of a difference? Thanks... Darcy

    | 94501
    0

  • Hi, I have an online shop with categories such as: Trousers, Shirts, Shoes, etc. But now I'm having a problem with further development.
    I'd like to introduce brand pages. In this case I would create new categories for Brand 1, Brand 2, etc. The text on category and brand pages would be unique, but there will be an overlap in products. How do I deal with this from a duplicate content perspective? I appreciate your suggestions. Best, Robin

    | soralsokal
    0

  • Hi, I've heard silos mentioned in the past as helping with rankings. Does this still apply? And what about breadcrumbs: do I use them with the silo technique, or instead of it? Which do you think is better, or should I not be using either of these anymore with the recent Google updates?

    | juun
    0

  • Hello Moz'ers, I know organic searches go up and down and there is no way to control that. When should I be worried about search results, i.e. the site being de-listed or some other SEO problem? (screenshot attached)

    | ryanparrish
    0

  • I want to put a blog on my site. The IT department is asking that I use a subdomain (myblog.mysite.com) instead of a subfolder (mysite.com/myblog). I am worried because it was my understanding that any links I get to my blog posts (if on a subdomain) will not count toward the main site (search engines would view it almost as another website). The main purpose of this blog is to attract backlinks, which is why I prefer the subfolder location for the blog. Can anyone tell me if I am thinking about this right? Another solution I am being offered is to use a reverse proxy (sketched below). Thoughts? Thank you for your time.

    | ecerbone
    0
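
    The reverse-proxy option keeps the blog's public URLs under the subfolder even though it is hosted separately. A minimal Apache sketch using the hostnames from the question, assuming mod_proxy and mod_proxy_http are enabled:

        # Public mysite.com/myblog/ is quietly served by the blog host
        ProxyPass        /myblog/ http://myblog.mysite.com/
        ProxyPassReverse /myblog/ http://myblog.mysite.com/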

  • Hi, we're new to SEO and trying to fix our domain canonical issue. A while back we were misusing the link canonical tag, such that Google was tracking params (e.g. session ids, tagging) all as different unique URLs. This created a nightmare, as Google now thinks there are millions of pages associated with our domain when the reality is a couple thousand unique links. Since then, we've tried to fix this by: 1) specifying params to ignore via Webmaster Tools, and 2) properly using the canonical tag. However, I'm still seeing a bunch of outstanding search results that resulted from this mess. Any idea when we could expect to see this cleaned up? I'm also noticing that Google is treating http://domain.com and https://domain.com as 2 different pages, even though we specify to only look at "http://domain.com" via the link canonical tag. Again, is this just a matter of waiting for Google to update its results? We submitted a sitemap, but it seems like it's taking forever for the results for our site to clear up... Any help or insight would be greatly appreciated!

    | sfgmedia
    0

  • When I put in site:www.qjamba.com on a mobile device, it comes back with some of my mobile-friendly pages for that site (same URL for mobile and desktop, just different formatting), and that's great. HOWEVER, it also shows a whole bunch of the pages (not identified by Google as mobile-friendly) that are fine for desktop users but are not supposed to exist for mobile users, because they are too slow. Until a few days ago those pages were being redirected for mobile users to the home page. I have since changed that to 404 Not Founds. Do we know that Google keeps a mobile index separate from the desktop index? If so, I would think that the 404 should work. How can I test whether the 404 Not Founds will remove a URL so it DOESN'T appear on a mobile device when I put in site:www.qjamba.com (or a user searches), but DOES appear on a desktop for the same command?

    | friendoffood
    0

  • Google's removal tool doesn't give a person the option to tell them which index, mobile-friendly or desktop/laptop, the URL should be removed from. Why? I may have a fundamental misunderstanding. The way I thought it works is that when you have a dynamically generated page based on the user agent (i.e. the SAME URL but different formatting for smartphones than for desktop/laptop), the Google mobile bot will index the mobile-friendly version and the desktop bot will index the desktop version, so Google will have 2 different indexed results for the same URL. That SEEMS to be validated by the existence of the words 'mobile-friendly' next to some of my mobile-friendly page descriptions on mobile devices. HOWEVER, if that's how it works, why would Google not allow a person to remove one of the URLs and keep the other? Is it because Google thinks a mobile version of a website must have all of the identical pages as the desktop version? What if it doesn't? What if a website is designed so that some of the slower pages simply aren't given a mobile version? Is it possible that Google doesn't really save results for a mobile-friendly page if there is a corresponding desktop page, but only checks to see if it renders OK? That is, it keeps only one indexed copy of each URL, and basically assumes the mobile title and actual content are the same and only the formatting is different? That assumption isn't always true, since mobile devices lend themselves to different interactions with the user, but it certainly could save Google billions of dollars in storage. Thoughts?

    | friendoffood
    0

  • Hello Moz Community, I had a conversation with someone who claimed that implementing a DMCA protection badge, such as those offered at http://www.dmca.com/ for $10/mo, will improve a site's Google rankings.  Is this true? I know that if my content is stolen it can hurt my rankings (or the stolen content can replace mine), but I'm asking if merely implementing the badge will help my rankings. Thanks! Bill

    | Bill_at_Common_Form
    0

  • People can come to a site www.domain.com in these 6 different ways: http://www.domain.com, www.domain.com, http://domain.com, domain.com, https://www.domain.com, https://domain.com. Obviously we don't want Google to maintain an index for any more than one of these. What is the way to handle this? 301 redirects for all to resolve to www.domain.com (see the sketch below)? Or is that overkill? Or 302 redirects? Seems like a pretty basic issue, but I'm not finding simple answers.

    | friendoffood
    0
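
    The standard answer is a single 301 from every variant to one canonical origin; 302s would not consolidate the index. A minimal .htaccess sketch, assuming Apache with mod_rewrite and https://www.domain.com as the chosen canonical form:

        RewriteEngine On
        # Anything that is not already https://www.domain.com gets one 301
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
        RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]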

  • Out of curiosity, do any Mozzers use a monthly spreadsheet style SEO strategy that is set on a daily basis like this: Day 1 - purchase/write 3 articles
    Day 2 - comment on 5 blogs
    Day 3 - upload article 1 
    Day 4 - directory submissions
    Day 5 - blog promotion
    Day 6 - etc..... If so, do you find this to be the most effective way of working, with this rigid structure?

    | fertilefrog
    0

  • This search   -  site:www.qjamba.com/online-savings/automotix gives me this result from Google: Automotix online coupons and shopping - Qjamba
    https://www.qjamba.com/online-savings/automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products. and Google tells me there is another one, which is 'very similar'.  When I click to see it I get: Automotix online coupons and shopping - Qjamba
    https://www.qjamba.com/online-savings/Automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products. This is because I recently changed my program to redirect all URLs with uppercase in them to lower case, as it appears that all lowercase is strongly recommended. I assume that having 2 indexed URLs for the same content dilutes link juice. Can I safely remove all of my uppercase indexed pages from Google without it affecting the indexing of the lower case URLs? And if so, what is the best way? There are thousands.

    | friendoffood
    0
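
    For reference, the lowercase redirect described is typically done with Apache's internal tolower map; note that RewriteMap must live in the server or vhost config, not .htaccess (a sketch):

        # httpd.conf / vhost config
        RewriteMap lc int:tolower
        RewriteEngine On
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule ^(.*)$ ${lc:$1} [R=301,L]

    With the 301s in place, the uppercase URLs should generally drop out of the index on their own as they are recrawled, without bulk removal requests.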

  • Hello, I have a very large website that has a good amount of "Duplicate Content" issues according to Moz. In reality, though, it is not a problem with duplicate content, but rather a problem with URLs. For example: http://acme.com/product/features and http://acme.com/Product/Features both land on the same page, but Moz is seeing them as separate pages, and therefore assuming they are duplicates. We have recently implemented a solution to automatically de-capitalize all characters in the URL, so when you type acme.com/Products, the URL will automatically change to acme.com/products, but Moz continues to flag multiple "Duplicate Content" issues. I noticed that many of the links on the website still have the uppercase letters in the URL, even though when clicked, the URL changes to all lower case. Could this be causing the issue? What is the best way to remove the "Duplicate Content" issues that are not actually duplicate content?

    | Scratch_MM
    0

  • A third party canonicalizes to our content, and we've recently needed to re-direct that content to a new URL. The third party is going to take some time updating their canonicals, and I am wondering if search engines will still recognize the canonical even though there is a re-direct in place?

    | nicole.healthline
    0

  • Hi, we run a travel site with a number of programs, and each program has its own dedicated page, i.e. example.com/programs/program-xyz. Some of these programs stop running and we no longer offer them; other times they are on hold and will be reactivated later. Our old strategy was to 301 redirect these programs to another, relevant program. However, I believe that could be flawed. Would it not be a better solution to display the page as normal (with a 200 code) and, instead of having the details of the program, show some text saying the program has stopped and list a few suggestions? I just don't want to set off any spam flags by pushing SE value via a 301 redirect to unrelated pages. Here are some other scenarios I was thinking of: for programs that are only temporarily on hold (i.e. not taking bookings for now), 302 redirect those to more appropriate pages; for programs that are permanently retired (i.e. will never take bookings again), show a custom 404 or 410 page (with text suggesting different programs). Any suggestions or feedback on this would be most appreciated. -Jason

    | Clickmetrics
    0
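
    If the permanent case goes the 410 route, Apache can serve it per path with mod_alias, plus a custom error page carrying the suggested-programs text (a sketch using the hypothetical program path from the question):

        # Permanently retired program: serve 410 Gone with a helpful page
        Redirect gone /programs/program-xyz
        ErrorDocument 410 /errors/program-retired.html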

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



