
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Happy Friday everyone! I just noticed that one of our attorney profile URLs is wrong. We used to have someone named "Dana Fortugno" as our Family Law attorney, but when he left (over two years ago) we hired "Scott Finelli." The person who set up the site just changed the information on the page, not the URL. So instead of saying "http://www.kempruge.com/scott-finelli-jd-llm/" it says "http://www.kempruge.com/dana-fortugno-jd-llm/." I'm considering taking all the content on the page with the wrong URL, copying it to a new page with the correct URL, and 301 redirecting (what would now be a blank page) to the new page with the correct URL. Is this the best way to handle this? Also, I don't believe there are many SEO concerns regarding these pages specifically. The profile pages aren't what we rank for in any of our Family Law related keywords. I am worried that a completely blank page that just 301 redirects might look bad to Google, but I'm not sure if it would. As always, thank you for your time and any assistance you can provide. Ruben

    | KempRugeLawGroup
    0
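
    A minimal sketch of the redirect Ruben describes, assuming an Apache server with mod_alias available; the paths come from the question, but the file placement is an assumption:

        # .htaccess at the site root - permanently redirect the old profile URL to the new one
        Redirect 301 /dana-fortugno-jd-llm/ http://www.kempruge.com/scott-finelli-jd-llm/

    With a rule like this in place there is no blank page left behind for Google to crawl; requests for the old URL answer with the 301 before any page renders.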

  • Hi guys, I am about to send a reconsideration letter and am still finalizing my disavow file. The format of the disavow entries is domain:badlink.com (stripping down to the root domain), but what about toxic links that live on Tumblr, such as badlink.tumblr.com? The issue is that we also have good Tumblr links, so I don't want to add tumblr.com itself. Do you guys think I will have issues submitting badlink.tumblr.com rather than tumblr.com? Thank you!

    | Ideas-Money-Art
    0
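
    For reference, the disavow file format accepts both whole-domain lines and single URLs, so a specific Tumblr subdomain can be listed without disavowing tumblr.com itself (the domains below are placeholders):

        # lines starting with # are treated as comments in a disavow file
        domain:badlink.com
        domain:badlink.tumblr.com
        # or list individual URLs instead of a whole subdomain
        http://badlink.tumblr.com/some-spammy-post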

  • Hi,
    We seem to have duplicated rel=author tags (x 3) on WordPress pages, as we are using Yoast WordPress SEO plugin which adds a rel=author tag into the head of the page and Fancier Author Box plugin which seems to add a further two rel=author tags toward the bottom of the page. I checked the settings for Fancier Author Box and there doesn't seem to be the option to turn rel=author tags off; we need to keep this plugin enabled as we want the two tab functionality of the author bio and latest posts. All three rel=author tags seem to be correctly formatted and Google Structured Data Testing Tool shows that all authorship rel=author markup is correct; is there any issue with having these duplicated rel=author tags on the WordPress pages?
    I tried searching the Q&A but couldn't find anything similar enough to what I'm asking above. Many thanks in advance and kind regards.

    | jeffwhitfield
    0

  • Hi there, Recently I modified the structure of my product page to load the images into an iframe instead of using the img tag directly. The reason is that I wanted product videos (YouTube) to be shown in the same iframe. My question is: if the attributes of the images are correctly set, from an SEO perspective, do you see any problem with that approach? I know Googlebot wasn't very good at crawling iframes in the past. Thanks a lot. Best regards.

    | footd
    0

  • So let's say I am working for bloggingplatform.com, and people can create free sites through my tools, and those sites show up as myblog.bloggingplatform.com. However, that site can also be accessed from myblog.com. Is there a way, separate from editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting it index myblog.com, without inserting any code into the page load? This is a simplification of a problem I am running across. Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending subdomain holders (which we do), I am looking for a way to stop Google from indexing those domains at all (they are used for technical purposes, and not for users to find the sites). Thoughts?

    | SL_SEM
    1
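
    One approach that avoids touching the page output at all is an X-Robots-Tag response header set at the web-server level for the technical subdomains. A hedged sketch for Apache (it assumes mod_headers is enabled and that the subdomain has its own virtual host; the hostname is from the simplified example in the question):

        <VirtualHost *:80>
            ServerName myblog.bloggingplatform.com
            # every response from this host tells crawlers not to index it
            Header set X-Robots-Tag "noindex, nofollow"
        </VirtualHost>

    The same header can usually be added in nginx or at a load balancer, which keeps the rule out of the hosted sites' code entirely.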

  • Right now I use a background image and CSS to tie the h1 tag to my logo on each page. However, I am concerned that may not be best practice. Plus, I am interested in using schema markup on my logo. So, my question is, if I use an image with alt text inside my h1 tag, will the alt text carry as much weight as a text-based h1?

    | Avalara
    0
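
    For context, the two patterns being weighed look roughly like this; the markup is illustrative, and the schema.org Organization/logo itemprops are one common way to mark up a logo:

        <!-- Option A: image with alt text inside the h1 -->
        <h1>
          <a href="/"><img src="/images/logo.png" alt="Example Company - Tax Automation Software"></a>
        </h1>

        <!-- Option B: text h1 for the page topic, logo marked up separately -->
        <h1>Sales Tax Automation Software</h1>
        <div itemscope itemtype="http://schema.org/Organization">
          <a itemprop="url" href="http://www.example.com/">
            <img itemprop="logo" src="/images/logo.png" alt="Example Company logo">
          </a>
        </div>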

  • A website that we maintain keeps ranking for the keyword 'homeless shelter'. The company is UTILIS USA and they produce heavy-duty shelters for military personnel. They have nothing to do with homeless shelters but continue to receive traffic for the phrase.

    | ReviveMedia
    0

  • Hi, we're making our site into a static site but I would like to transfer the Google juice. Most of the links and database exist on subfolders though. Could I simply do 301 redirects on the subfolders and retain the value or does it have to be on the full domain?

    | Therealmattyd
    0
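
    Redirects work at the path level, not just the domain level, so subfolder rules are enough to pass the value of the old URLs. A hedged Apache sketch, with folder names invented for illustration:

        # .htaccess - map an old subfolder onto its new static location, preserving the rest of the path
        RedirectMatch 301 ^/old-section/(.*)$ http://www.example.com/new-section/$1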

  • A year ago an SEO specialist evaluated my WordPress site and said she had seen lower rankings for WordPress sites in general. We moved our site off any CMS and designed it in HTML5. Our blog, however, is still on WordPress. I'm thinking about moving to the Ghost platform because I only need a blog. The drawbacks are one author, no recent post lists, and no meta tags. Is it worth it to move the site off WordPress? Will it affect my rankings much if I have great content? Does anyone have experience with or opinions on Ghost?

    | RoxBrock
    0

  • Thanks in advance for any help. It seems our target keyword "digital marketing agency" is not being picked up by Yoast or the on-page grader as appearing in our home page content. "Exact Keyword Used in Document Text at Least Once" comes back as a zero. When we run our website through Google Webmaster Tools' fetch tool, the keyword shows up fine in the content. Any ideas how we can fix this?

    | DerekDenholm
    0

  • Hi, Our bandwidth provider has informed us of a forthcoming network outage this weekend, the net result being that around 20 websites will be unreachable for a total of about 1 hour inside a predetermined 3 hour window. In an ideal world we would like to provide a holding page or be able to respond with a "503 Service Temporarily Unavailable" HTTP code, however the complete absence of connectivity means we'll be unable to do this. Does anyone have any ideas about the SEO implications of this kind of downtime? It would be useful to know if there are any actions we can take prior to the outage that could mitigate the impact. We've considered repointing the DNS to other servers, but it's only something we'll do if the negative impact of not doing it is too great. Thanks in advance!

    | Dave392
    0

  • I have a client that has a lot of high-ranking content embedded as SlideShare files. The Google results come through as SlideShare rankings. Should I strip the content out of SlideShare and get it into the native site? Any information on the pros and cons of SlideShare content would be appreciated.

    | bakergraphix_yahoo.com
    0

  • I'm having an odd error I'm trying to diagnose. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps we have 763 submitted but only 134 indexed. The submitted and indexed counts were virtually the same, around 750, until 15 days ago when the indexed count dipped dramatically. Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and I ran Screaming Frog on the site and got similar results: few duplicates. Our actual content should be around 950 pages counting all the category pages. What's going on here?

    | K-WINTER
    0

  • We recently started working on a website and most of the work done so far is resolving on-site technical issues (duplicate content, duplicate titles, broken links, page speed, grammar, etc.). Everything done has been positive, but their position in the SERPs has actually gone down. I'm having a look to see if anything I have done could have had a negative effect. However, when visiting their website on a mobile device, it automatically redirects you to iTunes so that you can download their app. My first instinct is that this is a horrendous idea, as it would result in a massive bounce rate which would be impossible to track. I have tried convincing them to do it differently, but this is how they want it. However, when Googlebot visits the website as a mobile device, it returns an error. I'm fairly sure that this would have a negative effect on search results, but I could do with a second opinion.

    | maxweb
    0
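
    As an aside, one alternative sometimes suggested instead of a forced redirect is Apple's Smart App Banner meta tag, which advertises the app to iOS visitors while leaving the page itself reachable by users and by Googlebot (the app ID below is a placeholder):

        <!-- shows an unobtrusive App Store banner in iOS Safari rather than redirecting the visitor -->
        <meta name="apple-itunes-app" content="app-id=123456789">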

  • Hi all, Is it that detrimental to SEO if you link to the CMS page ID of a URL rather than the text URL of a page, even if, when you look at the source code, Google sees it as a text URL? Thanks! 🙂

    | Diana.varbanescu
    0

  • I've got a job board that pulls in feeds from various job sites and recruiters. Rather than losing visitors exiting my site, I thought I was being clever by opening all external links in modal windows, so the user can 'click to close' and effectively stay on my site. I'm now starting to think this may potentially have been a bad move for SEO. Is there any evidence to suggest this? Simon

    | simmo235
    0

  • I am working on a site which is currently being redesigned. The home page currently ranks highly for relevant search terms, although on the new site the content on this page will be removed. The solution I was considering, to preserve rankings, was to move the content on the home page to a new url, and use a 301 redirect to help preserve rankings for that particular page. The question I have therefore, is am I able to add new content to the home page, and have this page freshly indexed accordingly? Any thoughts or suggestions would be most welcome. Thanks, Matt.

    | MatthewA
    0

  • I have pages with duplicate page content showing in my Crawl Diagnostics. Can you tell me how to solve this, or suggest some helpful tools? Thanks

    | nomyhot
    0

  • Hello team, Need to bounce a question off the group. We have a site that uses the .NET AJAX Toolkit to toggle tabs on a page. Each tab has content and the content is drawn on page load. In other words, the content is not from an AJAX call; it is there from the start. The content sits in DIV tags which the JavaScript toggles - that's all. My customer hired an "SEO Expert" who is telling them that this content is invisible to search engines. I strongly disagree and we're trying to come to a conclusion. I understand that content rendered asynchronously via an AJAX call would not be spidered; however, just using the AJAX (JavaScript) to switch tabs will not affect the spiders' ability to find the content in the markup. Any thoughts?

    | ChrisInColorado
    0
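
    A stripped-down version of the pattern being described: both panels are in the HTML served on page load, and the script only toggles which one is visible, so the text is present in the markup a crawler downloads (class and id names are illustrative):

        <ul class="tab-headers">
          <li data-tab="panel-overview">Overview</li>
          <li data-tab="panel-specs">Specifications</li>
        </ul>
        <!-- content below is in the initial source; JavaScript only switches display styles -->
        <div id="panel-overview" class="tab-panel">Full overview copy lives here in the HTML...</div>
        <div id="panel-specs" class="tab-panel" style="display:none">Specification copy is also here...</div>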

  • Hello, For an upcoming campaign we noticed the search volume for the word photobook is much higher than for photobooks. Our page is currently ranking quite strongly for photobooks, but not for photobook.
    What would be the impact if we change the URL, name and keywords to photobook?
    And if it would make sense to change, what would the correct steps be to do so?
    Thank you very much for your thoughts on this matter!

    | ETonnard
    0

  • Hello Moz Community! Our company has its own blog (www.awarablogs.com) - the blog was created some time ago by means of a simple blog engine. Now we see that the structure of the blog is bad for SEO (it has long URLs, many useless folders, subdomains and so on), so we'd like to simplify it. But the engine doesn't allow us to change its structure in the way we'd like. Our webmaster suggested that we use "Alias". Will this method really help us make our blog SEO-friendly? Or is it better to choose other blog software like WordPress? Thank you very much!

    | Awaraman
    0

  • I want to create a video gallery on my site. Does anyone know of a video CMS or WordPress video plugin optimized for SEO?

    | Felip3
    0

  • http://www.reebok.com/en-US/reebokonehome/ This is the homepage for an instructor network micro-site on Reebok.com. The robots.txt file was excluding the /en-US/ directory; we've since removed that exclusion and resubmitted this URL for indexing via Google Webmaster Tools, but we are still not seeing it in the index. Any advice would be very helpful - we may be missing some blocking issue, or perhaps we just need to wait longer?

    | PatrickDugan
    0
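
    Worth re-checking the live file to confirm the directory really is unblocked now; the relevant part of robots.txt should look something like the sketch below rather than carrying the old Disallow (other rules in the real file will differ):

        # http://www.reebok.com/robots.txt - illustrative excerpt
        User-agent: *
        # the previous blocking rule would have read: Disallow: /en-US/
        Allow: /en-US/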

  • Hi People, My client has a site www.activeadventures.com. They provide adventure tours of New Zealand, South America and the Himalayas. These destinations are split into 3 folders on the site (e.g. activeadventures.com/new-zealand, activeadventures.com/south-america, etc.). The actual root folder of the site is generic information for all of the destinations, whilst the destination-specific folders are specific in their information for the destination in question. The Problem: If you search for, say, "Active New Zealand" or "Adventure Tours South America", the result that comes up is the activeadventures.com homepage rather than the destination folder homepage (e.g. we would want activeadventures.com/new-zealand to be the landing page for people searching for "active new zealand"). Are there any ways to influence Google as to which page on our site it chooses to serve up? Many thanks in advance. Conrad

    | activenz
    0

  • Hi Moz Community. I'm in need of some URL structure advice for product pages. We currently have ~4,000+ products and I'm trying to determine whether I need a new URL structure from the previous site owners. There are two current product URL structures on our website: 1. http://www.example.com/bracelets/gold-bracelets/1-1-10-ct-diamond-tw-slip-on-bangle-14k-pink-gold-gh-i1-i2/ (old URL structure)
    2. http://www.example.com/gemstone-bracelet-prd-bcy-121189/ (new URL structure) The problem is that half of our products are still in the old structure (no one moved them forward), but at the same time I'm not sure the new structure is optimized as much as possible. Every single gemstone bracelet, or whatever product, will have the same URL structure, only being unique in the product number at the end. Would it be better to change everything over to more product-specific URLs, i.e. example.com/topaz-gemstone-dangle-bracelet? Thanks for your help!
    -Reed

    | IceIcebaby
    0

  • Hi Mozzers, After adding 3 new pages to example.com, when generating the XML sitemap, I wasn't able to locate those 3 new URLs. This is the first time this has happened. I have checked the meta tags of these pages and they are fine. No meta robots set up! Any thoughts or ideas why this is happening? How can I fix it? Thanks!

    | Ideas-Money-Art
    0
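
    If the generator keeps skipping them, the three URLs can also be added to the sitemap by hand; a single entry inside the <urlset> element looks like this (the URL and date are placeholders):

        <url>
          <loc>http://example.com/new-page-1/</loc>
          <lastmod>2014-05-01</lastmod>
        </url>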

  • Hi, Just checked a diagnostic report in Moz and we are getting duplicate page content for http://domain.co.uk and http://www.domain.co.uk (screenshot was attached). Does anyone know how to fix this in Magento? Thanks

    | tidybooks
    0
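
    The standard fix is a host-level 301 so that only one hostname answers with a 200. A hedged .htaccess sketch for Apache (Magento also has an auto-redirect-to-base-URL option in its web configuration that achieves the same thing):

        # send non-www requests to the www hostname with a 301
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^domain\.co\.uk$ [NC]
        RewriteRule ^(.*)$ http://www.domain.co.uk/$1 [R=301,L]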

  • Hi: I recently learned about inline styles and that Google started penalizing sites for that in October. Then I was told that Wix and Flash don't work (or don't work well) for SEO either, as the engines won't crawl them (I think). Does anyone know of a blog that goes over everything that doesn't work, so that I could recognize it when I look at someone's code? Anyone know of such a resource? Cheers, Wes

    | wrconard
    0

  • Hey all, I am looking to get the opinions of the community to help settle a discussion / debate. We are looking at how a site is laid out and which is the preferred method. There are two options: www.site.com --> /category-page --> /product-page (With this option, you always have the domain name and then the page, no matter where in the site you actually are and how many clicks it took you to get there). Your URL to the end page here would be www.site.com/product-page. www.site.com --> /category-page --> /category-page/product-page (With this option, you move into a defined structure). Your URL to the end page here would be www.site.com/category-page/product-page. If you have a moment, I would be interested to know your views on which you would consider to be the preferred method and why. Thanks, Andy

    | Andy.Drinkwater
    0

  • Hi This subdomain has about 4'000 URLs indexed in Google, although it's blocked via robots.txt: https://www.google.com/search?safe=off&q=site%3Awww1.swisscom.ch&oq=site%3Awww1.swisscom.ch This has been the case for almost a year now, and it does not look like Google tends to respect the blocking in http://www1.swisscom.ch/robots.txt Any clues why this is or what I could do to resolve it? Thanks!

    | zeepartner
    0
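
    Part of the explanation is that robots.txt only stops crawling, not indexing: URLs that are blocked but linked from elsewhere can still show up as bare entries. Getting them removed generally means letting Googlebot crawl them and serving a noindex instead. A hedged sketch of the two pieces (the server directive assumes Apache with mod_headers):

        # robots.txt on www1.swisscom.ch - stop blocking so the noindex can be seen
        User-agent: *
        Allow: /

        # web-server config for the www1 host - every response carries a noindex header
        Header set X-Robots-Tag "noindex"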

  • Hello Moz Community. I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem: some links were formed in a wrong way (a loop was making links on the fly). This error was identified and fixed back then, but before it was fixed Google got to index lots of those malformed pages. Recently we see in our Webmaster account that some of these links still appear as 404s, but we currently don't have that issue or any internal link pointing to any of those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error, so we are wondering what this means. Could it be that they are still in Google's cache or memory? We are not really sure. If anyone has an idea of what these errors showing up now mean, we would really appreciate the help. Thanks.

    | revimedia
    1

  • Hi Mozzers, I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag to the .co.uk version? Rewriting titles?

    | KarlBantleman
    0
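
    A canonical to the .co.uk would keep the new .com pages out of the index, so for same-language sites aimed at different countries the more common pattern is hreflang annotations on both versions (the URLs below are invented for illustration):

        <!-- placed in the <head> of both the .co.uk and .com versions of each page -->
        <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/some-page/" />
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/some-page/" />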

  • I was wondering whether serving a different user experience when JS is disabled counts as a sort of cloaking.

    | John_Smith_
    0

  • So whilst searching for link opportunities, I found a website that has scraped content from one of our websites. The website looks pretty low quality and doesn't link back. What would be the recommended course of action?
    1. Email them and ask for a link back. I've got a feeling this might not be the best idea. The website does not have much authority (yet) and a link might look a bit dodgy considering the duplicate content.
    2. Ask them to remove the content. It is duplicate content and could hurt our website.
    3. Do nothing. I don't think our website will get penalised for it since it was here first and is on the better quality website.
    4. Possibly report them to Google for scraping?
    What do you guys think?

    | maxweb
    0

  • Hi, Last December the company I work for and another company merged. The website of company A was taken offline and the home page was 302 redirected to a page on website B. This page had information about the merger and the consequences for customers. The deeper pages of website A were 301 redirected to similar pages on website B. After a while, the traffic from the redirected home page decreased and we thought it was time to change the 302 into a 301 redirect to the home page, because there are still a lot of links to the home page of website A and we wanted to preserve the link juice. Two weeks ago we changed the 302 redirect from website A into a 301 redirect to the home page of website B. Last week the Google Webmaster Tools account of website B showed the links from the 301 redirected website A. The total number of links doubled, and the top anchor text is the name of company A instead of company B. This, of course, could trigger an alarm at Google, because we got a lot of new links with a different anchor text - a tactic used by spammers/black-hats. I am a bit worried that our change will be penalized by Google. But our change is legit. It is to the advantage of our customers to find us if they search for the name of company A or click on a link to website A. We haven't submitted a Change of Address for domain A in Google Webmaster Tools yet. Is it a good idea to submit a Change of Address from domain A to domain B? Are there other precautions we can take?

    | NN-online
    0

  • I hired an SEO specialist last December from India. I was paying him every month and had the feeling he was not doing his job. He was assuring me he was a white-hat SEO, but in practice he was not doing anything good for my business:
    - Writing low-quality content
    - Getting constantly refused for article and press release posts
    - Flooding my social media with followers who have nothing to do with my business
    - Always needing to be pushed to do his job. He assured me I would start to see some results after 3 months, but nothing. He stated that he sent about 10 press releases and articles, but those are nowhere to be found. His defense is always "give it a couple of months." I searched Google every month for results with my name and was only seeing some small, unimportant results. Anyway, I didn't make my payment and I blocked his access to all my sites and social media, and here is what he wrote me: "if I don't see $500 payment in my account by 6th May, 2014 believe me you will face consequences. You would face set back in your business, if I know White Hat very well, then I know BLACK HAT ALSO FAR MUCH BETTER. IT WOULD ONLY TAKE ME TO SETUP FEW THINGS ON THE SOFTWARES AND I WILL LINK EVERY ODD WEBSITE TO YOUR WEBSITE" This guy is going against his own claim of being a white-hat SEO. My questions are: are we really at the mercy of this kind of threat? Can anyone take my website down? What recourse do I have? I have his name, business information, photo...

    | groupemedia
    0

  • Hi, I'd love to use schema markup, but I don't think there is a type for the kind of products we sell (mortgages). Does anyone know of a way of using schema markup for a page like this: https://www.turnkeymortgages.co.uk/mortgage/mortgage-rates/ ? The regulator for mortgages (FCA) in the UK has rules about how products are displayed: we have to show all the information in the table (see the link) and the compliance text (your home will be repossessed...) with every product. This is why I haven't tried to 'cobble' something together from the existing markup available - I can't fall foul of the regulator (or be misleading to customers). Anyone have any thoughts on this? Or should I just wait until Schema.org comes up with markup suitable for such products?

    | CommT
    0

  • Hi, Recently we got some editorial links in articles from a few online journals, and I've noticed the anchor links all have similar properties within the <a> tag. What do the skimlinks-unlinked and data-skimwords-word attributes mean? Are these normal organic links, and are they valuable?

    | LauraHT
    0

  • This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client who has a couple of subdirectories, "gallery" and "blog". Neither directory gets a lot of traffic or really turns into many conversions, so I want to remove the pages so they don't drain my PageRank from more important pages. Does this sound like a good idea? I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting, or deleting them. Can you help me determine which is best?
    DEINDEX: As I understand it, the noindex meta tag is going to allow the robots to still crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to be passed through. This seems like a bad thing to me because I don't want to waste my link juice passing to these pages. The idea is to keep my PageRank from being diluted on these pages. A similar question: if PageRank is finite, does Google still treat these pages as part of the site even if it's not indexing them? If I do deindex these pages, I think there are quite a few internal links to them. Even though these pages are deindexed, they still exist, so it's not as if the site would return a 404, right?
    ROBOTS.TXT: As I understand it, this will keep the robots from crawling the page, so it won't be indexed and the link juice won't pass. I don't want to waste the PageRank of the links pointing to these pages, so is this a bad option?
    301 REDIRECT: What if I just 301 redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure if it's permanent, but even more importantly, currently 80% of the site is made up of blog and gallery pages and I think it would be strange to have the vast majority of the site 301 redirecting to the home page. What do you think?
    DELETE PAGES: Maybe I could just delete all the pages. This will keep the pages from taking link juice and will deindex them, but I think there are quite a few internal links to these pages. How would you find all the internal links that point to these pages? There are hundreds of them.

    | Santaur
    0

  • The page in question receives a lot of quality traffic but is only relevant to a small percentage of my users. I want to keep the link juice received from this page, but I do not want it to appear in the SERPs.

    | surveygizmo
    0
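
    The usual tool for this is a robots meta tag of noindex,follow: the page is dropped from the results but can still be crawled, so the links on it keep passing value. A minimal sketch:

        <!-- in the <head> of the page that should stay out of the SERPs -->
        <meta name="robots" content="noindex, follow">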

  • While searching for "Blog writing service reviews", I found that a web page that's not even optimized for the query is ranking within the top 15 search results. Upon checking the source code, I found that the webpage has been optimized for "product reviews services". Plus, the website is only 11 months old, has a 7-digit Alexa rank and has PR 1. Why would Google rank such a page in the top 15?

    | suskanchan
    0

  • Hi, Our ecommerce site's link penalty was revoked by Google back on Feb 26th 2013, but to this day we have not seen any improvement in our rankings. Due to an 80% revenue loss we had to lay off quite a few people to stay alive. The situation now is more dire than ever for our company. We have millions of dollars invested in our business and Google just busted it for some "low quality" or "spammy links", as they call it. We want to try to move to a different domain and do a 301 from the old domain to make sure our previous customers can still find us, as a last effort to stay alive. But in doing so we do not want the bad link juice to flow to our new domain. Can we do a 301 with nofollow, and will that have any negative impact, or any impact at all? Any suggestion is greatly appreciated. Thank you, Nick. We are planning on moving to a different domain after 10 years, and laying off a bunch of people due to loss of revenue.

    | orion68
    0

  • I have the URL from my former company parked on top of my existing URL. My top keywords are showing up with the old URL attached to the meta description of my existing URL. It was supposed to be 301 redirected instead of parked, but my web developer insists this was the right way to do it and that it will work itself out after Google indexes the old URL out of existence. Are there any other options?

    | Joelabarre
    0

  • I am in the sign/banner business. For years I have had a Flash-based web application that I developed which allows customers to design their own signs/banners online. With the demise of Flash I am prompted to begin developing an HTML5-based application to take its place. Developing this software is a rather expensive endeavor, so many local sign shops, which don't sell on the web, don't bother to develop such an application - but what if I gave it to them? I assume a fair number would find great value in such an application, thereby allowing their clients to communicate a design idea without having to drive to the storefront. The application would actually run embedded on my site, thus earning me a link back to my site. The question is this: is this a bad idea? If dozens of sign shops are running my application embedded on their sites, will that help or hurt me? Thanks.

    | RocketBanner
    0

  • Hello boys and girls. I'm auditing my link profile and came across links pointing to my sites from a banned site (site:domain.com gives back no results), yet the links are nofollow. Should I try to remove them all? I know nofollow should be enough, yet my links sit in a bad neighborhood. What would you recommend?

    | Tit
    0

  • Good day, I am wondering if anybody here has done something like this before. I have a page on one of my sites that contains a number of different - but related - free resources. The resources can be sorted in different ways once the user is on the page. Now I am starting an outreach campaign, and want to be able to send out custom URLs (which pretty much means they have different query strings after them, like '?id=123') so that when a person clicks on the link to the page it brings up the stuff they are more likely to be interested in at the top. I expect - hope - that some of these people will put links back to this page as a result. Now all the links may be slightly different, but they will come to the same page and the content will look slightly different. I will make sure to have the rel=canonical tag in place. Does anybody know if this would be in violation of Google's Terms and Conditions? I can't see how, but I wanted to see what the experts here on Moz think before moving forward. Thanks in advance.

    | rayvensoft
    0
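
    For reference, the canonical tag mentioned in the question would sit on the resources page and point at the clean URL no matter which query string a visitor (or a crawler following an outreach link) arrived with; the URL and parameter below are placeholders:

        <!-- served on /free-resources/ however it is reached, e.g. /free-resources/?id=123 -->
        <link rel="canonical" href="http://www.example.com/free-resources/" />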

  • Hi, We currently have an eCommerce store that only sells in one country. We are going to open separate stores that will target different countries. As far as product descriptions go, my initial plan was to use the same descriptions but then block the product pages from being indexed. However, this has also ended up blocking us from being able to use Google's product listing ads (it rejects the data feed due to not being able to index the products). Is there a way to copy the descriptions and avoid duplicate content issues without blocking our product pages from being indexed? Thanks for your help!

    | pgicom
    0

  • I have a few small sites and landing pages on WordPress that I want to load a lot quicker than they do. It occurred to me that if there is not a lot of content management necessary, I should simply make the static web pages straight HTML instead of trying all the modifications necessary to get some WordPress sites and themes to load quicker. I have noticed the HTML sites I have load lightning fast on a slow hosting service. Is this a good idea? Can anyone think of drawbacks to it? Security? Responsiveness? SEO? And what about making a company's site straight HTML so the home page loads quickly, and then using WordPress for the blog?

    | phogan
    0

  • Hi all, I have a Twitter account. I have some data I receive from a third party. When new data arrives (these are football lineups) I want to post it to my Twitter. The data arrives approximately 60-30 minutes before matches start. So if I use Twitterfeed I can get the data checked every 30 minutes - but this can result in tweets with data for matches that have already started. How can I post my data automatically... and almost instantly upon reception? Not sure if this issue is for the Moz forum, but I am pretty certain that there are some good developers out there who could help me? -Rasmus

    | rasmusbang
    0
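
    This is more of a development question than an SEO one, but the usual pattern is to push the tweet from the same code that receives the feed instead of polling on a schedule. A hedged Python sketch using the tweepy library; the function, credentials and field names are invented for illustration:

        import tweepy

        # credentials created in the Twitter developer console - placeholders
        auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
        auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
        api = tweepy.API(auth)

        def post_lineup(match_name, players):
            """Tweet a lineup the moment the third-party feed delivers it."""
            text = "%s lineup: %s" % (match_name, ", ".join(players))
            api.update_status(text[:140])  # stay inside the 140-character limit

        # call this from whatever handler receives the feed, e.g.:
        # post_lineup("Arsenal v Chelsea", ["Szczesny", "Sagna", "Mertesacker"])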

  • Hey Moz community, Thanks for taking time to answer my question. I'm working directly with a hospital that has several locations across the country. They've copied the same content over to each of their websites. Could I point the search engines back to a single location (URL) using the rel=canonical tag? In addition, does the rel=canonical tag affect the search engine rankings of the URLs (about 13 of them) that use it? If I'm on track, is there an ideal URL (location) to designate as having the original content? This is actually the first time I've ever needed to use rel=canonical (if applicable). Thanks so much. Cole

    | ColeLusby
    0
