
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi everyone, I'll try to explain a situation that is happening to me (I'm writing the site names without links, for explanation purposes). Site 1: Adventurerooms. Site 2: Adventureroomsmallorca. Site 3: Adventureroomsmadrid (the new site). At first there were only Adventurerooms and Adventureroomsmallorca: Adventurerooms carried the information for Madrid and its home page also linked out to the Mallorca site. As part of a new strategy we created Adventureroomsmadrid for Madrid and kept Adventurerooms as the Spain-wide site (with links to Adventureroomsmadrid and Adventureroomsmallorca), and we redirected the Madrid content on Adventurerooms to Adventureroomsmadrid with 301 redirects. For the last three months we have been working on Adventureroomsmadrid, writing blog content and improving it (Adventureroomsmadrid is now around DA 15 in Moz, perhaps even more, and Adventurerooms is DA 10). Surprisingly, Adventurerooms keeps improving in its search rankings, even though we took content away from it and are not actively working on it. Adventureroomsmadrid is also improving, but not as much as Adventurerooms (I know it is a new site, only three months old), yet Adventurerooms gets better results with less content and only a DA of 10. I hope I've explained the case clearly despite my English. So the question is: "Is it possible to improve a site's rankings by working only on another site?" Thanks in advance

    | webtematica
    0

  • Hello experts, when I search Google for a keyword such as "abcd", one website's result shows a few extra links below its meta description (image attached). Can you please tell me what these are and how to achieve this type of link? Thanks!

    | wright335
    0

  • Hey all, I'm looking to move a site from non-www to www and was wondering if anyone knows of a list of things to check and update after making the switch in WordPress (e.g. updating the preferred domain in GSC). Anyone ever done this before who can lend some advice? Thanks! Dan

    | danielreyes
    0
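    For the non-www to www question above, the redirect itself is usually a short .htaccess rule; a minimal sketch, assuming an Apache host and using example.com as a placeholder domain (WordPress's own Site Address and WordPress Address settings, the GSC preferred domain, and the XML sitemap URL should be updated to the www version as well):

        # .htaccess: permanently redirect every non-www request to the www host
        # (placeholder domain; adjust to the real site)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]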

  • We upgraded our site to a new platform the first week of August. The product listing pages have a canonical issue: page 2 of the paginated series has a canonical pointing to page 1 of the series. Google lists this as a "mistake" and we're planning on implementing best practice (https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html); we want to implement rel=next,prev. The URLs are constructed using a hashtag and a string of query parameters. You'll notice that these parameters are &parameter:value rather than &parameter=value: /products#facet:&productBeginIndex:0&orderBy:&pageView:grid&minPrice:&maxPrice:&pageSize:& None of these URLs are included in the indexed URLs, because the canonical is the page URL without the AJAX parameters, so those results are expected. Screaming Frog only finds the product links on page 1 and doesn't move to page 2; the link to page 2 is AJAX, and as far as I know Screaming Frog only crawls AJAX if it follows Google's deprecated AJAX crawling recommendations. The "facet" parameter is noted in Search Console, but the example URLs are for an unrelated URL that uses the "?facet=" format. None of the other parameters have been added by Google to the console, although other unrelated parameters from the new site are in the console. When using the Fetch as Google tool, Google ignores everything after the "#" and shows only the main URL (I tested whether it was just pulling the canonical of the page for the test, but that was not the case). None of the "#facet" strings appear in the Moz crawl. I don't think Google is reading "productBeginIndex" to identify the start of page 2 and so on. One thought is to add the parameter in Search Console, remove the canonical, and test one category to see how Google treats the pages. Making the URLs SEO-friendly (/page2.../page3) is a heavy lift. Any ideas how to diagnose/solve this issue?

    | Jason.Capshaw
    0
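    On the rel=next/prev point above, the markup itself is just a pair of link elements in the head of each paginated URL; a minimal sketch with placeholder URLs (note that anything after a "#" is never sent to the server, which is consistent with Fetch as Google ignoring it, so these annotations generally need to live on crawlable, parameter- or path-based URLs):

        <!-- In the <head> of page 2 of a paginated series (placeholder URLs) -->
        <link rel="prev" href="https://www.example.com/products?page=1">
        <link rel="next" href="https://www.example.com/products?page=3">
        <!-- the canonical should point at page 2 itself, not back at page 1 -->
        <link rel="canonical" href="https://www.example.com/products?page=2">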

  • I am looking at moving my site from HTTP to full HTTPS, so I will 301-redirect any HTTP requests to their HTTPS counterparts. All my pages in the Google index are HTTP, so will that 301 redirect reduce the value of those pages? Cheers

    | SEOhmygod
    0
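    For the HTTP-to-HTTPS question above, a minimal .htaccess sketch, assuming an Apache server; a site-wide 301 like this is the standard way to carry existing ranking signals over to the HTTPS URLs:

        # .htaccess: permanently redirect every HTTP request to its HTTPS counterpart
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]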

  • Hi Mozzers, I have a site www.xyz.com and also geo-targeted subdomains www.uk.xyz.com, www.india.xyz.com and so on. All the subdomains have the same content as the main domain, www.xyz.com. So I want to know how I can avoid content duplication. Many thanks!

    | HiteshBharucha
    0
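    For the geo-subdomain question above, hreflang annotations are the usual way to tell Google that near-duplicate pages target different regions; a minimal sketch using the subdomains from the post (the full set of alternates, including a self-reference, goes in the head of the equivalent page on every version, and the region codes here are assumptions):

        <!-- In the <head> of each equivalent page on every subdomain -->
        <link rel="alternate" hreflang="en-gb" href="http://www.uk.xyz.com/" />
        <link rel="alternate" hreflang="en-in" href="http://www.india.xyz.com/" />
        <link rel="alternate" hreflang="x-default" href="http://www.xyz.com/" />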

  • Hi all, Last year a website I monitor got hacked, or infected with malware, I'm not sure which. The result I got to see is hundreds of 'not found' entries in Google Search Console / Crawl Errors for non-existent pages relating to (or variations of) 'Canada Goose'. There are also a couple of such links showing up in SERPs. Here's an example of the page URLs: ourdomain.com/canadagoose.php ourdomain.com/replicacanadagoose.php I looked for advice on the webmaster forums and was recommended to just keep marking them as 'fixed' in the console; sooner or later they'd disappear. Still, a year later, they appear. I've just signed up for a Moz trial and, in Open Site Explorer -> Top Pages, the top 2-5 pages relate to these non-existent pages: URLs that are the result of this 'Canada Goose' spam attack. The non-existent pages each have around 10 linking root domains, with around 50 inbound links. My question is: is there a more direct action I should take here? For example, informing Google of the offending domains with these backlinks. Any thoughts appreciated! Many thanks

    | macthing
    1
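    One direct option for the spam-URL question above: since the pages never existed, serving a 410 Gone (instead of repeatedly marking 404s as fixed) signals that the URLs are permanently dead, and the spammy referring domains can additionally be listed in a disavow file. A minimal .htaccess sketch, assuming Apache and using the example paths from the post:

        # .htaccess: return 410 Gone for the spam URLs created by the hack
        Redirect gone /canadagoose.php
        Redirect gone /replicacanadagoose.php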

  • Hi all, Do you think it's possible to have duplicate content issues because we serve a single image from 5 different URLs? In the HTML code of the pages, only one URL is used. Is that enough to keep Google from seeing the other URLs, or not? For example, in this article: http://www.parismatch.com/People/Kim-Kardashian-sa-securite-n-a-pas-de-prix-1092112 the same image is available at: http://cdn-parismatch.ladmedia.fr/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg http://resize-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg http://resize1-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg http://resize2-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg http://resize3-parismatch.ladmedia.fr/img/var/news/storage/images/paris-match/people/kim-kardashian-sa-securite-n-a-pas-de-prix-1092112/15629236-1-fre-FR/Kim-Kardashian-sa-securite-n-a-pas-de-prix.jpg Thank you very much for your help. Julien

    | Julien.Ferras
    0

  • I've decided to write my own sitemap because, frankly, the automated ones pull in all kinds of URLs from I don't know where. So to get around that, manual it is. But I have some products that appear in various categories: should I still list every product under each category in the sitemap, regardless of some being duplicates, or should I choose the most relevant category and list them there? I do have a canonical URL extension which should resolve any duplicate content I have.

    | moon-boots
    0
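    On the hand-written sitemap question above: a sitemap lists pages rather than navigation paths, so each product generally belongs in it once, at its canonical URL, no matter how many categories link to it. A minimal sketch with placeholder URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <!-- one entry per product, using its canonical URL only -->
          <url>
            <loc>https://www.example.com/category-a/product-1</loc>
          </url>
          <url>
            <loc>https://www.example.com/category-b/product-2</loc>
          </url>
        </urlset>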

  • I have a div on my website with around 500 words of unique content in it. When the page is first visited, the div automatically has a fixed height of 100px, showing a couple of hundred words and fading out to white, with a "show more" button which, when clicked, increases the height to show the full content. My question is: does Google crawl the content in that div when it renders the page, or disregard it? It's all in the source code. Or worse, do they consider this cloaking or hidden content? It is only there to make the site more usable for customers, so I don't want to get penalised for it. Cheers

    | SEOhmygod
    0

  • We have a client who, before us, had a website that was blacklisted by Google. After we created their new website, we submitted an appeal through Google's Webmaster Tools, and it was approved. One year later, they are still unable to rank for anything on Google. The keyword we are attempting to rank the home page for is "Day in the Life Legal Videos", which shouldn't be too difficult to rank for after a year, but their website cannot be found. What else can we do to repair this previously blacklisted website after we've already been approved by Google? After doing a link audit, we found only one link with a spam score of 7, but I highly doubt that is what is causing this website to no longer appear on Google. Here is the website in question: https://www.verdictvideos.com/

    | rodneywarner
    0

  • Hello experts, 1) Do website engagement rates impact organic rankings? What does "website engagement rate" mean? Is it how long a visitor stays on my page? Can anyone guide me on what we can include in website engagement, specifically for ecommerce sites? Can I consider 1) reviews, 2) videos, 3) good images and informative descriptions, 4) highlighting relevant blog posts on category pages, 5) email subscriptions, etc. as website engagement? Thanks!

    | wright335
    1

  • So I have a seriously large amount of duplicate content problems on my Opencart site, and I've been trying to figure out the best way to fix them one by one. But is there a common, easy way of doing this? Because frankly, it is a nightmare otherwise. I bought an extension which doesn't appear to work (http://www.opencart.com/index.php?route=extension/extension/info&extension_id=20468&utm_source=ordercomplete&utm_medium=email&utm_campaign=wm), so now I'm at a loss.

    | moon-boots
    0

  • A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?

    | mothner
    0
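    For the HTML sitemap question above, the tag being discussed is a one-line robots meta element in the head of each sitemap page; noindex keeps the page itself out of the index while the followed links still expose the inventory URLs to crawlers:

        <!-- On each HTML sitemap page: keep the page out of the index, but let its links be crawled -->
        <meta name="robots" content="noindex, follow">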

  • Hi, I wondered what the view is on content below the fold? We have the H1, product listings and then some written content under the products - will Google just ignore this? I can't hide it under a tab or put a lot of content above the products, so I'm not sure what the other option is? Thank you

    | BeckyKey
    0

  • I am trying to get more keyword ideas for one of my projects. For example, the seed keyword is "computer virus", and the results I get are keywords related to that phrase, such as "virus in computer" and "virus threats", but what I'm actually after is search data on actual threat names or types of viruses; I expect output such as "malware", "trojan", etc. (I'm currently using the AdWords Keyword Planner to fetch keyword data.) Is there any way to achieve this? Even if I use "types of computer viruses" as the seed keyword, I don't get the types of viruses people are searching for; instead I get keyword ideas such as "computer viruses", "computer threats", etc. Can somebody suggest a solution?

    | NortonSupportSEO
    0

  • Hi, Our website has many broken links/non-existent pages, but there might be many backlinks pointing to those non-existent pages. We want to find all the backlinks pointing to our non-existent (404) pages. All the tools I have tried so far, including Moz OSE, list only backlinks for current, existing pages. Please guide me on how to find the backlinks we are looking for. Thanks, Moz member

    | vtmoz
    0

  • This has been driving me slowly mad for ages: a site that consistently outranks my client's site despite my client's site apparently being more SEO-friendly on every level. Google UK search term: "chimney sweeps london". My client's site: http://apexchimneysweeps.co.uk The other site: http://www.firkinschimneysweeps.co.uk Can anyone shed any light on how this is happening? I really hope I'm being an idiot and missing something obvious! Comparisons (A = Apex, F = Firkin): Domain age: A 2000 / F 2013. Domain Authority: A 16 / F 9. Page Authority: A 29 / F 22. I would say the Apex page content is better and is certainly updated more often.

    | abisti2
    0

  • Hey there, my website is all good, but in the Search Console some bad queries (search terms) are showing up that are totally unrelated to the website. I want to make sure: is that harmful for my website traffic, as well as my keyword rankings? How should I stop them from being crawled? Can anyone help with this query? I have attached a screenshot; please check and help out: http://prntscr.com/cmusoq Thanks in advance.

    | poojaverify06
    0

  • So one of the big issues facing my website is that Moz seems to be picking up all of the "Search" and "Tag" pages, which is causing duplicate content. I cannot see any use for Google to index these pages, so is it better to create a No-Follow rule specific to Search and Tag?

    | moon-boots
    0

  • We have a domain solely used for print advertising that does a 301 redirect to a landing page (a department home page) on our "real" domain that is indexed on Google.  Example:  www.bmwrepairs.com redirects to www.repairshop.com/bmwrepairs. Is there a way to do a 301 redirect so that when they get redirected, the URL in the browser address bar remains www.bmwrepairs.com?

    | Jazee
    1
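    On the address-bar question above: a 301 by definition tells the browser to fetch and display the destination URL, so after the redirect the bar will always show www.repairshop.com. Keeping www.bmwrepairs.com visible would mean actually serving the landing page's content from that domain, for example via a reverse proxy. A hedged .htaccess sketch, assuming Apache with mod_proxy enabled (and note that exposing the same content on two domains then raises its own canonical/duplicate-content questions):

        # Serve the landing page's content under the vanity domain instead of redirecting
        # (requires mod_rewrite plus mod_proxy/mod_proxy_http; domains taken from the question)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?bmwrepairs\.com$ [NC]
        RewriteRule ^(.*)$ http://www.repairshop.com/bmwrepairs/$1 [P,L]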

  • Hi Moz, We are currently doing SEO for a hand therapy company called the Hand Therapy Group. They rank well; however, one competitor, Sydney Hand Therapy, is ranking higher than them for the term "hand therapy Sydney" (one of our highly focused keywords) with three different URLs (their home page, contact page and about page), despite the latter two pages having no backlinks. I understand why Google might see their homepage as more relevant, because their name is Sydney Hand Therapy (even though the Hand Therapy Group has more backlinks), but why do the other two URLs rank so well? Any help/info/advice would be brilliant! Cheers!

    | wearehappymedia
    1

  • Most SEO suggestions (great quality content, long-form content, engagement rate/time on page, authoritative inbound links) apply to content-oriented sites. But what should you do if you are an aggregator or a resource directory? Your aim is to send the user on to the other site they are looking for as quickly as possible, or to provide rankings of the resources. In fact, at a very basic level you are competing with search engines for traffic, because they are doing the same thing. You may have built a hand-crafted, human-curated resource that is better than what the algorithms show, and your site is likely to have far more outgoing links than content. You know you are better (or getting better) because repeat visitors keep coming back. So in these days of search engines, what can a resource directory or aggregator site do to rank? Even directories need first-time visitors before those visitors can start coming back.

    | Maayboli
    0

  • I have implemented rel="next" and rel="prev" across our site, but Google Search Console is picking the pages up as duplicates, and individual pages show up in search results too. Here is an example: http://www.empowher.com/mental-health/content/sizeismweightism-how-cope-it-and-how-it-affects-mental-health and http://www.empowher.com/mental-health/content/sizeismweightism-how-cope-it-and-how-it-affects-mental-health?page=0,3 The second link shows up as a duplicate. What can I do to fix this issue?

    | akih
    0

  • Hello, I have a question regarding SSL certificates that I think I know the answer to, but wanted to make sure. One of our clients' sites uses HTTP for its pages, but when they started creating registration forms they created a full duplicate site on HTTPS (so now there are two versions of all of the pages). I know that, due to duplicate content concerns, this could be an issue and needs to be resolved (along with weighing the pros and cons of both), but since they are already set up with HTTPS, does it make sense to just move everything there, or in some instances would it pay to keep some pages on HTTP (using canonical tags, redirects, htaccess, etc.)? Most of the information I found relates to making the decision before having both, or describes the migration process, but I couldn't find anything that specifically covers the case where both are already present. I thought the best approach, because everything is already set up, would be to just move everything over to the more secure version, but I was curious whether anybody had any insight? Thank you in advance.

    | Ben-R
    0

  • Hi, We have gone through a change of company brand name, including a new domain name.
    We followed the Google recommendations at https://support.google.com/webmasters/answer/83106?hl=en and it seems to have worked really well; the new domain has replaced the old one in the Google search results. My question: most of our backlinks (both anchor text and links) still use the old brand name and domain, and it's a slow process trying to update all the references. Although they redirect fine to the new domain (also following Google's recommendations), I wonder if the current scenario is doing any harm SEO-wise (other than the missed visual exposure of the new brand name)? Since the old brand name is not present on the new site, I'm thinking of including "New brand name - previously old brand name" somewhere, just to provide some sort of connection to all the old backlinks; would that be unnecessary? I should mention that the old brand name actually includes our most important keyword but the new brand name does not. Thanks!

    | Agguk
    0

  • For example: suppose you have a post "The Best Games to Play for YouTube Gamers in 2016" and you want to make this a yearly series. Should you 301 the 2016 version to the new 2017 one? Should you use the canonical attribute? If 2016 isn't in the URL, should you make the 2017 one the new URL?

    | Edward_Sturm
    0

  • Working on a website for a business with distinct lines of business: one is more B2C and one is B2B, yet the type of service is related. As an example, say it's for a photographer who does weddings but also does real estate photography. He wants to make sure he can market to each audience separately, so when they go to his homepage the content is oriented to the services that audience is looking for. If you use two separate websites, they have to be totally unique to avoid duplicate content flags, and you also end up diluting each website's domain authority, since you are spreading your inbound links between two different websites. So would this be the optimum strategy? One website hosted on bozophotography.com; a second domain, bozoweddings.com, that 301-redirects to the wedding section home page on bozophotography.com; and a third domain, bozorealestatephotos.com, that 301-redirects to the real estate section home page. Then, on certain advertising, business cards, etc., the business could choose which domain to publicize to ensure the audience lands on a home page related to that line of business. I suppose you could publicize it as a subdomain, like realestate.bozophotography.com, or as a path, like bozophotography.com/weddings, but those seem much less professional, visually, than just having bozoweddings.com. There is a rumor that you don't quite get 100% of the link juice through a 301, but the main domain would be used the majority of the time, so I really see no downside?

    | Jazee
    0

  • Hi Moz fellows, I'm new to WooCommerce and couldn't find help on Google about certain SEO-related things. All my past projects were simple 5-page websites plus a blog, so I would just noindex categories, tags and archives to eliminate duplicate content errors. But with WooCommerce product categories and tags, I've noticed that many e-commerce websites with a high domain authority actually rank for certain keywords just by having their categories/tags indexed. For example, keyword 'hippie clothes' = etsy.com/category/hippie-clothes (fictional example). The problem is that if I have 100 products and 10 categories and tags on my site, it creates thousands of duplicate content errors, but if I noindex categories and tags they will never rank well once my domain authority rises... Does anyone have experience/comments about this? I use the Yoast SEO plugin. Your help is greatly appreciated! Thank you in advance. -Marc

    | marcandre
    1

  • For example, when a person first lands on a given page, they see a collapsed paragraph but if they want to gather more information they press the "read more" and it expands to reveal the full paragraph. Does Google crawl the full paragraph or just the shortened version? In the same vein, what if you have a text box that contains three different tabs. For example, you're selling a product that has a text box with overview, instructions & ingredients tabs all housed under the same URL. Does Google crawl all three tabs? Thanks for your insight!

    | jlo7613
    0

  • So I have products appearing in several categories, all of which have the correct canonical url. But Moz is flagging up pages I never knew existed, and I don't understand why they exist at all and more so why my canonical fix isn't occurring for them, as below: SEO Friendly URL: http://thespacecollective.com/nasa-pin-sets/nasa-shuttle-mission-pin-set-no2 Weird URL to same product: http://thespacecollective.com/index.php?route=themecontrol/product&product_id=159 Is this a developer problem rather than an SEO problem?

    | moon-boots
    0

  • Hey guys! We have a rather large product range (books) on our eCommerce site (150,000+ titles). We get book descriptions as metadata from our publishers, which we display on the product pages. This obviously is not unique, as many other sites display the same book description. It is important for us to rank for those book titles, so my question to you is: how would you go about it? It seems like a rather unrealistic task to paraphrase 150,000+ (and growing) book descriptions. As I see it, these are the options: 1. Don't display the descriptions on the product pages (but then those pages become even thinner!)
    2. Display the (duplicate) descriptions, but put noindex on those product pages so the rest of the site isn't punished (not really an option, though).
    3. Hire student workers to produce unique product descriptions for all 150,000 products (a huge and expensive task). How would you solve such a challenge?
    Thanks a lot! Cheers, Tommy.

    | Jacob_Holm
    0

  • Hi, I'm trying to gather all the 404 crawl errors on my website after a recent hacking that I've been trying to rectify and clean up. Webmaster Tools states that I have over 20,000 crawl errors, but I can only download a sample of 1,000. Is there any way to get the full list, instead of correcting 1,000 errors, marking them as fixed and waiting for the next batch of 1,000 to be listed in Webmaster Tools? The current method is quite time-consuming, and I want to take care of all the errors in one go instead of over the course of a month.

    | FPK
    0

  • A. Do nothing
    B. Redirect to legacy site (current domain)
    C. Create a placeholder with information about the rebranding
    D. Other... What do you think is best?

    | Maxaro.nl
    0

  • Does anyone know how to add canonical tags to product pages in Opencart? Is this possible to do in htaccess? If so, how specifically should it be written in? Please do not post any links to other pages which reference generic canonical information as I've read them all and none help. I'm looking for an Opencart specific answer, or a way to do it in htaccess.

    | moon-boots
    0
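    On the Opencart canonical question above: a canonical is normally emitted as a link tag in the product template's head (or via an extension), because .htaccess can only attach fairly static headers and is awkward for per-product values across a whole catalogue. That said, a canonical can also be sent as an HTTP Link header; a hedged sketch for one known URL, assuming Apache 2.4 with mod_headers and reusing the product URLs quoted elsewhere in this thread:

        # Send a canonical Link header for one specific query-string URL (Apache 2.4 <If> syntax)
        <If "%{REQUEST_URI} == '/index.php' && %{QUERY_STRING} =~ /product_id=159/">
          Header set Link '<http://thespacecollective.com/nasa-pin-sets/nasa-shuttle-mission-pin-set-no2>; rel="canonical"'
        </If>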

  • Hello, I'm curious what the difference is between internal links from the homepage and from category pages. Does it make sense to add some internal links from category pages (with a high PA) to another page for a boost in the search results, or is the link value too low in this case? Thanks in advance,
    Marcel

    | MarcelMoz
    1

  • Hello, Do you know what can happen when I change domain.com to www.domain.com? Will it have an influence on my link-building portfolio (external links to domain.com), my position in Google search, etc.? Thank you for your help.

    | Reyzer
    0

  • Hi Mozzers! I'm working on a site that is a bit of a mess (http://www.selectequipment.net/) and wanted to ask for some feedback on a couple of items. In addition to organizing the site by product category types, the client also has brand pages that include all products of a certain brand. One problem, however, is that I want to be able to target the relatively large number of consumers who are using searches of BRAND + PRODUCT type in the most optimal fashion. For example, someone looking for "Cutler Hammer Transformers". We have several products with different part numbers that would fit this bill and I'm wondering if we'd be okay just having several products (ie Cutler Hammer Transform 100xa, Cutler Hammer Transform 110xb) or if we'd be better off adding an organizational page for all "Cutler Hammer Transformers for Sale". There are a LOT of different combos that we'd need to do this for. Is it a good call?

    | RickyShockley
    0

  • Hi, We are redesigning our website in the following way. Before: Page A with Content A, Page B with Content B, Page C with Content C, etc.
    (e.g. one page each for Customer Returns, Overstocks, Master Case, etc.)
    Now: Page D with Content A + B + C, etc.
    (e.g. one long page containing all the Product Conditions, one after the other.) So we are merging multiple pages into one.
    What is the best way to do this so that we don't lose traffic (or lose as little as possible)? E.g. should we 301-redirect A/B/C to D?
    Is it likely that we will lose significant traffic with this change? Thank you,

    | viatrading1
    0
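    For the page-merge question above, 301-redirecting each retired page to the new combined page is the usual way to carry over most of the existing ranking signals (and to keep old links and bookmarks working); a minimal .htaccess sketch, assuming Apache and using placeholder paths:

        # .htaccess: each old single-topic page points permanently at the new combined page
        Redirect 301 /customer-returns   /product-conditions
        Redirect 301 /overstocks         /product-conditions
        Redirect 301 /master-case        /product-conditions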

  • So here's my situation: My company's website usually receives around 80 organic visits/month and 50 direct visits/month from Mexico. However, in July we saw a small uptick to around 170 for each and then in the last 7 days we are in the middle of a massive spike which has put us up to 1400 visits for organic and 820 visits for direct in August. The traffic spike continues as we are almost up to 500 visits just today! Things to know: The visitors are purchasing from our store, staying on our site, browsing around, basically acting like real traffic. I was unable to identify any new links, press, and we did not do any specific Mexico optimization (spanish keywords). We sell a ball and it is called The One World Futbol, but it's always been called a futbol before so nothing new here. our website is www.oneworldplayproject.com. Everyone coming organically is searching our name, not keywords. We updated our shopping cart a few days before the massive traffic spike and significantly lowered the cost to ship to Mexico. Our Latin America director went to Mexico to work there for a month a few days before the spike and sent out a bunch of emails, texts, phone calls, what's app notifications to his large network. From what I am told by others here he has a vast network throughout Mexico, Central America and South America. We have also seen large traffic increases in other Latin American countries during this same time period just nothing like Mexico. We just hired an awesome social media coordinator who is extremely focused and is implementing a kick-ass social strategy We launched a branding campaign called #MakeLifePlayFull with press releases and ad spend behind it. PHEW! That was a lot of info for you to digest. So on the surface this seems like great news. BUT I want to understand WHY this is happening. Could it really just be the combination of all these things listed above or is it just a combination of our connected guy being in Mexico with better shipping costs? Why is it mainly happening in Mexico? Why is it so sustained? I suspect that if it is from our guy it would drop off quickly. Any thoughts on what to look at? I'm stumped.

    | Eric_OWPP
    0

  • Wooah, this one makes me feel a bit nervous. The cached version of the site homepage shows all the text, but I understand that is the HTML code constructed by the browser, so I get that. If I Google some of the content, it is there in the index, and the cached version is from yesterday. If I Fetch and Render in GWT, then none of the content is available in the preview, in either the Googlebot or the visitor view; the whole preview is just the menu, a holding image for a video and a tag line for it. There are no reports of blocked resources apart from a Wistia URL. How can I decipher what is blocking Google if it does not report any problems? The CSS is visible for reference to, for example, <section class="text-within-lines big-text narrow"> class="data"> some content... Ranking is a real issue, in part because of a poorly functioning main menu, but I'm really concerned with what is happening with the render.

    | MickEdwards
    0

  • Hi, I have a page with high traffic that is showing a list of flea markets in a unique URL. We are redesigning our website and we have created a listing directory of flea markets, so the users can look up and find the information for each. Each flea market will have its own URL in the future, and the listing directory shows only summarized info of each flea market in the results. Before activating the new flea market section, I would like to make sure which is our best bet: Option 1: Create pages with same URL/content as the current ones, which we won't link from frontend, and besides that, use the new flea market section on a separate page. Option 2: Redirect the current page to the new flea market section. As an inaccurate reference because it depends on many variables and SEO doesn't have an actual number, I understand this is more or less how it would work: Example Option 1 (after 1 week of launch): Old Flea Market Pages SEO traffic: 10,000 visits/month New Copied Flea Market Pages traffic: 9,700 (maybe a bit below 100 because of design changes etc) New Flea Market Section traffic: 500 visits/month (then increase over time) Example Option 2 (after 1 week of launch): Old Flea Market Pages SEO traffic: 10,000 visits/month New Redirected Flea Market Pages traffic: 9,000 (in principle PageRank wouldn't be affected, but other rankings might) New Flea Market Section traffic: (joined above, then increase over time) According to this, Option 1 would give us more total future visits compared to redirecting, plus the new flea market pages would add to it. If redirecting, the new flea market section would add up some SEO juice to the old page, but not as much as Option 1 (not redirecting). Please confirm. Which option is the best one and why? Thank you, New 301 Redirection Rules: https://moz.com/blog/301-redirection-rules-for-seo

    | viatrading1
    0

  • Hi, On my website, many product pages have been redirected over time to their product category, due to the products being unavailable. I understand that with a 301 redirect the final URL would have lost about 15% of the link juice. However, if after some time (e.g. 2 months, or 1 year) I remove the redirection, is the original page going to have any SEO value left, or has it already lost all of it? Thanks,

    | viatrading1
    0

  • Hi, On our website www.viatrading.com we have many products that might be in stock or not depending on availability. Until now, when a product was no longer available, we took its page down (and redirected it to its product category page), and only if the product became available again did we reactivate the URL, which might be days, months or even years later. To make this more SEO-friendly, we have now decided that while a product is unavailable, instead of deactivating/redirecting the page, we will leave it online and just add a message saying "This product is currently not available". If we do this, we will automatically reactivate about 500 product pages at once. 1. Just to make sure, is it harmful for SEO to keep activating/deactivating URLs this way? 2. Since most of these pages have been deindexed for a long time due to being redirected, have they lost all their SEO value? 3. How can we best reactivate these old 500 pages - is it OK to activate them all at once? Thank you,

    | viatrading1
    1

  • Hi all, My name is Riccardo and I work for a web agency. I am working on a new client website and I have found this kind of error through Moz (image 1). I checked all the URLs; they work and they point to the homepage.
    The website is made with WordPress. I have already tried to solve the problem with a 301 redirect but, as I suspected, it didn't work.
    I think it is a problem related to the WordPress URL settings (image 2). However, I would like to know if anybody has had the same problem or if there are other possible causes. Thank you in advance!

    | advmedialab
    0

  • Moz keeps finding loads of pages with duplicate content on my website. The problem is that they are directory pages for different locations. E.g. if we were a clothes shop, we would be listing our locations: www.sitename.com/locations/london www.sitename.com/locations/rome www.sitename.com/locations/germany The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.

    | nchlondon
    0
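    On the canonical question above: a canonical is just a link element in the head of each page, and a sketch using the placeholder URLs from the post is below. Bear in mind, though, that canonicalising every location to the London page asks Google to drop Rome, Germany, etc. from the index entirely; differentiating the on-page copy for each location is usually the better long-term fix.

        <!-- In the <head> of /locations/rome, only if London should be treated as the primary page -->
        <link rel="canonical" href="http://www.sitename.com/locations/london">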

  • Hi everyone, I've read a lot about the impact of iframes on SEO lately - articles like http://www.visibilitymagazine.com/how-do-iframes-affect-your-seo/ for example. I understand that iframes don't cause duplicate content or cloaked content issues, but what about thin content concerns? Here's my scenario: our partner marketing team would like to use an iframe to pull content detailing how Partner A and my company collaborate from a portal the partners have access to. This would allow the partners to help manage their presence on our site directly. The end result would be that Partner A's portal content would be added to Partner A's page on our website via an iframe. This would happen across at least 100 URLs. Currently we have traditional partner pages with unique HTML content. There's a little standalone value for queries involving the bigger partners' names plus use-case terms, but only in less than 10% of cases. So I'm concerned about those pages, but I'm more worried about the domain overall. My main concern is that in the eyes of Google I'd be stripping a lot of content off the domain all at once, and then replacing it with shell pages containing nothing (in terms of SEO) but meta data, a headline, navigation links, and an iframe. If that's the case, would Google view those URLs as having thin content? And could that potentially impact the whole domain negatively? Or would Google understand that the pages don't have content because of the iframes and give us a pass? Thoughts? Thanks, Andrew

    | SafeNet_Interactive_Marketing
    0

  • Hi, I recently took on a client for local SEO. I started improving his on-page optimization, etc., and now I'm continuing with link building. It's hard since it's a boring industry (medical waste disposal), but I have gotten some links. When I check his link profile in OSE, it's still giving me a spam score of 7 and not showing the links I have acquired. I already had the one link it was flagging removed by contacting the website, and it's not there anymore. The site is a very clean, nice site, so why am I getting such a high spam score? Thanks for your help! Rachel

    | Rachel_J
    0

  • I use Opencart and have found that a lot of my duplicate content (mainly from Products) which is caused by the Search function. Is there a simple way to tell Google to ignore the Search function pathway? Or is this particular action not recommended? Here are two examples: http://thespacecollective.com/index.php?route=product/search&tag=cloth http://thespacecollective.com/index.php?route=product/search

    | moon-boots
    0
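    For the Opencart search-results question above, the usual approach is to block the search route in robots.txt so crawlers stop picking those URLs up (keeping in mind that robots.txt prevents crawling rather than indexing of already-discovered URLs; a robots meta noindex on the search template is the stricter option). A minimal sketch using the route from the post:

        # robots.txt: keep crawlers out of internal search result pages
        User-agent: *
        Disallow: /index.php?route=product/search
        Disallow: /*route=product/search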

  • One of my clients has just been bought by a much larger company and will thus be losing their website and brand name. My client's site has built up a lot of traffic and authority in its space, so we are very nervous about losing all of this after the sale has gone through. The purchasing company intends for my client's services to be represented on its own website, so I am wondering, from a technical standpoint, what the best way is to go ahead with this, since my client will continue to work with the new company and would like to keep us on board. Should we do an 80/20 analysis, recreate our most valuable pages (e.g. 70%+ of traffic goes to the home page) on the new site, then 301-redirect each of these pages individually to its equivalent on the new site, while retaining as much of the old pages' on-page content/structure as possible? One thing I am concerned about is the fact that a large chunk of traffic comes from brand searches. Again, should we simply recreate the home page with a page title such as "X company is now part of Y company" so that we'll still rank highly for the old company's brand name? Any advice on how to go about this is much appreciated.

    | zakkyg
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



