
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey guys, would you use the 'change of address' tool (https://support.google.com/webmasters/answer/83106?hl=en) for a site that was previously penalised, pointing it to the new URL? Cheers, Jeremy

    | jeremycabral
    0

  • I've just started to work for a company who've purchased masses of domains, with every conceivable permutation based on all their products and every extension possible, e.g. .biz, .eu, .net (including .co.uk and .com, of course). I have two questions: 1. Is it worth keeping all these domains (they want to add more) or letting them expire? 2. All the purchased domains are live and 301 redirect - is there any point in keeping them online?

    | LJHopkins
    0

  • We recently acquired a new domain to replace our existing one, as it better fits our brand. We have little to no organic value on the existing domain, so switching is not an issue. However, the newly acquired domain was previously used in a different industry and has inbound links with significant spam scores. How can we let Google know that these links are not valid for our business and start rebuilding the domain's reputation? Disavow tool?

    | Marlette
    0
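    A disavow file is one option here; a minimal sketch of the plain-text format Google's disavow tool accepts (the domains and URLs below are hypothetical placeholders):

        # Spammy links inherited from the domain's previous owner
        # Disavow a single page
        http://spam-example-1.com/some-page.html
        # Disavow an entire domain
        domain:spam-example-2.com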

  • Hey Mozzers, I have a very long article page that supports several of my sub-category pages. It has sub-headings that link out to the relevant pages. However, the article is very long, and to make it easier to find the relevant section I was debating adding in-page anchor links in a bullet list at the top of the page for quick navigation: PAGE TITLE, Keyword 1, Keyword 2, etc., with targets like <a name="keyword1"></a> Keyword 1 Content
    <a name="keyword2"></a> Keyword 2 Content. Because of the way my predecessor wrote this article, its section headings are the same as the sub-categories they link out to and boost (not ideal, but an issue I will address later). What I wondered is whether having the in-page anchors would confuse the SERPs, because they would be linking with the same keyword. My worry is that by increasing the usability of the article this way, I also confuse the SERPs: first I tell them that this section on my page talks about keyword 1, then from within that article I tell them that a different page entirely is about the same keyword. Would linking like this confuse the SERPs, or are in-page anchor links looked at and dealt with differently?

    | ATP
    0
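    For reference, a minimal sketch of the in-page anchor pattern described above, using modern id attributes rather than the older <a name> form (headings and ids are placeholders):

        <!-- quick-navigation list at the top of the article -->
        <ul>
          <li><a href="#keyword-1">Keyword 1</a></li>
          <li><a href="#keyword-2">Keyword 2</a></li>
        </ul>

        <!-- section targets further down the page -->
        <h2 id="keyword-1">Keyword 1</h2>
        <p>Keyword 1 content...</p>

        <h2 id="keyword-2">Keyword 2</h2>
        <p>Keyword 2 content...</p>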

  • So I have 5 real estate sites. One of those sites is of course the original, and it has more/better content on most of the pages than the other sites. I used to be top ranked for all of the subdivision names in my town. Then when I did the next 2-4 sites, I had some sites doing better than others for certain keywords, and 3 of those sites have basically the same URL structures (besides the actual domain) and aren't getting fed very many visits. I have a couple of agents that work with me that I loaned my sites to, to see if that would help since it would be a different name. My same YouTube video is on each of the respective subdivision pages of my site and theirs. Also, their content is just rewritten content from mine, about the same length.
    I have looked at a few of my competitors who only have one site; their URL structures aren't good at all, their content isn't good at all, and yet a good bit of their pages rank higher than my main site, which is very frustrating to say the least since they are actually copycats of my site. I sort of started the precedent of content, mapping the neighborhood, noting how far each subdivision is from certain landmarks, and then shooting a video of each. They have pretty much done the same thing and are now ahead of me. What sort of advice could you give me?
    Right now, I have two sites that are almost duplicates in terms of template and the same subdivisions, although I did change the content the best I could, and that site is still getting pretty good visits. I originally did it to try and dominate the first page of the SERPs, and then Penguin and Panda came out and seemed to figure that game out. So now, I would still like to keep all the sites, but I'm assuming that would entail making them all unique, which seems tough seeing as my town has the same subdivisions. Curious as to what the suggestions would be, as I have put a lot of time into these sites. If I post my site will it show up in the SERPS? Thanks in advance

    | Veebs
    0

  • Hi everyone, We implemented HREFLANG code for our international sites. We are wondering: is there an automated way to test if HREFLANG is working, vs. manually browsing each international search engine? Also, we implemented this a few days ago, and Google Webmaster Tools still hasn't picked up that we have it implemented. I've heard it can take anywhere from 2-8 days. At what point would we see results? Our site is http://www.datacard.com. Is there an order that the site listings have to follow - for example, should x-default be the last item listed? Thanks, Laura

    | lauramrobinson32
    0
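    A minimal hreflang sketch for a head section like the one described (the locale paths below are hypothetical; the order of the link elements generally doesn't matter to Google, and x-default can sit anywhere in the set, as long as every version, including the page itself, is listed on every version):

        <link rel="alternate" hreflang="en-us" href="http://www.datacard.com/" />
        <link rel="alternate" hreflang="en-gb" href="http://www.datacard.com/uk/" />
        <link rel="alternate" hreflang="de-de" href="http://www.datacard.com/de/" />
        <link rel="alternate" hreflang="x-default" href="http://www.datacard.com/" />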

  • I am considering using Cloudflare for a couple of my sites. What is your experience? I researched a bit and there are 3 issues I am concerned about:
    1. Google may consider a site a bad neighbourhood if other sites on the same DNS/IP are spammy. Is there any way to prevent this? Has anybody had a problem?
    2. A DDoS attack on a site on the same DNS could affect our site's stability.
    3. Blocking false positives: legitimate users may be forced to answer captchas etc. to be able to see the page. Another Moz member reported 1-2% of legitimate visitors being identified as false positives. Can I effectively prevent this by reducing Cloudflare's basic security level?
    Also, did you find that Cloudflare really helped with site uptime? In our case, whenever our server was down for seconds, Cloudflare also showed an error page, and sometimes Cloudflare showed an error page saying it could not connect even when our server response time was just slow and pages on other domains were still loading fine.

    | lcourse
    0

  • I work for a reviews site, and some of the reviews that get published on our website also get published on other review websites. It's exact duplicate content - all user generated. The reviews themselves are all noindexed, followed, and the pages where they live are only manually indexed if the reviews aren't duplicated elsewhere. We leave all pages with reviews that live elsewhere on the web nofollowed. Is this how we should properly handle it? Or would it be OK to follow these pages regardless of the fact that, technically, there's exact duplicate UGC elsewhere?

    | dunklea
    0

  • I want to move my client's ecommerce site to Shopify. The only problem is that Shopify doesn't let you customize URL structures. I plan to: keep each page's content exactly the same; keep the same domain name; 301 redirect all of the pages to their new URLs. The ONLY thing that will change is each page's URL. Again, each page will have the exact same content. The only source of traffic to this site is Google organic search, and sales depend on that traffic. There are about 10 pages that have excellent link juice, 20 pages that have medium link juice, and the rest have little. Many of the links with significant link juice are on message boards written by people who like our product. I plan to change these URLs and 301 redirect them to their new URLs. I've read tons of pages online about this topic. Some people say it won't affect link juice at all, some say it might affect link juice temporarily, and others are uncertain. Most answers tend to be "You should be good. You might lose some traffic temporarily. You might want to switch some of your URLs to the new structure first to see how it affects things." Here are my questions: 1) Has anyone ever changed the URL structure of an existing website with link juice? What were your results, and do you have a definitive answer on the topic? 2) How much link juice (if any) will be lost if I keep all of the content exactly the same but only change each page's URL? 3) If link juice is temporarily lost and then regained, how long will it be lost for? 1 week? 1 month? 6 months? Thanks.

    | kirbyf
    0
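    As a sketch of the mechanics being described: Shopify's admin includes a URL Redirects tool with a bulk CSV import, so the old-to-new mapping could be prepared as a simple from/to list (paths below are hypothetical; check Shopify's current documentation for the exact import format):

        Redirect from,Redirect to
        /old-category/widget-blue,/collections/widgets/products/widget-blue
        /old-category/widget-red,/collections/widgets/products/widget-red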

  • Thank you for reading. After redesigning my website (5 months ago), my crawl reports (Moz, Search Console) still show tons of 404 pages, which all seem to be URLs from my previous website (same root domain). It would be nonsense to 301 redirect them all, as there are too many URLs (or would it?). What is the best way to deal with this issue?

    | Chemometec
    0
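    If the old URLs follow a pattern, a handful of regex rules can often replace thousands of one-to-one redirects; a hypothetical Apache (.htaccess) example, where the old and new paths are placeholders:

        # Send everything that lived under the old /products/ section to the new /shop/ section
        RedirectMatch 301 ^/products/(.*)$ /shop/$1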

  • Firstly, apologies for the long-winded question; I'm 'newish' to SEO. We have a website built on Magento, www.excelclothing.com. We have been online for 5 years and had reasonable success. Having used a few SEO companies in the past, we found ourselves under a 'partial manual penalty' early last year. By July we were out of the penalty. We have been gradually working our way through getting rid of 'spammy' links. Currently the website ranks for a handful of non-competitive keywords, looking at the domain in SEMrush. This has dropped drastically over the last 2 years. Our organic traffic over the last 2-3 years has seen no 'falling off a cliff' and has maintained a similar pattern. I've been told so many lies by SEO companies trying to get into my wallet that I'm not sure who to believe. We have started to add content to all our category pages to make them more unique, although most of our meta descriptions are a 'boilerplate' template. I'm wondering: Am I still suffering from Penguin? Am I trapped by Panda, and if so, how can I know that? Do I need more links removed? How can I start to rank for more keywords? I have a competitor with the same DA, PA and virtually the same number of links, but they rank for 3,500 keywords in the top 20. Would welcome any feedback. Many thanks.

    | wgilliland
    1

  • Would content behind a drop-down work just as well for SEO as content directly on the page? Example with a drop-down: https://www.homeleisuredirect.com/pool_tables/english_pool_tables/ (you have to click the 'more about English pool tables' text under the video). Example with content on the page: http://www.pooltablesonline.co.uk/uk-slate-bed-pool-tables.asp

    | BobAnderson
    0

  • Hello, We were having duplicate content in our blog (the CMS automatically created a replica of each post), until we implemented a rel=canonical tag on all the duplicate posts (some 5 weeks ago). Since then, no duplicate content has been found, but we are still getting duplicate title tag warnings, even though the rel=canonical is present. Any idea why this is the case and what we can do to solve it? Thanks in advance for your help. Tej Luchmun

    | luxresorts
    0

  • Hi, I have a site where I write previews for sports match-ups. I notice that when I don't put the date in the title, I rank much better for specific keywords. I also noticed that most people don't really put the date in when they search anyway, especially since Google does a good job of showing the most recent pages. The only reason I continue to put the date in is the whole idea of not having duplicate page titles. So many of our games will be "Team A vs Team B Preview", and I'm worried that the term "preview" will become so repetitive that Google may not like it. Any tips or ideas on how best to approach this issue? Thanks!

    | tarafaraz
    1

  • Hello! Is there any wisdom or non-wisdom in taking old websites and blogs that may not be very active, but still get some traffic, and redirecting them to a brand new website? The new website would be in the same industry, but not the same niche as the older websites. Would there be any SEO boost to the new website by doing this? Or would it just hurt the credibility of the new website?

    | dieselprogrammers
    0

  • I have created an index stack. My home page is http://www.southernwhitewater.com The home page is the index itself and also the first page, http://www.southernwhitewater.com/nz-adventure-tours-whitewater-river-rafting-hunting-fishing My home page (if you look at it through the MozBar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer. What is the best way to optimize all these pages individually and still have customers arrive at the top to a picture - rel=canonical? Any help would be great!! http://www.southernwhitewater.com

    | VelocityWebsites
    0

  • We sell wedding garters - niche, I know! We have a site (weddinggarterco.com) that ranks very well in the UK, and we sell a lot to the USA despite its rudimentary currency functions (Shopify makes US customers check out in GBP, which is not helpful for conversions). To improve this I built a clone (theweddinggarterco.com) and have faked a kind of location selector top right. Needless to say, a lot of content on this site is VERY similar to the UK version. My questions are... 1. Is this likely to stop me ranking the USA site? 2. Is this likely to harm my UK rankings? Any thoughts very welcome! Thanks. Mat

    | mat2015
    0

  • We recently launched a redesign/redevelopment of a site but failed to put 301 redirects in place for the old URLs. It's been about 2 months. Is it too late to even bother worrying about it at this point? The site has seen a notable decrease in traffic/visits, perhaps due to this issue. I assume that once the search engines get an error on a URL, they will remove it from search results after a period of time. I'm just not sure if they will try to re-crawl those old URLs at some point; if so, it may be worth having those 301 redirects in place. Thank you.

    | BrandBuilder
    0

  • Hi everyone! I'm after a second (or third, or fourth!) opinion here! I'm working on the website www.workingvoices.com that has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues but haven't seen any rankings movement in the last 7 weeks and was wondering if I've missed something... The main issues I identified and fixed were: Keyword stuffed near duplicate title tags - fixed with relevant unique title tags Copies of the website on other domains creating duplicate content issues - fixed by taking these offline Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages. Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated! Cheers Damon.

    | Digitator
    0

  • Hi guys, I have a duplicate content question I was hoping someone might be able to give me some advice on. I work for a small company in the UK, and in our niche we have a huge product range and an excellent website providing the customer with a very good experience. We're also backed up by a bespoke warehouse/logistics management system, further enhancing the quality of our product. We get most traffic through PPC, are not one of the biggest brands in the industry, and have to fight for market share. Recently we were approached by another company in our industry that has built up a huge and engaged audience over decades but can't logistically tap into their following to sell products, so they have suggested a partnership. They are huge fans of what we do and basically want a copy of our site to be rebranded and hosted on a subdomain of their website, and we would then pay them a commission on all the sales the new site received. So two identical sites with different branding would exist. Based on tests they have carried out, we could potentially double our sales in weeks, and the potential is huge, so we are excited about the possibility. But how would we handle the duplicate content - would we be penalised? Would just one of the sites be penalised? Or, if sales increase as much as we think they might, would it be worth a penalty, as our current rankings aren't great? Any advice would be great. Cheers, Richard

    | Rich_995
    0
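    One option often discussed for exactly this situation is a cross-domain rel=canonical on the rebranded copy, pointing each page back to its original; a minimal sketch (domains and paths are hypothetical):

        <!-- on https://partner-example.com/product-x, the rebranded copy -->
        <link rel="canonical" href="https://www.original-example.co.uk/product-x" />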

  • Hey Mozzers, I was looking for a little guidance and advice regarding a couple of pages on my website. I have used 'shoes' for this example. I have the current structure: Parent Category - Shoes. Sub Categories - Blue Shoes,
    Hard Shoes,
    Soft Shoes,
    Big Shoes, etc. Supporting Article - Different Types of Shoe and Their Uses. There are about 12 subcategories in total - each one links back to the Parent Category with the keyword "Shoes". Every sub-category has gone from ranking 50+ to 10th-30th for its main keyword, which is a good start, and as I release supporting articles I'm sure each one will climb. I am happy with this. The article ranks no. 1 for about 20 longtail terms around "different shoes". This page attracts around 60% of my website's traffic, but we know this traffic will not convert, as most visitors are people and children looking for information only, for educational purposes, and are not looking to buy. Many are also looking for a type of product we don't sell. My issue is ranking for the primary category keyword, "Shoes". When I first made the changes we went from ranking nowhere to around 28th on the parent category page targeted at "Shoes". Whilst not fantastic, this was good, as it gave us something to work from. However, a few weeks later the article page ranked 40th for this term and the main page dropped off the scale. Then another week some of the sub-category pages ranked for it. And now none of my pages rank in the top 50 for it. I am fairly sure this is due to some cannibalisation - simply because of various pages ranking for it at different times.
    I also think that the additional content added by products on the sub-category pages is giving them more content and making them rank better. The page itself:
    The Shoes page contains 400 good unique words, with the keyword mentioned 8 times, including headings. There is an image at the top of the page with its title and alt text targeted towards the keyword. The 12 sub-categories are linked to on the left navigation bar, and then again below the 400 words of content via a picture and text link. This adds the keyword to the page another 18 or so times in the form of links to longtail subcategories. This could introduce a spam problem, I guess, but it's in the form of nav bars or navigation tables, and I understood this to be a necessary evil on eCommerce websites. There are no actual products linked from this page - a problem? With all the basic SEO covered and all sub-pages linking back to the parent category, the only solutions I can think of are to add more content by: 1. adding all shoe products to the Shoes page, as it currently only links out to the sub-categories; or 2. merging the "Different Types of Shoe and Their Uses" article into the Shoes page to make a super page, and make the article page less likely to produce cannibalistic problems. However, by doing solution 2, I remove a page bringing in a lot of traffic. The traffic it brings in, however, is of very little use; it inflates the bounce rate and lowers the conversion rate of my whole site by significant figures. It also distorts other useful reports I use to track my progress. I hope I have explained well enough - thanks for sticking with me this far. I haven't posted links due to a reluctance by the company, so hopefully my example will suffice. As always, thanks for any input.

    | ATP
    0

  • Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments - the concern here is that it serves an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
    Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
    A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
    If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an ugly URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like: http://www.idea-r.it/...#!blogimage=<image-number> When the crawler finds this markup it will change it to
    http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number> Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot.
    My implementation uses ASP.NET, but any server technology will do.

        var fragment = Request.QueryString["_escaped_fragment_"];
        if (!String.IsNullOrEmpty(fragment))
        {
            var escapedParams = fragment.Split(new[] { '=' });
            if (escapedParams.Length == 2)
            {
                var imageToDisplay = escapedParams[1];
                // Render the page with the gallery showing
                // the requested image (statically!)
                ...
            }
        }

    What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
    To make it perfect we have to give the user a chance to bookmark the current gallery image.
    90% comes for free; we only have to parse the fragment on the client side and show the requested image:

        if (window.location.hash)
        {
            // NOTE: remove initial #
            var fragmentParams = window.location.hash.substring(1).split('=');
            var imageToDisplay = fragmentParams[1];
            // Render the page with the gallery showing the requested image (dynamically!)
            ...
        }

    The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this option is that we would be removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂

    | JBGlobalSEO
    0

  • Hi Moz Community, I'm trying to understand if there is really any material difference between going with one URL structure or the other. I assume the hyphen example below is what most would argue is the best option, but due to certain circumstances (I won't go into them) I'm most likely going to be forced to use the sub-directory URL option. I'm just concerned that going down this path will have a material SEO effect... looking for people's thoughts. Keep in mind for this example: I'm using the Shopify eCommerce platform and am forced to use the word 'collection' in the URL; I sell shoes, so 'Birkenstock' within the URL represents the brand and 'sandals' represents the style; the keyword search in this instance would be "birkenstock sandals". Example 1: http://companyname/collection/birkenstock/sandals vs http://companyname/collection/birkenstock-sandals Example 2: http://companyname/collection/sandals/birkenstock vs http://companyname/collection/sandals-birkenstock It will be interesting to hear what difference, if any, people think each will make. Thanks in advance for any insight.

    | chewythedog
    0

  • We are in ecommerce, and there are a few review sites that are dominating the rankings for our products. The sites are very good - very well written content (2,000+ words) and visually appealing. The two main culprits are clearly black hat: one site's backlinks are pure spam, and the other is buying footer and sidebar links. Will reporting them to Google have any impact? If not, any suggestions on how to compete? Our competing pages are product descriptions, and creating a 2,000-word product description seems inappropriate. Also, all of these products are brand new, and due to extensive media spend, the search volume is very high. Since they are beating us to the punch by getting good content posted first, they are proving difficult to displace.

    | AMHC
    0

  • Hello, Sometimes articles about my product are written online. Is there anything else I can do beyond choosing a good image file name - perhaps I can ask the site owner to modify something in the article to make the image rank higher? Also, on some small websites I can see images ranking very high in image search for specific terms that are otherwise difficult to rank for. If I were to contact a site with a sponsored post request, what should I make sure the site adds to that post besides the file name? I think there are also some other methods, such as Reddit, to make images rank high on a third-party page; I just need to find out how... Thanks a lot

    | bidilover
    0
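    For the on-page side, the usual levers are the file name, the alt text, and the text immediately around the image; a minimal sketch of what you might ask a site owner to include (names and paths are placeholders):

        <figure>
          <img src="/images/acme-blue-widget-camping-stove.jpg"
               alt="Acme blue widget camping stove boiling water outdoors">
          <figcaption>The Acme blue widget camping stove in use.</figcaption>
        </figure>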

  • Need some advice on when to use canonicals vs. redirects for navigation changes to a website. If there are other options, I am open to them as well. We are consolidating some navigational paths and moving others, and we are renaming product pages (therefore creating new product pages - a CMS platform requirement). Keep in mind we have a desktop domain and a mobile domain. Questions: Do we redirect old URLs to the new product page URLs? Do we redirect old mobile URLs to new mobile URLs or to the desktop equivalent? Do we redirect all old product page URLs containing navigation elements to the new product page URL? If we have a category page being added to two different sections, how do we determine the right canonical URL (the URL will be different because the customer paths will be different)? Do we need to make sure we redirect all old URLs to a new URL? If so, what is the best way to find all of the URLs?

    | seo32
    0

  • Hello, Recently one of my regional websites, targeted at Denmark (xxxx.dk TLD), received a manual penalty from Google, specified as "pure spam". The reason for this, as I suspect, may be that the Danish site's content is fully translated from the English content on the main site (.com). To fix this problem with Google, I want to use alternate/hreflang tags on both sites' URLs (the main and the regional) before submitting it for a second review in WMT. Following this, I would like to ask you a few questions: 1. Is there any RISK in using alternate tags between the two sites (a "healthy" site and the one that got the penalty)? Can it harm the SEO of the main site (.com)?
    2. Once done, will it resolve my problem with Google? Will they remove the manual penalty?
    3. Based on your experience, would you recommend rewriting all the content on the Danish site instead of just translating it (the current status)? Would love to hear your opinion on these issues. Thanks a lot!

    | Kung_fu_Panda
    0

  • Hi all, An e-commerce site has recently moved protocol to HTTPS sitewide. The site ranked on page one for some great terms and now appears on page 2 or below. Brand terms seem unfazed and are still very strong on both Google and Bing. The following has been done: everything 301'd from http to https; sitemap edited; Webmaster Tools updated; robots.txt edited; pages crawled and fetched daily; checked pages are all follow,index; PPC ads mass-updated to the new URLs. Most terms were ranked 1-9 on Bing, and page 1/2 on Google. The HTTPS upgrade was done less than one week ago. The site is not payday-loan related, nor was it hit by the latest Panda escapades. Everything on the site is relevant to the content. Has anybody else been in this position? What else can be done? I'd appreciate any help and advice. Thank you

    | Whittie
    0

  • Hi, I am building this site for my boss, http://charlesfridmanpr.wix.com/real-estate, and am still working on it. I'm getting close to the stage where I want to redirect it to the URL we want to use, but in reading these forums I see that because all of the subpages have a # in them, they will not be read or indexed by Google. I am very new to this, and while it may not look like it, the website has taken me quite a while to design. Is there a way to fix this? We want to appear high up for a non-competitive keyword. Thanks

    | Charlesfridmanpr
    0

  • Hello, I have a site on which the client wants to disable right-click on all his pages. His traffic is very good, but he is worried whether disabling right-click could hurt it. Can any expert help? Thanks in advance

    | ieplnupur
    0

  • Hey Mozzers, I think I know the answer to this one, but I just wanted to check my thinking if you wouldn't mind. I have an ecommerce website with lots of very similar products, for example: Blue Widget,
    Waterproof Blue Widget,
    Blue Widget with Alarm. One of the pages is ranking top 10 for "blue widget"; however, the others intermittently swap with it, knocking that page out and putting themselves into the top 10. Then a few weeks later it swaps back again. This seems like a clear case of keyword cannibalisation to me, and I am wondering about the best solution. 301: obviously not an answer, as I need all 3 products visible.
    Canonical to one of the pages: doesn't seem correct either; the products are similar but not the same, and all 3 could rank for different longtails, etc. I was suffering from something similar on my closely related category pages, and I combated that by interlinking them all with the relevant keywords pointing to the relevant pages. Should I do the same for these products, such as...
    from 'Blue Widget' link to "Blue Widget with Alarm" and "Waterproof Blue Widget";
    from 'Waterproof Blue Widget' and 'Blue Widget with Alarm' link to "Blue Widget" (using the anchor text in the quotes). This should tell the SERPs that all pages are about blue widgets, but that the main one is the "Blue Widget" page. Correct? As a follow-up: is this one of the reasons ecommerce sites have related-products options?

    | ATP
    0

  • Greetings Moz hive mind! I'm hoping you can help me with the internationalisation conundrum below. We currently have a website with three distinct 'locales': US, SEA and UK. We automatically redirect customers to a matching locale using IP recognition, and we also determine their currency based on IP. The issue we currently have is a lot of duplicate content and no use of hreflang or rel=canonical tags etc. My proposed structure would be to create a locale-based directory for the three locales we offer: / being the US and most other worldwide traffic; /uk being the UK; /as being Hong Kong and other Asian territories. How would you suggest we set up the hreflang tags for these? Because technically there are going to be multiple language possibilities within each. Our customers are mainly English-only, if this helps. Also, as a secondary question, how should I set up the Google Search Console settings for each of these directories? Many thanks in advance.

    | Ashley-Jacada
    0

  • Hi! I have 7 domains that I bought that point to the same webspace as my main domain. In Open Site Explorer they are shown as spam links. To solve the issue I redirected the links to an empty subdirectory on the same server, which is different from the directory the main domain points to. But nevertheless, the domains are still showing up as spam. Why might that be? What can I do to get rid of these domains? In fact I only need the main domain. Cheers, Marc

    | RWW
    0

  • Hello! I am now checking a website that was migrated months ago from osCommerce to Prestashop.
    While I was checking crawl errors in Search Console I found a lot of 404s coming from the old website. The URLs are mainly of 4 types: popup_image.php?pID=125&osCsid=507c27261ba5ca2568f06ce5bad2ebc9; product-friendly-url-pr-125%3FosCsid....; product-friendly-url-p-125%3FosCsid.....; products_new.php?page=228. I have realized that the pID parameter, and the number that comes after pr- and p-, is the product ID on the new website, so I think our team will be able to create a script to redirect those. My question is: is it OK to send several URLs to the same URL? I mean, popup_image.php was not the product page; as its name says, it's more like a popup page. We don't have a popup page for images now, so I was thinking of sending that URL to the product page. The one with pr- was the product review page, and the one with p- was the product page. I was thinking of redirecting all 3 of them to the product page - should I? Or should I just redirect the last one (p-) and eliminate the others from the index? And the ones with products_new.php?page=228 - I was thinking of redirecting them all to page 1 of new products. Is that OK? Thank you!

    | teconsite
    0
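    Several old URLs can legitimately 301 to the same target. As a sketch of the kind of rule the script could implement, assuming Apache (.htaccess in the docroot) and that Prestashop products can be reached via index.php?controller=product&id_product=ID (both assumptions should be verified on the actual install; the /new-products path is hypothetical):

        RewriteEngine On

        # popup_image.php?pID=125&osCsid=...  ->  product 125
        RewriteCond %{QUERY_STRING} (?:^|&)pID=(\d+) [NC]
        RewriteRule ^popup_image\.php$ /index.php?controller=product&id_product=%1 [R=301,L]

        # products_new.php?page=228  ->  first page of the new-products listing
        RewriteRule ^products_new\.php$ /new-products? [R=301,L]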

  • We use a subdomain for our dev site. I never thought anything of it because the only way you can reach the dev site is through a VPN. Google has somehow indexed it. Any ideas on how that happened? I am adding the noindex tag - should I use a canonical? Or is there anything else you can think of?

    | EcommerceSite
    0
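    A noindex directive is the usual fix, and it can be applied either in the page or at the server level; a minimal sketch of both (the second line assumes Apache with mod_headers):

        <!-- in the <head> of every page on the dev subdomain -->
        <meta name="robots" content="noindex, nofollow">

        # or, server-wide for the dev vhost (Apache + mod_headers)
        Header set X-Robots-Tag "noindex, nofollow"

    One caveat worth noting: if the dev subdomain is also blocked in robots.txt, crawlers can't fetch the pages to see the noindex, so the directive needs the pages to remain crawlable until they drop out of the index.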

  • I have a website with a fair few link assets that are doing very well (a lot of really powerful sites link to them with followed links), but my commercial pages are not doing as well as those of a lot of sites whose only investment is (mediocre) links pointing directly at their commercial pages, with at least 10% of them carrying the money anchor text. Even pages we have built a few reasonable links to, with generalized natural anchor text, do not do as well as the above, presumably because none of them carry the money keyword. Is it me, or does Google still rely on links pointing to the commercial page with anchor text matching the money term?

    | BobAnderson
    0

  • Hi everyone! I am now checking a website that runs on Drupal, and I found that images have URLs like this: http://www.brandname.com/sites/default/files/styles/directory_xyz/public/name-of-the-picture.png?itok=T89RpzrK I was wondering how a URL like that, with the token at the end, can affect SEO. I couldn't find anything about it. Does anyone know? Thank you!

    | teconsite
    0

  • Hi everyone, I have a tough technical SEO question that is bugging almost everyone in the (ecommerce) company at the moment. Due to a very "unhealthy" structure of Magento folders, with different countries using the same folders in different store views, many of our URLs change almost on a weekly basis, and this is terrifying us. What happens is that a "-numberX" suffix (e.g. /category/product-1.html) gets added to hundreds of URLs, so we are more and more concerned about the impact on SEO. I checked the redirect information with the Moz toolbar and saw the following: http://prntscr.com/81v23e So even though we had URLs like /category/product-1.html, /category/product-2.html, ... the redirect seems to go straight to the last number. My questions: Can this be interpreted as a single redirect, and is it therefore "less" painful from an SEO point of view?
    As we do not have a constant target URL, where does the link juice go if the target page constantly keeps changing (the number keeps going up)? Any advice would be much appreciated. Thanks

    | ennovators
    0

  • Hi Guys, Just wondering what is the best way to find forums in your industry?

    | edward-may
    2

  • Hello, Please comment on which you think is best SEO practice for each, and add any comments on how link juice flows through. Title text (on product page):
        <title>Brandname ProductName</title>
        OR
        <title>ProductName by Brandname</title>
    On category page:
        <a itemprop="name" href="[producturl]">ProductName</a>
        <a itemprop="brand" href="[brandurl]">BrandName</a>
        OR
        <a itemprop="name" href="[producturl]">BrandName ProductName</a> (leave brand link out)
    On product page:
        <a itemprop="name" href="[producturl]">ProductName</a>
        <a itemprop="brand" href="[brandurl]">BrandName</a>
        OR
        <a itemprop="name" href="[producturl]">BrandName ProductName</a> (leave brand link out)
    Thoughts?

    | s_EOgi_Bear
    0
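    If the itemprop attributes are meant as schema.org markup, note that they normally need an enclosing itemscope to be picked up; a minimal illustrative sketch of one way the product/brand pair could be marked up (structure is an assumption, using the placeholder URLs from the question):

        <div itemscope itemtype="https://schema.org/Product">
          <a itemprop="url" href="[producturl]"><span itemprop="name">ProductName</span></a>
          <span itemprop="brand" itemscope itemtype="https://schema.org/Brand">
            <a href="[brandurl]"><span itemprop="name">BrandName</span></a>
          </span>
        </div>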

  • Hi, We have had a strongly ranking site since 2004. Over the past couple of days, our Google traffic has dropped by around 20% and some of our strong pages are completely disappearing from the rankings. They are still indexed, but having ranked number 1 are nowhere to be found. A number of pages still remain intact, but it seems they are increasingly disappearing. Where should we start to try and find out what is happening? Thanks

    | simonukss
    0

  • Hi guys, I have a question... I am currently working on a website that was hit by a spam attack. The website was hacked and thousands of adult [censored] pages were created on the WordPress site. The hosting company cleared all of the dubious files, but this has left thousands of dead 404 pages. We want to fix the dead pages, but Google Webmaster Tools only shows, and allows you to download, 1,000 of them. There are a lot more than 1,000... does anyone know of any good tools that allow you to identify all 404 pages? Thanks, Duncan

    | CayenneRed89
    0

  • Hello, While I was checking this site, http://www.disfracessimon.com/disfraces-adultos-16.html, I found that the pagination works this way: http://www.disfracessimon.com/disfraces-adultos-16.html#/page-2
    http://www.disfracessimon.com/disfraces-adultos-16.html#/page-3 and the content is loaded using AJAX. So Google is not getting the paginated results. Is this a big issue, or is there no problem?
    Should I create a "See All Products" link, or is it not a big issue? Thank you!

    | teconsite
    0
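    If the paginated content is only reachable via AJAX fragments, one straightforward safeguard is to expose plain crawlable links as well (either real paginated URLs or a "view all" page); a minimal sketch with hypothetical URLs alongside the existing AJAX behaviour:

        <!-- plain links a crawler can follow -->
        <a href="/disfraces-adultos-16.html?page=2">Page 2</a>
        <a href="/disfraces-adultos-16.html?page=3">Page 3</a>
        <a href="/disfraces-adultos-16.html?view=all">View all products</a>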

  • I have created an index stack. My home page is http://www.southernwhitewater.com My home page (if you look at it through the MozBar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer. What is the best way to optimize all these pages individually and still have customers arrive at the top, with links directed to the home page (which is actually the first page)? I feel a rel=canonical might be needed somewhere. Any help would be great!!

    | VelocityWebsites
    0

  • Does a pop-up like the one on this site, www.stressfreeprint.co.uk (top left corner: 'about us', 'who we are'), count as an external link, or would link juice not flow to it? I like to have a few pages that I don't want to waste link juice on but would still like to keep, and I hope this is the answer.

    | BobAnderson
    0

  • Hello everyone! I have an SEO question that I cannot solve given the parameters of the project, and I was wondering if someone could provide me with the next best alternative to my situation. Thank you in advance. The problem: two eCommerce stores are completely identical (structure, products, descriptions, content), but they are on separate domains for currency and targeting purposes: www.website-can.com is for Canada and www.website-usa.com is for the US. Due to exchange rate issues, we are unable to combine the 2 domains into 1 store and optimize it. What's been done? I have optimized the Canadian store with unique meta titles and descriptions for every page and every product. However, I have left the US store untouched. I would like to gain more visibility for the US store, but it is very difficult to create unique content considering the products are identical. I have evaluated using canonicals, but that would ask Google to only look at either the Canadian or the US store - correct me if I'm wrong. I am looking for the next best solution given the challenges, and I was wondering if someone could provide me with some ideas.

    | Snaptech_Marketing
    0
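    A commonly recommended setup for two same-language stores split by country is hreflang rather than a canonical (a canonical would indeed ask Google to favour one store); a minimal sketch using the domains mentioned above, added to both versions of each page (the /product-x path is hypothetical):

        <link rel="alternate" hreflang="en-ca" href="http://www.website-can.com/product-x" />
        <link rel="alternate" hreflang="en-us" href="http://www.website-usa.com/product-x" />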

  • Our site is hosted on a secure network (i.e. our web address is https://www.workbooks.com). Will a backlink pointing to http://www.workbooks.com provide less value than a link pointing to https://www.workbooks.com? Many thanks, Sam

    | Sam.at.Moz
    0
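    The usual way to preserve the value of http links is a sitewide 301 from http to https, so either form of backlink resolves to the same secure URL; a minimal Apache sketch, assuming mod_rewrite is available:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]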

  • A couple of years ago the rankings for our site dropped overnight. I believe someone working here at the time purchased links around then. We have been doing lots of work on the site since to improve it, but we cannot get our rankings back up in Google searches. Can anyone give us some advice about what to do, or where to go for help that we can trust?

    | CostumeD
    0

  • I see in Webmaster Tools that I have a bad URL linking from mydomain.domproof.com. What is this??

    | AGMContainerControls
    0

  • Hi All, We have a web provider who isn't willing to remove the wildcard line of code blocking all agents from crawling our client's site (User-agent: *, Disallow: /). They have other lines allowing certain bots to crawl the site, but we're wondering if the site is missing out on organic traffic by having this main blocking line. It's also a pain because we're unable to set up Moz Pro, potentially because of this first line. We've researched and haven't found a ton of best practices regarding blocking all bots and then allowing certain ones. What do you think is best practice for these files? Thanks! The current file:

        User-agent: *
        Disallow: /

        User-agent: Googlebot
        Disallow:
        Crawl-delay: 5

        User-agent: Yahoo-slurp
        Disallow:

        User-agent: bingbot
        Disallow:

        User-agent: rogerbot
        Disallow:

        User-agent: *
        Crawl-delay: 5
        Disallow: /new_vehicle_detail.asp
        Disallow: /new_vehicle_compare.asp
        Disallow: /news_article.asp
        Disallow: /new_model_detail_print.asp
        Disallow: /used_bikes/
        Disallow: /default.asp?page=xCompareModels
        Disallow: /fiche_section_detail.asp

    | ReunionMarketing
    0
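    For comparison, a sketch of the same file without the blanket block, assuming the intent is simply to throttle crawling and keep the listed paths out of the index for everyone. Two points worth knowing: a crawler obeys only the single most specific user-agent group that matches it (so the named Googlebot/bingbot groups already override the Disallow: / for those bots), and Googlebot ignores Crawl-delay entirely.

        User-agent: *
        Crawl-delay: 5
        Disallow: /new_vehicle_detail.asp
        Disallow: /new_vehicle_compare.asp
        Disallow: /news_article.asp
        Disallow: /new_model_detail_print.asp
        Disallow: /used_bikes/
        Disallow: /default.asp?page=xCompareModels
        Disallow: /fiche_section_detail.asp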


