
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • A competitor with maybe a couple of hundred pages throughout their website has recently added an image alt tag of a keyword to their company logo, which links back to their homepage. The homepage is not particularly optimized towards this keyword and does not use the keyword within the text content at all, apart from the page title, so the image and link destination are not relevant to the keyword. They seem to rank well for this competitive keyword in our industry using what I see as a slightly unethical internal linking strategy. We are currently working on some solid content to compete with this, but I just wanted some advice on the situation: how does Google not see this as manipulative, having the same link in the template on every page?

    | Antony_Towle
    0

  • I have been pondering whether I am using this tag correctly or not. We have a custom solution which lays out products in the typical eCommerce style, with plenty of tick-box filters to further narrow down the view. When I last researched this, it seemed like a good idea to implement rel=canonical to point all sub-section pages at a 'view-all' page which returns all the products, unfiltered, for that given section. Normally pages are restricted to 9 results per page, with interface options to increase that. This, combined with all the filters we offer, creates many millions of possible page permutations, hence the need for the canonical tag. I am concerned because our view-all pages get large, returning all of that section's products in one place. If I pointed the view-all page at, say, the first page of x results, would that defeat the purpose of the view-all suggestion that Google made a few years back, as it would require further crawling to get at all the data? Alternatively, as these pages are just product listings, would noindex be a better route to go, given that it's unlikely they will get much love in Google anyway?

    | motiv8
    0

  • Hi, we have a business/service related website. Two of our main pages lost their PageRank, dropping from 3 to 0, and are not showing any backlinks in Google. What could be the possible reason? Please guide me.

    | Tech_Ahead
    0

  • Hi Mozzers, I come asking for help. I have a client who has reported a staggering increase of over 18,000 errors! The errors include duplicate content and page titles. I think I've found the culprit, and it's the News & Events calendar on the following page: http://www.newmanshs.wa.edu.au/news-events/events/07-2013 Essentially each day of the week is an individual link, and events stretching over a few days get reported as duplicate content. Do you have any ideas how to fix this issue? Any help is much appreciated. Cheers

    | bamcreative
    0

  • I have a client whose Google Places listing is not showing correctly. We have control of the page and have the address verified by postcard. Yet when we view the listing it shows a totally different address that is miles away and on a totally different street. We have logged back in to manage the business listing and all of the info is correct. We dragged the marker, submitted to them that they had things wrong, and left a note with the right address. Why would this happen and how can we fix it? Right now they rank highly, but with a blatantly wrong address.

    | Atomicx
    0

  • Hi all, A client wanted a few pages noindexed, which was no problem using the meta robots noindex tag. However they now want associated images removed, some of which still appear on pages that they still want indexed. I added the images to their robots.txt file a few weeks ago (probably over a month ago actually) but they're all still showing when you do an image search. What's the best way to noindex them for good, and how do I go about implementing it? Many thanks, Steve

    | steviephil
    0
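(For context on the mechanics being asked about: robots.txt only blocks crawling, so images that are already indexed tend to stay indexed, and a blocked URL can never be re-fetched to discover a noindex. One commonly used alternative is an X-Robots-Tag response header; the fragment below is a hedged sketch assuming an Apache server with mod_headers enabled, with a made-up file pattern.)

```apache
# Hypothetical .htaccess fragment: serve a "noindex" directive on
# image responses. The files must remain crawlable (NOT blocked in
# robots.txt), or the header is never fetched and the images stay
# in the image index.
<FilesMatch "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```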

  • Hi all! We've recently developed a German version of our website with German translation and now we have just purchased a .nl domain, but with this one, we want all of the copy to remain in English. Is it ok to redirect our .nl domain to our current .com website or will this give us bad SEO points? Thank you!

    | donaldsze
    0

  • I have made a content placeholder for a keyword that will gain significant search volume in the future. Until then, I am trying to optimize the page so it ranks when the game launches and the keyword gains volume. http://hiddentriforce.com/a-link-between-worlds/walkthrough/ Is there anything I can do to improve the optimization for the phrase 'a link between worlds walkthrough'? A lot of my competitors are already setting up similar placeholder pages and doing the same thing. I have 2 fairly large gaming sites that will place a banner for my walkthrough on their site. I did not pay for the links; I do free writing and other services in exchange. I have been sharing the link socially. It has almost 200 likes and a handful of shares, tweets, and g+ votes.

    | Atomicx
    0

  • We received an enquiry on one of our landing pages and I am trying to track down where that user came from: whether they came from a social network or a search engine, and if from a search engine, which keywords they used, etc. Does anyone know if there is any way I could see that?

    | Rubix
    0

  • Hi All, I have a client who duplicated an entire section of their site onto another domain about 1 year ago. The new domain was ranking well but was hit heavily back in March by Panda. I have to say the set-up isn't great and the solution I'm proposing isn't ideal; however, as an agency we have only been tasked with "performing SEO" on the new domain. Here is an illustration of the problem: http://i.imgur.com/Mfh8SLN.jpg My solution to the issue is to 301 redirect the duplicated area of the original site (around 150 pages) out to the new domain name, but I'm worried that this could cause a problem, as I know you have to be careful with redirecting internal pages to external ones when it comes to SEO. The other issue I have is that the client would like to retain the menu structure on the main site, but I do not want to be putting an external link in the main navigation, so my proposed solution is as follows: Implement 301 redirects for URLs from the original domain to the new domain. Remove the link out to this section from the main navigation of the original site and add a boilerplate link in another area of the template - a "Visit xxx for our xxx products" kind of link to the other site. An illustration of this can be found here: http://i.imgur.com/CY0ZfHS.jpg I'm sure the best solution would be to redirect URLs from the new domain into the original site, keep all sections within the one domain, and optimise the one site. My hands are somewhat tied on this one, but I just wanted clarification or advice on the solution I've proposed, and that it won't dramatically affect the standing of the current sites.

    | MiroAsh
    0

  • How do you deal with DMCA takedown notices related to product descriptions? With Google it is simple enough for any person to submit a DMCA takedown notice, irrespective of whether they hold rights to the content. One such example is this: http://www.chillingeffects.org/notice.cgi?sID=1012391. Although Google dealt with that particular case properly (and did not remove the content), we find that nowadays more and more competitors use DMCA takedowns as an easy way to de-index competitive content. Since the person registering the DMCA takedown is not required to provide any proof of copyright, de-indexing happens quite quickly. Try this URL: http://www.google.com/transparencyreport/removals/copyright/domains/mydomain.com/ (replace your domain) to see if you have been affected. I would like your opinion if you have been affected by takedowns on product descriptions - in my mind, if product descriptions are informative and relate to the characteristics of the product, then takedowns should be denied.

    | MagicDude4Eva
    1

  • Hi all, I recently took over an e-commerce start-up project from one of my co-workers (who left the job last week). This previous project manager had uploaded ~2000 products without setting up a robots.txt file, and as a result, all of the product pages were indexed by Google (verified via Google Webmaster Tools). The problem came about when he deleted the entire product database from our hosting service, GoDaddy, and performed a fresh install of Prestashop on our hosting plan. All of the created product pages are now gone, and I'm left with ~2000 broken URLs returning 404s. Currently, the site does not have any products uploaded. From my knowledge, I have to either: canonicalize the broken URLs to the new corresponding product pages, or request Google to remove the broken URLs (I believe this is only a temporary solution, for Google honors URL removal requests for 90 days). What is the best way to approach this situation? If I set up canonicalization, would I have to recreate the deleted pages (to match the URL addresses) and have those pages redirect to the new product pages (canonicalization)? Alex

    | byoung86
    0

  • Good morning, Over the last three months I have gone about replacing and removing all the duplicate content (1000+ pages) from our site top4office.co.uk. It has now been just under 2 months since we made all the changes, and we are still not showing any improvements in the SERPs. Can anyone tell me why we aren't making any progress, or spot something we are not doing correctly? Another problem: although we have removed 3000+ pages using the removal tool, searching site:top4office.co.uk still shows 2800 pages indexed (before there were 3500). Look forward to your responses!

    | apogeecorp
    0

  • We have 5 domain aliases of our existing site.
    All 5 are aliases of our main site, which means every alias serves exactly the same site and content.
    Main domain: www.mywebsite.com
    Domain aliases: www.myproduct.com, www.myproduct2.com, www.myproduct3.com
    If anybody opens www.myproduct.com, it serves the same website as the primary domain. What can I do to rank all of these websites without any penalty... is there any way? (This is what a domain alias means in the hosting industry.) Thanks

    | unibiz
    0

  • Hi, One of our websites was hit by the Penguin update and I now know where the links are coming from. I have the chance to get the links removed from those linking sites, but I am a little confused whether I should just have the links removed or disavow them? Thanks

    | Rubix
    0

  • So I have been working with a company, running their SEO, for close to two years now. Since I started to engage with them they have always used a very simple pop-up the first time an end user visits their website (via JavaScript and cookies). The pop-up simply asks them if they would like to download a solutions brochure from the website. So as far as pop-ups go, it is at least relevant. The client loves this pop-up; I do not. For a while we have always held spots #1-3 for a lot of our keywords, but we have started to drop lower on the first page. So I have been researching to see if some of the new algorithm changes are targeting sites with this type of functionality. If I have some data I could definitely get them to remove it. So the question is, do pop-ups hurt your organic ranking? Thanks for the input! Kyle

    | kchandler
    1

  • Hi, I need to know whether rel=next and rel=prev are appropriate only for content-based articles and not for blog listings. For an article spread across 3 pages, rel=next/prev makes sense, as the content of the article is a series. However, for blog listing pages 1, 2, 3, 4, where every page is unique because the blog lists independent articles, would rel=next/prev not be of much help? Our blog: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=latest&Itemid=91 This is what was said by the developer: "The whole idea of adding the 'next' and 'previous' tags in the header is only when your single blog post has permalinks like: site.com/blog/entry/blog-post.html
    site.com/blog/entry/blog-post.html?page=1
    site.com/blog/entry/blog-post.html?page=2 The link in the head is only applicable when your content is separated into multiple pages, and it doesn't actually apply on listings. If you have a single blog post that is broken down into multiple pages, this is applicable, and it works similarly to rel='canonical'." Can we safely ignore the rel=next and rel=prev tags for this blog's listing-page pagination?

    | Modi
    0
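As a reference for the tags under discussion, a minimal sketch of how rel="prev"/rel="next" chain across a paginated series; the URLs below are hypothetical, and the tags belong in the <head> of each component page:

```html
<!-- On page 2 of a 3-page series: declare both neighbouring pages -->
<link rel="prev" href="http://example.com/article?page=1">
<link rel="next" href="http://example.com/article?page=3">
```

The first page of the series carries only a rel="next" tag, and the last page only a rel="prev".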

  • Hi, So I have my great content (that contains a link to our site) that I want to distribute to high-quality, relevant sites in my niche as part of a link building campaign. Can I distribute this to lots of sites? The reason I ask is that those sites will then have duplicate content with all the other sites I distribute the content to, won't they? Is this duplication bad for them and/or us? Thanks

    | Studio33
    0

  • Dear friends, I just got two "Unnatural inbound links" notifications from Google via Google Webmaster Tools; the first is for the WWW version of our site and the second is for the NON-WWW version. My question: should I send two identical reconsideration requests for the WWW and NON-WWW versions, or treat them as different sites? Thank you Claudio

    | SharewarePros
    0

  • I've got a website that has lots of valuable content and tools but it's been hit too hard by both Panda and Penguin. I came to the conclusion that I'd be better off with a new website as this one is going to hell no matter how much time and money I put in it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today. I'd like to move some of that content to this other domain but I don't want to do 301 redirects as I don't want to pass bad link juice. I know I'll lose all links and visitors to the original website but I don't care. My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and wait until they don't appear in Google's index. Then I'd move them over to the new domain to be indexed again. Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!

    | sbrault74
    1

  • If we search site:domain.com vs site:www.domain.com, we see 130,000 vs 15,000 results. When reviewing the site:domain.com results, we're finding that the majority of the URLs showing are blocked by robots.txt. They are subdomains that we use as production environments (and contain similar content to the rest of our site). And we also see the message: "In order to show you the most relevant results, we have omitted some entries very similar to the 541 already displayed." SEER Interactive mentions that this is one way to gauge a Panda penalty: http://www.seerinteractive.com/blog/100-panda-recovery-what-we-learned-to-identify-issues-get-your-traffic-back We were hit by Panda some time back - is this an issue we should address? Should we unblock the subdomains and add noindex, follow?

    | nicole.healthline
    0

  • We've got a series of articles on the same topic and we consolidated the content and pasted it altogether on a single page. We linked from each individual article to the consolidated page. We put a noindex on the consolidated page. The problem: Inbound links to individual articles in the series will only count toward the authority of those individual pages, and inbound links to the full article will be worthless. I am considering removing the noindex from the consolidated article and putting rel=canonicals on each individual post pointing to the consolidated article. That should consolidate the PageRank. But I am concerned about pointing a rel=canonical to an article that is not an exact duplicate (although it does contain the full text of the original--it's just that it contains quite a bit of additional text). An alternative would be not to use rel=canonicals, nor to place a noindex on the consolidated article. But then my concern would be duplicate content and unconsolidated PageRank. Any thoughts?

    | TheEspresseo
    0

  • For my client I need to add some structure to their pages. The deepest pages are about restaurants and are sorted per city, and then per province as a larger silo. I want to do this: Homepage > Provinces > Cities > Restaurant page. This structure is optimal, but as a usability freak I prefer making the experience cool for the users. I want to add interactive pictures that are cool for the user and hopefully readable by the Google bots. I want to do it like this: the homepage shows a map of my country with the twelve provinces outlined, which light up when you hover over them. When you click a province you get to the province page. On the province page you see a large image of the province with all the cities where there are restaurants; when you hover over a city it grows a little, and when you click it you arrive at the city page, where you will find a list of all the restaurants available in that city. What I need to know is: is it possible for Google to see these pictures as a nice site structure? Or do I need to add the ugly footer links and have pages with lists of links...? And what is the smartest way to structure this - Flash?

    | Lebron27
    0
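One hedged sketch of a markup-only alternative: a plain HTML image map keeps the clickable-picture experience while exposing its targets as ordinary crawlable links via <area href>, which Flash does not. All names, coordinates and URLs below are invented for illustration:

```html
<!-- Clickable province map: each <area> is a normal crawlable link -->
<img src="/img/country-map.png" usemap="#provinces" alt="Map of provinces">
<map name="provinces">
  <area shape="poly" coords="12,8,85,14,60,92"
        href="/provinces/north" alt="North Province">
  <area shape="poly" coords="95,20,168,26,142,98"
        href="/provinces/south" alt="South Province">
</map>
```

The hover-highlight effect would then be layered on with CSS/JavaScript, leaving the underlying links intact for crawlers.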

  • Can someone help analyse whether the Schema tags and RDFa microdata are correct on these pages? - http://www.mycarhelpline.com/index.php?option=com_newcar&view=product&Itemid=2&id=106&vid=361 - http://www.mycarhelpline.com/index.php?option=com_newcar&view=product&Itemid=2&id=22&vid=6 The reason I am asking is that many sites have reported that the rich snippet shows in the testing tool, but the microdata does not actually show up on the search engines. Many thanks

    | Modi
    0

  • Hi, We have 4 different sites on four different domains. Each domain serves a different market and sells different products. However, all 4 sites have the same logo. We are thinking of implementing the schema.org logo markup for our logo - http://googlewebmastercentral.blogspot.ca/2013/05/using-schemaorg-markup-for-organization.html Does anyone know if Google might see the identical logos (even if Google can't actually see an image like a human) on different domains as something spammy? Many thanks

    | CeeC-Blogger
    0
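For reference, the markup style described in the linked Google post is a small schema.org Organization block; the domain and file paths here are placeholders:

```html
<div itemscope itemtype="http://schema.org/Organization">
  <a itemprop="url" href="http://www.example.com/">Home</a>
  <img itemprop="logo" src="http://www.example.com/images/logo.png" alt="Company logo">
</div>
```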

  • Our site is about 4 months old now, although the domain is older. We are adding fresh new content, building a good Facebook/Twitter/Google+ presence, and undertaking good PR - but our ranking does not seem to be improving at all. Have I missed something obvious? Thanks.

    | jj3434
    0

  • Hello, My Moz report is showing an error for too many links on my sitemap and blog. The links on both pages are relevant, and I'm not sure if this needs to be sorted out, as I would have thought Google would expect sitemaps and blogs to have lots of links. If I were to reduce the number of links, how much of a positive effect would it have on my site? If any of you feel it is best practice to reduce the number of links on these particular pages, do you have any suggestions on how I can tackle this? http://www.dradept.com/blog.php http://www.dradept.com/sitemap.php Thank you Christina

    | ChristinaRadisic
    0

  • I am having an internal debate on the need to use nofollow tags on sponsored internal links that link to internal pages. One thought is based on this Matt Cutts video  (Should internal links use rel="nofollow"?) in which he says that there is never a need to use a nofollow tag on an internal link. The other school of thought is that paid links with follow tags are a violation of Google policy and it does not matter if they link internally or externally. Matt was just not thinking of this scenario in his short video. Would love to hear if anyone has had any manual action from Google based on their internal links.

    | irvingw
    0

  • We are considering adding roughly 1,300 pages to a 2,300-page website within the drug rehab niche. Our website is generating roughly 10,000 uniques from search per month. Is there a way to estimate the change in traffic to the existing content on the site when we add 30-40% more pages in the form of a directory? Is there a way to estimate the effect of the existing traffic and links on our newly added part of the site (the directory)?

    | alltreatment
    0

  • Hi mozzers, Our Spanish homepage doesn't seem to be indexed or cached in Google, despite being online for over a month or two. All Spanish subpages are indexed and have started to rank but not the homepage. I have submitted sitemap xml to GWTools and have checked there's no noindex on the page - it seems to be in order. And when I run site: command in Google it shows all pages except homepage. What could be the problem? Here's the page: http://www.bosphorusyacht.com/es/

    | emerald
    0

  • Hello here, I am trying to find out how I can filter pages in Google Analytics according to their bounce rate. The way I am doing it now is the following: 1. I am working inside the Content > Site Content > Landing Pages report. 2. Once there, I click the "advanced" link on the right of the filter field. 3. Once there, I define it to "include" "Bounce Rate" "Greater than" "0.50", which should show me which pages have a bounce rate higher than 0.50... instead I get the following warning on the graph: "Search constraints on metrics can not be applied to this graph". I am afraid I am using the wrong approach... any ideas are very welcome! Thank you in advance.

    | fablau
    0

  • One of the warnings from SEO Moz says that we have "too many on page links" on a series of pages on my website. The pages it's giving me these warnings on are on my printing sample pages. I'm assuming that it's because of my left navigation. You can see an example here: http://www.3000doorhangers.com/door-hanger-design-samples/deck-and-fence-door-hanger-samples/ Any suggestions on how to fix this warning? Thanks!

    | JimDirectMailCoach
    0

  • Dear All, I hope you can help me with another question about doing SEO for a large site: 1 - My domain is 11 years old, but has been a parked domain that whole time
    2 - We have 10,000 articles of unique content (500-1500 words each)
    3 - The remaining pages are automated content; however, they are also unique, with data (numbers, figures) We are going to launch in 2 weeks, and intend to do the following: Stage 1 (first 2 months): only post the 10,000 articles with unique content, NOT using the automated ones.
    Link building: get 5-10 authority links pointing to the site, either through article writing or link pages (authority links like the Yahoo directory/DMOZ). Stage 2 (months 3 to 6): gradually put the automated content online while still posting unique, well-written articles.
    Link building: start building links with PR websites and article submissions. Do you think there are any problems with this plan? And can 5-10 links improve our site's ranking, given that it has a lot of unique content? Thank you very much. BR/Tran

    | SteveTran2013
    1

  • We are currently ranked #2 locally (NJ) for "IT support", and #27 for "it support". This is a fairly recent development - like today. I know "it" is a stop word, but I have never seen this before. The funny thing is that this is the only "IT" search where this happens. Most of my keywords contain "IT". What's up with this?

    | CsmBill
    0

  • Dear Mozzers, I hope you can help me with the following problem: My site has been up and running for a year now, and there may be a problem with the homepage, because it ranks on the first page for a competitive keyword on Google.com and Google.com.au only; in other countries it does not rank well, and an internal page shows up instead. Google.com/Google.com.au: homepage ranks top 10 (example.com)
    Other countries (.co.uk, .ca, etc.): an internal page shows up, example.com/internalpage.html, on page 3-4. I cannot find the homepage of example.com anywhere in the top 1000. Can you please tell me what the potential problems are? Thank you very much. BR/Tran

    | SteveTran2013
    0

  • Currently, we manage a site that generates content from a database based on user search criteria such as location or type of business. Although we currently rank well - we created the website based on providing value to the visitor, with options for viewing the content - we are concerned about duplicate content issues and whether they would apply. For example, the listing that is pulled up for the user from one search could have the same content as another search, but in a different order. Similar to hotels who offer room booking by room type or by rate. Would this dynamically generated content count as duplicate content? The site has done well, but we don't want to risk any future Google penalties caused by duplicate content. Thanks for your help!

    | CompucastWeb
    1

  • We have a client with many archived newsletters links that contain tracking code at the end of the URL. These old URLs are pointing to pages that don't exist anymore. Is there a way to set up permanent redirects for these old URLs with tracking code? We have tried and it doesn't seem to work. Thank you!

    | BopDesign
    0
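One possible culprit, sketched under the assumption of an Apache 2.4+ server (the paths are hypothetical): redirect rules match only the URL path, so a trailing "?utm_source=..." never prevents a match, but the query string is carried over to the target unless explicitly discarded with the QSD flag:

```apache
# Hypothetical mod_rewrite rule: 301 an old newsletter URL to its
# replacement and drop any tracking query string.
# QSD = query string discard, available in Apache 2.4+.
RewriteEngine On
RewriteRule ^newsletters/2012/issue-5\.html$ /news/archive/ [R=301,L,QSD]
```

If rules were written to match the full URL including the "?..." portion, they would never fire, which is one common reason such redirects "don't seem to work".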

  • We have a site offering a voip app in 4 languages. Users are currently 302 redirected from the root page to /language subpages, depending on their browser language. Discussions about the sense of this aside: Is it correct to use a 302 redirect here or should users be 301 redirected to their respective languages? I don't find any guideline on this whatsoever...

    | zeepartner
    1

  • Hey guys/gals, I have a question please. I have a computer repair business that does extremely well in search and is on the front page of Google for anything computer repair related. However, I am currently re-branding my company and have completely redesigned every aspect of the UI and the SEO site structure, as well as having written vastly different content and different title tags and meta descriptions for each page. So basically, when doing a migration we know that we want to keep our content, titles, headlines and meta descriptions the same, so as to not lose our page rank. Seeing that I have completely gone against the grain in all directions on a much-needed company re-branding, and everything is completely different from the old site, is it even worthwhile 301 redirecting my old URLs to the new ones that would (best) correspond with them? In the plainest English: would I do better at ranking the new website QUICKER without doing 301 redirects from the OLD to the NEW? In an EXTREME instance like what I have done, would the domain migration IMPEDE me ranking the new site, seeing how nothing is the same? I have built a rock-solid SILO site architecture on the new site, which is WordPress using the Thesis framework, and the old domain is built on Joomla 1.5. Thanks fellas, Marshall

    | MarshallThompson
    0

  • Hi all, I have a site where star ratings are being used in rich snippets. Up until about 8 weeks ago these were displaying in SERPs as normal. They have since stopped being displayed in SERPs even though Google's Rich Snippets testing tool says that the markup is correct and they display within the test tool environment. I'm just wondering if anybody else has had the same problem and if there's a solution? Thanks, Elias

    | A_Q
    1

  • Hi, We have a blog that is killing our SEO, and we need to disallow the following: Disallow: /Blog/?tag*
    Disallow: /Blog/?page*
    Disallow: /Blog/category/*
    Disallow: /Blog/author/*
    Disallow: /Blog/archive/*
    Disallow: /Blog/Account/
    Disallow: /Blog/search*
    Disallow: /Blog/search.aspx
    Disallow: /Blog/error404.aspx
    Disallow: /Blog/archive*
    Disallow: /Blog/archive.aspx
    Disallow: /Blog/sitemap.axd
    Disallow: /Blog/post.aspx But allow everything below /Blog/Post. The disallow list seems to keep growing as we find issues, so rather than adding every area to disallow into our robots.txt: is there a way to easily just say Allow /Blog/Post and ignore the rest? How do we do that in robots.txt? Thanks

    | Studio33
    0
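A hedged sketch of the shorter form being asked about. Google resolves Allow/Disallow conflicts by the most specific (i.e. longest) matching rule, so a broad Disallow plus a narrow Allow covers the intent; note that other crawlers may interpret precedence differently, and robots.txt paths are case-sensitive, so /Blog/Post and /Blog/post are distinct prefixes:

```
User-agent: *
# Keep posts crawlable; block everything else under /Blog/.
# For Googlebot the longer (more specific) rule wins the conflict.
Allow: /Blog/Post
Disallow: /Blog/
```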

  • As the title says: have CSS galleries, infographics, video and social bookmarks been overdone to the point that they are not worth doing for links? Even if they are low-quality links in the sense that they only pass a small amount of juice, that is fine, but I want to know: are these links bad (the same as 99% of directories), the kind that can get you penalised? I am in a pretty uncompetitive niche but need to build links up fast (this will also diversify my link graph), so would these types of links be OK? Or what would you suggest for low-value (not low-quality) links?

    | BobAnderson
    0

  • On Sunday 26th May, for about 40 minutes, we had about 25-30 direct visits from San Jose (we are a UK site). During this time our rankings increased dramatically and then as soon as the direct visits disappeared, our rankings went back to how they were prior to them visiting the site.

    | Jonnygeeuk
    0

  • A bit sceptical here: due to dynamic URLs and some other linkage issues, Google has crawled URLs with backslash and quote characters, e.g. www.xyz.com/\/index.php?option=com_product and www.xyz.com/\"/index.php?option=com_product Now, %5C is the encoded version of \ (backslash) and %22 is the encoded version of the double quote. I need to know, for the directive: User-agent: * Disallow: /\ As I am disallowing all backslash URLs through this, will it remove only the backslash URLs, which are duplicates, or the entire site?

    | Modi
    0
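A hedged sketch of a narrower rule: Disallow values are matched as URL-path prefixes, so anchoring the rule to the encoded backslash segment should block only those malformed duplicates rather than the entire site (assuming the crawler compares against the percent-encoded path, as Google's robots.txt documentation describes):

```
User-agent: *
# Blocks only paths beginning with an encoded backslash,
# e.g. /%5C/index.php?... and /%5C%22/index.php?...
Disallow: /%5C
```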

  • Hello, I own the virtualsheetmusic.com website and we have several thousand media files (MP3 and MIDI files) that Google can potentially index. If that's the case, I am wondering if that could cause "duplicate" issues of some sort, since many of these media files have exactly the same file names or the same meta information inside. Any thoughts about this issue are very welcome! Thank you in advance to anyone.

    | fablau
    0

  • I've been looking at a lot of Silo illustrations and reading a lot on the optimal silo structure lately.  In many of the illustrations I see the Silos are all linking down or up in the structure, but not much sidelong action.  But I read about how you are supposed to have a "mini sitemap" on each page in the silo that links to every other page in the silo.  Is this really a good idea?  Seems to me you would only want to link up & down in the structure, or at most have links to the "next" & "previous" parts of the silo (sideways).  Having all those links on a page would just dilute the link juice wouldn't it?  I hardly ever see illustrations for linking sideways between pages in a silo, yet there seems to be a lot of talk about it, which is correct?

    | DownPour
    0

  • We became a charity in December and redirected everything from resistattack.com to resistattack.org. Both sites weren't up at the same time; we just switched over. However, GWT still shows the .com as a major backlinker to the .org. Why? More importantly, our site just got hit for the first time by an "unnatural link" penalty according to GWT. Our traffic dropped 70% overnight. This appeared shortly after a friend posted a sitewide link from his site that suddenly sent 10,000 links to us. I figured that was the problem, so I asked him to remove the links (he has) and submitted a reconsideration request. Two weeks later, Google refused, saying: "We've reviewed your site and we still see links to your site that violate our quality guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes." We haven't done any "SEO link building" for two years now, but we used to publish a lot of articles to ezinearticles and isnare back in 2010/2011. They were picked up and linked from hundreds of spammy sites, of course, none of which we had anything to do with. They are still being taken and new backlinks created. I just downloaded GWT's latest backlinks and it's a nightmare of crappy article sites. Should I delete everything from EZA/isnare and close my accounts? Or just wait longer for the 10,000 links to be crawled and removed from my friend's site? What do I need to do about the spammy article sites? Disavow tool or just ignore them? Any other tips/tricks?

    | TellThemEverything
    0

  • Hi All, When I started my site (an eCommerce site) I copied (or tried to) a lot of things from the best eCommerce sites I thought were out there - sites like Zappos, ZALES, Overstock, BlueNile, etc. I got hit pretty hard with the latest algo changes, and I posted my question at the Google Webmaster Help forum. I received answers from gurus that we are keyword stuffing, etc. (mainly with internal links to product pages, but other issues as well). My answer was a link to Zappos and other sites, showing that what we do is nothing compared to them. I also showed dozens of SEO "errors", like using the H1 tag 10 times per page, not using canonicals, and many other issues. The guru's answer was "LOL" - who am I to compare myself to Zappos. So the question is... can we take them as an example, or are they first simply because they are the biggest?

    | BeytzNet
    0

  • Hi, I have a classifieds website and I am wondering about the life of a page with an ad. An announcement has a limited life, so what should happen when it expires: a 404 page? A 301 redirect to its section? Or leaving the content up without redirection? What is your opinion? Sorry for my english, i'm french 😉 Thanks. A.

    | android_lyon
    0

  • Hi All, The fun thing about our industry is that, unlike poker, most cards are open. While trying to learn what the big guys are doing, I chose to focus on www.Zappos.com - one of the largest sportswear retailers (especially shoes). I looked at how they categorize, how they interlink, and at their product pages. I have a question about duplication, in an age where it is SO important.
    If you look in their running sneakers category, you'd see that they show the same item (in different colors) as separate items - how are these pages not considered duplication? It gets even worse: if you look inside a shoe page (a product page), in the tab "About the Brand", you'd learn that for all shoes from Nike (just an example) the "About the Brand" text is exactly the same. This is about 90% of the page for hundreds of Nike shoe pages, and the same goes for all other brands. How come they are ranked so high and not penalized in the era of Panda?
    Is it as always - big brands get away with anything and everything? Here are two example shoe pages:
    Nike Dart 10 (a)
    Nike Dart 10 (b) Thanks!

    | BeytzNet
    2
