
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I'm not sure what the correct terminology would be for this, but I'm calling it an ad wall: essentially an ad overlay shown when someone enters a website. I see this most commonly on certain news websites. For example, when you click a link to an article on IGN or forbes.com you get an ad that you have to close or skip before you can read the article. What are the SEO considerations when implementing something like this? I'm wondering if they are similar to a paywall, in the sense that you want to let crawlers in to see and rank your content, while users get an ad (or are redirected to an ad and then back to the article page). This link currently does it for me, for example: http://www.forbes.com/sites/tjmccue/2012/05/22/spacex-launches-with-15-dreams-onboard/ If I set my user agent to Googlebot I go straight through to the article, but with the browser default I get an ad page I have to skip first. Is this the infamous "white hat cloaking"? Are there other ways to implement the same idea (a modal window that opens via JavaScript, for example) that are more or less risky? I'm mainly interested in doing this based on referrer: people who type a URL directly don't see it, but people clicking through from a link do, for example.

    | IrvCo_Interactive
    0
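
    For reference, a minimal sketch of the referrer-based variant mentioned at the end, assuming an overlay element with the placeholder id "ad-overlay" already in the page markup:

        <script>
        // Show the overlay only to visitors arriving from another site;
        // direct/typed-in visits have an empty document.referrer.
        var ref = document.referrer;
        if (ref && ref.indexOf(location.hostname) === -1) {
          document.getElementById('ad-overlay').style.display = 'block';
        }
        </script>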

  • Hi, if I create blog posts inside my website and link them back to my website, would they still rank? I understand that it's better to get links from established domains, but I just wonder what kind of impact my site would see if the blog posts within my site link back to it for ranking. Please let me know. Thanks

    | zsyed
    0

  • Like many people, I get a lot of alerts about duplicate content, etc. I also don't know if I am hurting my domain authority because of the forum. It is a pretty active forum, so it is important to the site. So my question is: right now there could be 50 pages like this:
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-1
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-2
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-3
    all the way to:
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-50
    Right now the rel canonical links are set up just like above, including the page numbers. I am not sure if that is the best way or not. I really thought that all of the canonicals for that topic should be
    <domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/
    so that they would pass "juice" to the main topic link. I do have other links set up for link rel='next', link rel='up', and link rel='last'. Overall, is this correct, or is there a better way to do it?

    | BrickPicker
    0
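
    For reference, a minimal sketch of the two patterns being weighed above, shown for the page-2 URL (<domain> is a placeholder as in the question; which pattern is right is exactly the judgment call here):

        <!-- Option A: point every paginated page's canonical at the main topic URL -->
        <link rel="canonical" href="<domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/" />

        <!-- Option B: self-referencing canonical plus pagination hints -->
        <link rel="canonical" href="<domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-2" />
        <link rel="prev" href="<domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-1" />
        <link rel="next" href="<domain>/forum/index.php/topic/6043-new-modular-parisian-restaurant-10243-is-here/page-3" />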

  • Hi all, I have a question about Google correctly accessing a site that has a 301 redirect to https on the homepage. Here's an overview of the situation, and I'd really appreciate any insight from the community on what the issue might be. Background info:
    My homepage is set up as a 301 redirect to an https version of the homepage (some users log in, so we need the SSL). Only 2 pages on the site are under SSL; the rest of the site is http. We switched to the SSL in July but have not seen any change in our rankings despite efforts increasing backlinks and output of content. Even though Google has indexed the SSL page of the site, it appears that it is not linking up the SSL page with the rest of the site in its search and tracking. Why do we think this is the case? The diagnosis:
    1) When we do a Google Fetch on our http homepage, it appears that Google is only reading the 301 redirect instructions (as shown below) and is not finding its way over to the SSL page, which has all the correct page title and meta information.

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <title>301 Moved Permanently</title>
        Moved Permanently. The document has moved here (https://mysite.com/).
        Apache/2.2.16 (Debian) Server at mysite.com

    2) When we view a list of external backlinks to our homepage, it appears that the backlinks built after we switched to the SSL homepage have been separated from the backlinks built before the SSL. Even in Open Site Explorer, we only see the backlinks acquired before we switched to the SSL, and none of the backlinks added after the switch. This leads us to believe that the new links are not adding any value to our search rankings.
    3) When viewing Google Webmaster Tools, we receive no information about our homepage, only about all the non-https pages. I added an https account to Google Webmaster Tools, and in that version we ONLY receive information about our homepage (and the other SSL page on the site).
    What is the problem? My concern is that we need to do something specific with our sitemap or with the 301 redirect itself in order for Google to read the whole site as one entity and report the backlinks as one site. Again, Google is indexing all of our pages, but it seems to be doing so in a disjointed way that is breaking down the link juice and value being built up by our SSL homepage. Can anybody help? Thank you for any advice or input you might be able to offer. -Greg

    | G.Anderson
    0

  • Hello Mozzers, I often seem to work on websites with several types of sitemaps - e.g. an HTML sitemap and an XML sitemap - almost always with identical structure and content. Does anybody know the thinking behind this? I'm currently looking at a site with a PHP sitemap and an XML sitemap sitting alongside one another. I'm guessing one is for site users to read (and also to aid indexing) and the other is for search engines, to further aid indexing. Does Google have any preferences? Is there anything you should be wary of re: Google if there are multiple sitemaps?

    | McTaggart
    0

  • I'm looking for some guidance on an issue I believe we created for ourselves, and on whether we should undo what we did. We recently added attribute search to our sites. This of course created a bunch of dynamically generated URLs. For various reasons, it was decided to take some of our existing static URLs and 301 redirect them to their dynamic counterparts. Ex: .../Empire-Paintball-Masks-0Y.aspx now redirects to .../Paintball-Masks-And-Goggles-0Y.aspx?Manufacturer=Empire Many of these static URLs had top-3 rankings for their associated keywords. Now we don't rank for anything. I realize that 301 redirecting is the way to go... if you NEED to. My guess is our drop in keyword rankings is directly tied to what we did. I'm looking for a solid argument to make to my boss as to why we should not have done this and why it more than likely has resulted in dropped keyword rankings and organic traffic. I welcome any input. Also, if we decided to revert (remove all 301 redirects and de-index all dynamic URLs), what is the likelihood we can recapture some of this lost organic traffic? Can I disallow indexing in a robots.txt file to remove, say, anything with a '?' in the URL? Would the above URL example (which was ranking in the top 3 in SERPs) have a good chance of finding its way back? Thanks

    | Istoresinc
    1
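
    For reference, a minimal robots.txt sketch of the wildcard rule asked about above (note that Disallow only blocks crawling; it does not by itself remove pages already in the index):

        User-agent: *
        # Block crawling of any URL containing a query string
        Disallow: /*?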

  • Hello guys, I have a few questions regarding Google manual penalties for unnatural link building. They are "partial site" penalties, not site-wide. I have two sites to discuss. 1. This site used black hat tactics and bought 1000's of unnatural backlinks. This site doesn't rank for the main focus keywords and traffic has dropped. 2. This site has the same penalty, but has been all white hat, never bought any links or hired any SEO company. It's all organic. This site's organic traffic doesn't seem to have taken any hit or been affected by any Google updates. Based on the research we've done, Matt Cutts has stated that sometimes they know the links are organic, so they don't penalize a website, but they still show a penalty in WMT: "Google doesn't want to put any trust in links that are artificial or unnatural. However, because we realize that some links may be outside of your control, we are not taking action on your site's overall ranking. Instead, we have applied a targeted action to the unnatural links pointing to your site." "If you don't control the links pointing to your site, no action is required on your part. From Google's perspective, the links already won't count in ranking. However, if possible, you may wish to remove any artificial links to your site and, if you're able to get the artificial links removed, submit a reconsideration request. If we determine that the links to your site are no longer in violation of our guidelines, we'll revoke the manual action." Check that info at this link: https://support.google.com/webmasters/answer/2604772?ctx=MAC Recap: Does anyone have any experience like site #2? We are worried that this site has this penalty, but we don't know if Google is stopping us from ranking or not, so we aren't sure what to do here. Since we know 100% the links are organic, do we need to remove them and submit a reconsideration request? Is it possible that this penalty can expire on its own? Are they just telling us we have an issue but not hurting our site b/c they know it's organic?

    | WebServiceConsulting.com
    0

  • Hey everyone, our site has multiple domains and I'm wondering how that affects search rankings today. I saw some discussion from almost a year ago, but I'm not sure if something has changed. We currently have our root domain "www.xyz.com" and have started moving some pages over to a different sub-domain, "web.xyz.com", because of usability and ease of adjusting content. How much will this affect our SEO? Thanks!

    | josh123
    0

  • I am using Micro Niche Finder. Keyword: send SMS online. Domain available: .org. Avg. monthly searches: 1600. AdWords cost: 2.12. Measure of backlinks: 633. SOC: 20 (green). Most of the backlinks, per Majestic SEO, appear to be coming from sendhub.com and massmailsoftware.com. I researched sites with these keywords, and those sites appear to have tiered pricing plans and to be charging money for their services. My plan: put some effort into organic SEO and rank this site to page 1, get some opt-in users, and start communicating with them to see what their pain point is. How can I tell what it would take to rank this page to page 1 on Google? Would it take a lot of blog articles for backlinks, and if so... how many? Is a rough estimate possible? Thanks.

    | zsyed
    0

  • Does having the "web" prefix in the domain name, such as in web.pennies.com/copper, hurt SEO?

    | josh123
    0

  • I have a website that deals with personalized jewelry, and our main keyword is "Name Necklace". 3 months ago I added a new page: http://www.onecklace.com/name-necklaces/ Since then, Google has indexed only this page for my main keyword, and not our home page. Because the page is new and we don't have a lot of links to it, our rank is not so good. I'm considering removing this page (301 to home page), because I think that if Google indexed our home page for this keyword it would be better. I'm not sure if this is a good idea, but I know that our home page has a lot of good links, and maybe our rank would be higher. Another thing: because Google indexes this internal page for this keyword, it looks like our home page has no main keyword at all. BTW, before I added this page, Google indexed our main page for this keyword. Please advise...

    | Tiedemann_Anselm
    0

  • Hi, I was window shopping at Gemvara and noticed something interesting... They rank very high for long-tail phrases such as "rose gold engagement rings", and on the pagination pages for that category, not only do they point the canonical to the main category page (which is logical), but they also "NOINDEX NOFOLLOW" those pages... Is that recommended? Thanks

    | BeytzNet
    0

  • I'm working on this site: www.aldodavico.com - a real estate agent in Miami. Any ideas/best practices for SEO for a site like this one? It's got about 500 pages. I've never dealt with such a huge site before.

    | mrodriguez1440
    0

  • Can you break up a search query across a sentence and have Google still recognize which query you are targeting? Let's say I'm trying to rank a page for the phrase "best haircuts calgary". Is Google's algorithm advanced enough to look at page title "Best Haircuts - Where To Get Them In Calgary" and know it's targeting the query "best haircuts calgary"? If it can't do this right now, I could see it advancing to this at some point in the future, which would then change the game quite a bit in terms of how creative you can get creating pages for queries.

    | reidsteven75
    0

  • The site is http://shop.riversideexports.com We checked Webmaster Tools; nothing strange. Then we manually resubmitted using Webmaster Tools about a month ago. Now we're seeing only about 15 pages indexed. The rest of the sites on our network are heavily indexed and ranking really well, BUT the sites that are using a sub-domain are not. Could this be a sub-domain issue? If so, how? If not, what is causing this? Please advise. UPDATE: What we can also share is that the site was cleared twice in its lifetime - all pages deleted and re-generated. The first two times we had full indexing; now this site hovers at 15 results in the index. We have many other sites in the network with very similar attributes (such as redundant or empty meta) and none have behaved this way. The broader question is: how do we get the indexing back?

    | suredone
    0

  • Hey, I'm trying to redirect all instances of "/archive_details.php?id=*" to "/public-affairs-job-archive.php". Is the below code correct? Redirect 301 /archive_details.php?id=* /public-affairs-job-archive.php Thanks, Luke.

    | NoisyLittleMonkey
    0
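
    For reference: the Redirect directive matches only the URL path, never the query string, so the rule above won't fire for "?id=...". A minimal sketch of the mod_rewrite equivalent, assuming Apache with mod_rewrite enabled in .htaccess (paths as given in the question):

        RewriteEngine On
        # Match any query string beginning with id=
        RewriteCond %{QUERY_STRING} ^id=
        # The trailing "?" on the target drops the original query string
        RewriteRule ^archive_details\.php$ /public-affairs-job-archive.php? [R=301,L]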

  • Hi all. I've been checking out http://www.unthankbooks.com/ as it seems to have some indexing problems. I ran a server header check and got a 200 response. However, it also shows the following header: X-Robots-Tag: noindex, nofollow. It's not in the page HTML, though. Could it be being picked up from somewhere else?

    | Blink-SEO
    0
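
    For reference, an X-Robots-Tag header like that is typically sent by the server config rather than the page. A quick way to see it, plus the kind of Apache mod_headers directive that would produce it (the directive is an assumption about what might be in the site's .htaccess or vhost, not its actual config):

        curl -I http://www.unthankbooks.com/

        # In .htaccess or the vhost config, something like this would send it:
        Header set X-Robots-Tag "noindex, nofollow"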

  • Hi all! I asked the question below a little while back and got some great responses. Most said that the link profile needed A LOT of work. This got me thinking: as it is quite a low-competition term, and I have control over a few of its inbound links, would it be easier to move to a new domain and start again? That is, no 301, just move the site and update it in WMT? Hi everybody. I've been working on this page for some time: http://www.double-glazing-forum.com/anglian-windows.aspx. Until several months ago, it ranked really well for the terms 'Anglian windows' and 'Anglian windows reviews'. However, following a Google update it tanked and has got worse ever since. Here's what I've done to try and fix it: added 800 words of unique copy; added YouTube videos; replaced scraped press releases with unique descriptions that link to the source; analysed the backlink profile and uploaded a disavow file containing all bad links; and contacted webmasters to remove them where possible. Getting a bit low on ideas now, so any help would be great!

    | Blink-SEO
    0

  • Hi! I have a new client. The former agency added the client's property under the agency's account, so we had to create a new GA account (as you can't transfer ownership at the account level), but we also kept access to the former account to preserve the historical data. We were granted owner access to GWT (which is more flexible; you can remove owners and creators) and we now want to remove the former agency's users. We have 3 addresses. One was verified with the delegation method (no problem for removal), one with the meta tag (no problem), and one with Google Analytics. Here it becomes tricky, as Google says regarding the GA verification method: "If this account was verified using a Google Analytics tracking code, you should make sure that the user you want to unverify is no longer an administrator on the Analytics account. Otherwise, removal may not be permanent." The thing is, this user has the same email address as the one used to create the agency's GA account (no ownership transfer), so I basically can't remove admin rights. The other possibility, as Google mentions when I try to unlink this user, is to "remove the administrator status in Google Analytics or delete the Google Analytics tracking code on the website". But we don't want to remove the code, as we still want to track data with the former account for historical analysis purposes. Has anyone ever faced this situation? Do you know how to handle this? Do you think that unlinking the GWT and GA accounts will unverify the GA method? Many thanks in advance! Ennick

    | ennick
    0

  • I'm in the process of trying to clean up a spammy link profile for a site I'm working on. I'm using the excellent data from Moz and the list of links from Google Webmaster Tools to come up with a list of sites, and Remove'em to manage the process, and before I go to Google I want to make sure the file I submit for the disavow process is as strong as possible. I am aware that I need to contact webmasters about three times to do the removal request properly. How long should there be between requests, and how long should I wait between submitting a final removal request and submitting the file to the disavow tool? Any advice welcome. Thanks.

    | johanisk
    0
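
    For reference, when you do reach the disavow stage, the file Google expects is a plain text file with one entry per line; a minimal sketch (all domains and URLs are placeholders):

        # Lines starting with # are comments and are ignored
        # Disavow one specific page
        http://spam.example.com/stuff/comments.html
        # Disavow an entire domain
        domain:shadyseo.example.com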

  • Hey guys, I'm a bit stuck. My on-page grade indicated the following two issues and I need to find out how to fix both. If you have a solution, could you please let me know how to address these issues? It's all a bit intimidating at the moment!! Thank you so much.
    Appropriate Use of Rel Canonical: If the canonical tag is pointing to a different URL, engines will not count this page as the reference resource and thus it won't have an opportunity to rank. Make sure you're targeting the right page (if this isn't it, you can reset the target above) and then change the canonical tag to reference that URL. Recommendation: We check to make sure that IF you use canonical URL tags, they point to the right page. If the canonical tag points to a different URL, engines will not count this page as the reference resource and thus it won't have an opportunity to rank. If you've not made this page the rel=canonical target, change the reference to this URL. NOTE: For pages not employing canonical URL tags, this factor does not apply.
    No More Than One Canonical URL Tag: The canonical URL tag is meant to be employed only a single time on an individual URL (much like the title element or meta description). To ensure the search engines properly parse the canonical source, employ only a single version of this tag. Recommendation: Remove all but a single canonical URL tag.

    | StoryScout
    1
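
    For reference, a minimal sketch of what both checks above are looking for - exactly one canonical tag in the page <head>, pointing at the URL you want to rank (the URL is a placeholder):

        <link rel="canonical" href="http://www.example.com/this-page/" />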

  • I am trying to rank for the phrase "a link between worlds walkthrough". I am on page 1, but there are several results that outrank me, and I cannot see any reason why they would. My site is hiddentriforce.com/a-link-between-worlds/walkthrough/ For that page I have 5 linking domains and varied anchor text that spans from things like "here" to a variety of related phrases. All of the links come from really good sites. My page has 1400 likes, 90 shares, and about 20 each in tweets and +1's, with a DA of 44 and a PA of 37. The 4th- and 5th-ranked sites both have WAY fewer social interactions, lower PA and DA, fewer links, etc. Yet they outrank me. Why?

    | Atomicx
    0

  • Hi, can anyone give me insight into how people are getting away with naming their business after the SEO search term, creating a BS Google+ page, and then having that page rank high in the search results? I am speaking specifically about the results you get when you Google "Los Angeles DUI Lawyer". As you can see from my attached screenshot (I'm doing the search in Los Angeles), the FIRST listing is a Google+ business. Strangely, the phone number listed doesn't actually take you to a DUI attorney, but rather to some marketing group that never answers the phone. Can anyone give me insight into why Google even allows this? I just find it odd that Google cares so much about the user experience but lets the first result be something completely misleading. I know it sounds like I'm just jealous (which I am, a little), but I find it disheartening that we work so hard on SEO and someone takes the top spot with an obvious BS page.

    | mrodriguez1440
    0

  • I need to know what link building or SEO strategies should be adopted after the latest Hummingbird update. I am really confused about it. Kindly help. Thanks

    | irfan20012
    0

  • Just wondering how I would go about creating something like this http://www.slideshare.net/coolstuff/the-brand-gap?from_search=1

    | BobAnderson
    0

  • http://schema.org/significantLink Schema.org has a definition for "non-navigation links that are clicked on the most." Presumably this means something like the big green buttons on Moz's homepage. But does anyone know how they affect anything? In http://moz.com/blog/schemaorg-a-new-approach-to-structured-data-for-seo#comment-142936, Jeremy Nelson says "It's quite possible that significant links will pass anchor text as well if a previous link to the page was set in navigation, effictively making obselete the first-link-counts rule, and I am interested in putting that to test." This is a pretty obscure comment, but it's one of the only results I could find on the subject. Is this BS? I can't even make out what all of it is saying. So what's the deal with significantLink, and how can we use it for SEO?

    | NerdsOnCall
    0
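
    For reference, a minimal sketch of how significantLink is actually marked up, assuming schema.org microdata on a WebPage scope (the link and anchor text are placeholders); whether it passes any anchor value, as the quoted comment speculates, is unverified:

        <body itemscope itemtype="http://schema.org/WebPage">
          <!-- A non-navigation link flagged as one of the page's most-clicked -->
          <a href="/pricing" itemprop="significantLink">See our pricing</a>
        </body>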

  • Hi everybody. Bit of an interesting question. I have a client that wants the following pages on their site indexed: example.fr/home.html on Google.fr for users based in France, and example.com/fr/home.html on Google.com for users not based in France. So they wish to have both pages indexed, but not displayed to the same geographic users. Not entirely sure of the best way to go about this, so any tips would be much appreciated!

    | Blink-SEO
    0
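
    For reference, one commonly suggested approach is hreflang annotations in the <head> of both pages, alongside geotargeting each host/folder in Webmaster Tools; a sketch only, and the exact targeting behaviour is an assumption rather than guaranteed:

        <link rel="alternate" hreflang="fr-FR" href="http://example.fr/home.html" />
        <link rel="alternate" hreflang="fr" href="http://example.com/fr/home.html" />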

  • Anyone good with GCS? I want to add a Google Custom Search to my site, but styled with my site's CSS. I need the results from GCS but want to display them with my website's CSS. The website is in osCommerce and PHP.

    | csfarnsworth
    0
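
    For reference, a minimal sketch of the Custom Search Element in results-only mode, which renders results in your own page markup so your stylesheet can override the default gsc-* classes (the cx value is a placeholder for your engine ID; based on the documented element snippet):

        <script>
          (function() {
            var cx = 'YOUR_ENGINE_ID';  // placeholder: your CSE id
            var gcse = document.createElement('script');
            gcse.type = 'text/javascript';
            gcse.async = true;
            gcse.src = (document.location.protocol == 'https:' ? 'https:' : 'http:') +
                '//www.google.com/cse/cse.js?cx=' + cx;
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(gcse, s);
          })();
        </script>
        <gcse:searchbox-only></gcse:searchbox-only>
        <gcse:searchresults-only></gcse:searchresults-only>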

  • I've been going over all the top ranking factors, running my site through Moz analytics and page graders, just researching the heck out of this, and I'm trying to figure out where we're going wrong. The site is www.imageworkscreative.com - and we're being outranked by newer sites.  We need to rank for terms like web design va, custom web design, web design firm, etc.  We publish blog updates on average once a week, and promote those via social media and a few syndication services. We were using a link builder who wasn't following our instructions regarding competitor backlinks ... pretty sure her work ended up hurting more than it helped. I'd like to hire a consultant or high-quality link builder to help get things going, for us and for our clients.  Any recommendations would be much appreciated, as well as any advice for us in terms of overall issues that need to be resolved. Thanks!

    | ScottImageWorks
    0

  • Hey, I am a confused canonical, and here's why - please help! I have a master website called www.1099pro.com and then many other websites that simply duplicate the material on the master site (i.e. www.1099A.com, www.1099T.com, www.1099solution.com, and the list goes on). These other domains & pages have been around long enough to garner some page authority & domain authority, making it worthwhile to redirect them to their corresponding pages on www.1099pro.com. The problem is two-fold when trying to pass this link juice: I do not have access to the web service that hosts the other sites/domains and cannot 301 redirect them, and the other sites/domains are set up so that whatever changes I make to www.1099pro.com are automatically distributed across all the other sites. This means that when I put a rel=canonical tag on www.1099pro.com it also shows up on all the other domains. It is my understanding that having a canonical tag (pointing at www.1099pro.com) on a site such as www.1099solution.com does not pass any link juice and actually eliminates that page from the search results. Is there any way that I can pass the link juice?

    | Stew222
    0

  • Hi, is there any advantage to using goo.gl/ to shorten a URL for Twitter instead of other shorteners? I had a thought that goo.gl/ might allow Google to track click-throughs and hence judge popularity.

    | S_Curtis
    0

  • As per Google Penguin, low-quality backlinks are going to affect a website's SERPs hugely. So we need to find all the bad backlinks and then remove them one by one. What I would like to know is: what tool do you use to find all the bad backlinks? And how do we know which is a bad backlink, or a bad website where our link should not be? Then, what service do you suggest for backlink removal? I contacted LinkDelete.com and they quoted me $97 for a month to remove all links in less than 3 weeks.
    Let me know what you suggest.

    | monali123
    0

  • Is it me, or do you not get (and are not supposed to get) email notifications when your site receives a manual action penalty? I can see it in the Google Webmaster Tools interface, but I never get notifications in my inbox. Is that how it works, or do I just need to set it up somehow?

    | VinceWicks
    0

  • Hello, I have a website with tags (which have the noindex tag) on each article post. I've been told that I should noindex/nofollow these tag pages, because link juice is being passed to them, and since they aren't getting indexed, it's wasting link juice that could be passed to a page that is actually getting indexed. What are your thoughts on this? Also, what would be the point of noindex/follow on a page, if you are noindexing that page? Isn't it just wasting link juice? What is the proper SEO way to optimize tags?

    | WebServiceConsulting.com
    0
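
    For reference, a minimal sketch of the two variants under discussion, placed in the <head> of a tag page:

        <!-- Keep out of the index, but let crawlers follow the links on the page -->
        <meta name="robots" content="noindex, follow">
        <!-- Keep out of the index AND stop links on the page from being followed -->
        <meta name="robots" content="noindex, nofollow">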

  • Hello, we have a site that is not updated very often - currently we have a script running to create/update the XML sitemap every time a page is added, edited, or deleted. I have a few questions about best practices for creating XML sitemaps.
    1. If the site is not updated for months on end, is it a bad idea to force the script to update, i.e. changing the dates once a month? Will Google notice that nothing has changed but the date, i.e. all the content on the site is exactly the same? Will they start penalising you for updating an XML sitemap when there is nothing new about the website?
    2. Is it worth automating the XML file to ping Bing/Google via webmaster tools - as I say, even if the site is never updated?
    3. Is the use of "priorities" necessary?
    4. The changefreq - does that mean Google/Bing expects to see a new file every month?
    5. The ordering of the pages - the script seems pretty random and puts the pages in a random order - should we make it order the pages with the most important ones first? Should the home page always be first?
    6. Below is a sample of how our XML sitemap appears - is there anything we should change, i.e. is it all marked up properly?

        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.domain.com</loc>
            <lastmod>2013-11-06</lastmod>
            <changefreq>monthly</changefreq>
          </url>
          <url>
            <loc>http://www.domain.com/contact/</loc>
            <lastmod>2013-11-06</lastmod>
            <changefreq>monthly</changefreq>
          </url>
          <url>
            <loc>http://www.domain.com/sitemap/</loc>
            <lastmod>2013-11-06</lastmod>
            <changefreq>monthly</changefreq>
          </url>
        </urlset>

    Hope someone can enlighten us on best practices.

    | JohnW-UK
    0

  • This question is about the new custom URLs for Google+ Local business pages: has anyone heard any success stories with requesting a custom URL different from the two reserved ones offered by Google, via contacting a Google rep by phone? And what advantages might there be for a local business in going with a very long custom URL such as google.com/+rosenbergAndDalgrenLLPFortLauderdale as opposed to just google.com/+RdLawyers? Does having the city name in the URL offer any SEO benefit? Thanks!

    | gbkevin
    0

  • I have no idea how this happened, but our sitemap was http://www.kempruge.com/sitemap.xml; now it's http://www.kempruge.com/category/news/feed/ and Google won't index it. It 404s. Obviously I had to have done something wrong, but I don't know what, and more importantly, I don't know how to find it in the back end of WordPress to change it. I tried a 301 redirect, but GWT still 404'd it. Any ideas? It's been like this for a few weeks and I've neglected it, so I can't just reset the site without losing a lot of work. Thanks, Ruben

    | KempRugeLawGroup
    0

  • We have a site that got hit by a non-manual penalty in July really hard. I submitted a disavow file for the site placeyourlinks.com which had a bunch of clearly spammy links to the site listed in Webmaster tools. But the site itself was down for a long time so I couldn't see where the links even were. Then those links disappeared from the links file. I thought the urls were removed or the site was seen as being blank. But now they're back...and the site itself is shown as just being a blank page. I don't know what to do since I don't want to disavow those links again if it wasn't even addressed the first time and there is obviously no way to contact the site. Help! Also, I've done a bunch of work on the site to increase the amount of content while I was waiting to see what happened with the link disavow. But now all that is done and our rankings are still waaaay down. I'm considering getting really, really aggressive with link removal and disavowing if needed but I'm not sure what I should focus on removing/disavowing. Really bad sites with only one or two links? Sites that have a lot of links to the site? Sites with keyword stuffy anchor text? Any help on this would be much appreciated.

    | Fuel
    0

  • I have multiple product landing pages that feature very similar, but not duplicate, content, and I am wondering if this could affect my rankings in a negative way. The main reason for the similar content is three-fold: continuity of site structure across different products; similar (or the same) product add-ons and support options, resulting in exactly the same additional tabs of content; and products that are themselves very similar, with 3-4 key differences. Three examples of these similar pages are below, although I do have different meta data and keyword optimization across the pages: http://www.1099pro.com/prod1099pro.asp http://www.1099pro.com/prod1099proEnt.asp http://www.1099pro.com/prodW2pro.asp

    | Stew222
    0

  • Hello all, I just started work on a new client's site that has been hit with multiple Google penalties. I was looking at their backlink profile and noticed they have numerous links from what seem to be very low-quality directory websites. My question is: when building citations and looking for directories to submit to, what makes one directory more credible than another? If most of them are just publishing links and business information, why does Google consider one credible and the other spammy? With some it's easy to tell whether they are credible, but with others it is not as easy. Should you only really be submitting to the best of the best, or are some lower-level ones OK too? I have read a few things on this topic, but most of it is older, and I just want to hear what people have to say on this today. Thanks.

    | Whebb
    0

  • Hello, a client is having one of their daily blogs published on an industry news site as well as on their own site. This is a clear-cut case for implementing a canonical tag on each blog page, right? Thanks

    | Martin_S
    0
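
    For reference, a minimal sketch of the usual syndication setup: the canonical goes in the <head> of the syndicated copy on the news site, pointing back at the client's original post (URLs are placeholders):

        <link rel="canonical" href="http://www.client-site.com/blog/original-post/" />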

  • My client wants keywords added to every product along with the product name; apparently some SEO guru told him that Hummingbird is all about key phrases and long-tail keywords. As I understand it, Hummingbird aims to understand the intent and contextual meaning of the query. The issue is that if I add the keyword (e.g. "oak furniture") to every product title - and we are using the Zen Cart platform - it will change the internal anchor text on the product listing pages, which will cause a cannibalization issue. Question 1: I need help explaining to the client that adding the keyword everywhere can be detrimental to rankings. Question 2: If I am wrong, do we then need to re-optimise the site? I have read http://moz.com/blog/how-to-solve-keyword-cannibalization Many thanks.

    | Adnan.Hassan.Khan
    0

  • Google has released a new penalty called "image mismatch", which penalizes sites that show Google images that are not the same as the ones served to users when they access the site. While I agree with penalizing sites where the image is completely different from the one shown in image search, lately I've seen lots of big sites using some kind of watermark or layer that reads something like "To see the high quality version of this image, click here" in order to "force" the user to visit the site hosting the image. Considering the latest changes to Google's image search, which made lots of sites lose their image search traffic, are these techniques considered part of the new penalty Google is applying? Or does it only apply to the first scenario, where the image is completely different? You can read more on this new penalty here.

    | FedeEinhorn
    0

  • Hi there, I know that it takes time, and I already submitted a URL removal request 3-4 months ago, but I would really appreciate some kind advice on this topic. Thank you in advance to everyone who contributes!
    1) De-indexing archives: Google had indexed all my /tag/ and /authorname/ archives. I set them to noindex a few months ago, but they still appear in the search engine. Is there anything I can do to speed up this de-indexing?
    2) De-indexing the /plugins/ folder on a WordPress site: they have also indexed all of my /plugins/ folder, so I added a disallow for /plugins/ in my robots.txt 3-4 months ago, but /plugins/ URLs still appear in the search engine. What can I do to get the /plugins/ folder de-indexed? Is my disallow for /plugins/ in robots.txt making it worse, because Google has already indexed it and now can't access the folder? How do you solve this?
    3) De-indexing a subdomain: I had created a subdomain containing adult content, and completely deleted it from my cPanel 3 months ago, but it still appears in search engines. Anything else I can do to get it de-indexed? Thank you in advance for your help!

    | Ltsmz
    0
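
    For reference on point 2): a robots.txt Disallow only blocks crawling; it does not remove URLs already in the index, and it prevents Googlebot from ever seeing a noindex on those pages. A minimal sketch (the path is the WordPress default, shown as an assumption):

        User-agent: *
        # Blocks crawling only - already-indexed URLs can remain as bare listings
        Disallow: /wp-content/plugins/

    To actively de-index, either allow crawling and serve noindex until the pages drop out, or use the URL removal tool in Webmaster Tools alongside the Disallow.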

  • I have a large ecommerce website which is structured very much for SEO as it existed a few years ago, with a landing page for every product/town nationwide (it's a lot of pages). Then along came Panda... I began shrinking the site in Feb last year in an effort to tackle duplicate content. We had initially used a template, only changing the product/town name. My first change was to cut the number of pages in half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both of these products. Next I rewrote the content for every product to ensure each is now as individual as possible. However, with 46 products, each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages. I have already started writing content for my most important areas, again to make these pages as individual as possible. The problem I have is that nobody can write enough unique content to target every town in the UK via an individual page (multiplied by 46 products), so I want to reduce these too. QUESTION: If I have a single page for "Croydon", will mentioning other surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns? I have approx 25 Google local place/map listings and growing, and am working from these areas outwards. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once. Any examples of big sites that have reduced in size since Panda would be great. I have a headache... Thanks, community.

    | Silkstream
    0

  • Hi, I'm planning to build a website that will present game previews for different sports. I think that the date should be included in the URL, as the content will be valuable until the kick-off of the game. So first I want to know if this is the right approach, and second, the URL structure I have imagined is /tips/sport/competition/year/month/day Ex: /tips/football/premier_league/2013/11/05 Is this a good structure? Guillaume.
    | betadvisor
    0

  • Hello everyone! I validated the sitemap with different tools (W3Schools, and so on) and no errors were found. So I uploaded it to my site, tested it through GWT and BANG! All of a sudden there is a parsing error, which corresponds to the very last piece of code after thousands of lines. I don't know why it isn't reading the code, and it's giving me this even though there are no other errors, and I haven't got a clue what to do in order to fix it! Thanks

    | PremioOscar
    0

  • Hi, at http://www.general-hypnotherapy-register.com/regional-hypnotherapy-directory/ we have a load of town and county pages for all of the hypnotherapists on the site.
    a) I have checked all of these links and they are spiderable.
    b) About a month back I noticed, after the site changes (not entirely sure why), that the site was generating rogue pages, e.g. http://www.general-hypnotherapy-register.com/hypnotherapists/page/5/?town=barnsley instead of http://www.general-hypnotherapy-register.com/hypnotherapists/?town=barnsley We added meta noindex, nofollow to these rogue pages around 4 weeks ago; however, these pages still have a Google cache date of Oct 4th, predating the meta changes.
    c) There are examples of the pages we do want indexed, and ranking too, on page 1 of site:www.general-hypnotherapy-register.com/hypnotherapists e.g. http://www.general-hypnotherapy-register.com/hypnotherapists/?town=ockham However, these pages are few and far between; they have a recent Google cache date of Nov 1.
    d) The XML sitemap has all of the correct URLs, but in Webmaster Tools the number of pages indexed has been stubbornly flat at 2800 out of 4400 for 4 weeks now.
    e) Query parameters: ?town and ?county in Webmaster Tools are set to Yes/Specifies.
    Would love any suggestions. Thanks, Mark.

    | Advantec
    0

  • Hey, we're adding schema to a website and I was wondering how best to tackle a business that has two locations. Would it be better to put the markup on two different pages, or on one page using two itemscopes? Thanks, Luke.

    | NoisyLittleMonkey
    0
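
    For reference, a minimal sketch of the single-page option using two schema.org/LocalBusiness itemscopes, assuming microdata (all names, towns, and numbers are placeholders):

        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Example Co - Bristol</span>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="addressLocality">Bristol</span>
          </div>
          <span itemprop="telephone">0117 000 0000</span>
        </div>
        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Example Co - Bath</span>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="addressLocality">Bath</span>
          </div>
          <span itemprop="telephone">01225 000 000</span>
        </div>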

  • Do you have any suggestions? I do not know of local websites where I can get some easy backlinks. I guess a record in Google Places would be great as well. Any sound suggestion will be appreciated. Thanks!

    | stradiji
    0
