
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi All, I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty. So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups: domains that I'm keeping (good quality, natural links); domains that I'm changing to nofollow (relevant, good quality links that are good for the user but may be affiliated with my company, so I'm changing them to nofollow rather than removing them); and domains that I'm getting rid of (poor quality sites with optimised anchor text, directories, article sites etc.). One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I have planned to go through every domain that has previously linked (even if it's no longer linking to my site) and straight up disavow the domain if it's poor quality. But first I want to check whether this is completely necessary for a successful reconsideration request. My concerns are that it's extremely time consuming (as I'm going through the domains to avoid disavowing a good quality domain that might link back to me in future, and also because the historical list is the largest list of them all!) and that there is some risk involved, as some good domains might get caught in the disavowing crossfire. I therefore only really want to carry this out if it's completely necessary for the success of the reconsideration request. Obviously I understand that reconsideration requests are meant to be time consuming, as I'm repenting for previous SEO sin (and believe me, I've already spent weeks getting to the stage I'm at right now)... But as an in-house Digital Marketer with many other digital avenues to look after for my company, I can't justify spending such a long time on something if it's not 100% necessary. So overall - with a manual penalty, would you bother sifting through domains that either don't exist anymore or no longer link to your site and disavow them for a thorough reconsideration request? Is this a necessary requirement to revoke the penalty, or is Google only interested in links that are currently or recently live? All responses, thoughts and ideas are appreciated 🙂 Kind Regards Sam

    | Sandicliffe
    0
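
    For reference, the disavow file Google accepts is plain UTF-8 text, one entry per line, and a domain: entry covers every link from that domain, live or historical. A minimal sketch with placeholder domains:

        # spammy article/directory domains found in the link audit
        domain:low-quality-directory.example
        domain:spun-article-network.example
        # individual URLs can also be listed on their own
        http://spam.example/page-linking-to-us.html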

  • (adult website) Why is Google not reading the meta description I set with the Yoast plugin? A regular search, https://www.google.com.br/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=robertinha , shows text scraped from the page instead:
    Vídeos de sexo - Vídeos porno
    www.robertinha.com.br/
    Robertinha.com.br. lupa. facebook twitter plus. Página Inicial; Última Atualização: terça, 14 abril 2015. Página Inicial. Categorias. Amadoras (227) · Coroas (6) ...
    If I search site:meusite.com.br it reads the description correctly, but the regular search does not. I do not understand it: https://www.google.com.br/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:robertinha.com.br shows:
    Vídeos de sexo - Vídeos porno
    www.robertinha.com.br/
    Vídeos de sexo grátis: assista agora mesmo vídeos porno com gatas, gostosas, safadas fazendo muito sexo.

    | stroke
    0

  • Our site does not have an SSL certificate. I have read that in the process of adding one, URLs need to be redirected and that some link equity can be lost. Implementing an SSL certificate sounds somewhat complicated and far from risk free. Is there a tangible SEO benefit to upgrading to SSL? Will doing so help SEO in a tangible manner that justifies the cost, time and aggravation? Thanks, Alan

    | Kingalan1
    2

  • So we use http://webcache.googleusercontent.com/search?q=cache:x.com/#!/hashbangpage to check what Googlebot has cached, but when we try to use this method for hashbang pages, we get x.com's cache... not x.com/#!/hashbangpage. That actually makes sense, because the hashbang is part of the homepage URL in that case, so I get why the cache returns the homepage. My question is - how can you actually look up the cache for a hashbang page?

    | navidash
    0
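
    For context: under Google's AJAX crawling scheme (current when this was asked), Googlebot requested #! URLs through an _escaped_fragment_ query parameter, so the cache is typically stored under that form rather than the pretty URL. The mapping, using x.com as the placeholder from the question:

        # URL as users see it
        http://x.com/#!/hashbangpage
        # URL Googlebot actually requests
        http://x.com/?_escaped_fragment_=/hashbangpage
        # so the cache lookup worth trying is against the escaped form
        http://webcache.googleusercontent.com/search?q=cache:x.com/?_escaped_fragment_=/hashbangpage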

  • Hello Mozzers, I'm looking at a niche party services directory (B2C). They're not using nofollow tags on the backlinks from their paid entries (free entries only get phone numbers, not backlinks). If they suddenly switch all the paid-for backlinks in their directory to nofollow, might that have some kind of negative impact? Switching sounds like the best way forward, but I want to avoid any unintended consequences. Thanks in advance, Luke

    | McTaggart
    0

  • We have a client who had been ranking in Google's top ten organic results for 2 of his major keywords last year. Currently Bing and Yahoo rank his site #1 for both of these terms; the ranking pages that appear were specifically targeted at these words. As of now, the client appears to have a 50-plus-position penalty for these two keywords, appearing #76 for one term and #60 for the other. We were thinking of submitting a reconsideration request through Google Webmaster Tools, but discovered that you aren't allowed to do that unless a Manual Action has appeared, which in this case it has not. The only problem we've had with the site from an SEO standpoint is that we recently discovered a website that had copied some of the product descriptions verbatim. The client contacted the site owner, who took it down immediately (about a month ago), but we still have not seen any improvement in rankings for these keywords. Does anyone have any ideas on how to communicate this to Google and get the suspected penalty lifted if a reconsideration request is apparently not available?

    | roundabout
    0

  • We received an HTML5 recommendation that we should move on-page text copy contained in <section> into <main> instead, because this is supposedly better for SEO. We're questioning the need to ask developers to spend time on this purely for a perceived SEO benefit. Sure, maybe content in <footer> may be seen as less relevant, but calling out <section> as having less relevance than <main>? Yes, it's true that engines evaluate where on-page content is located, but this level of granular focus seems unnecessary. That being said, I'm more than happy to be corrected if there is actually a benefit. On a side note, <main> isn't supported by older versions of IE and could cause browser incompatibilities (http://caniuse.com/#feat=html5semantic). Would love to hear others' feedback about this - thanks! 🙂

    | mirabile
    0
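
    For reference, the recommendation amounts to a structural change like this simplified sketch; the HTML5 spec allows only one <main> per page, which is why it identifies the primary content more explicitly than a generic <section>:

        <body>
          <header>site chrome, navigation</header>
          <main>
            <h1>Page topic</h1>
            <section>primary on-page copy, now inside main</section>
          </main>
          <footer>boilerplate links</footer>
        </body>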

  • Hello! I have set up my URL, www.morganlindsayphotography.com, in Google Webmaster Tools. I have also added the non-www instance, morganlindsayphotography.com. My blog is located at www.morganlindsayphotography.com/blog/ My question is: do I add a sitemap for www.morganlindsayphotography.com/blog/ as well as for www.morganlindsayphotography.com ? Thank you!

    | 393morgan
    0
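
    One common pattern, sketched here on the assumption that both sitemaps sit on the www host: submit a single sitemap index under the verified www property that references the main and blog sitemaps (the file names are placeholders), rather than treating /blog/ as a separate property:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>http://www.morganlindsayphotography.com/sitemap.xml</loc></sitemap>
          <sitemap><loc>http://www.morganlindsayphotography.com/blog/sitemap.xml</loc></sitemap>
        </sitemapindex>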

  • Is it possible to migrate to a new domain name without negatively impacting SEO? Our existing domain name (www.nyc-officespace-leader.com) is a bit spammy. It has been used for almost 10 years. We would like to migrate it to www.metro-manhattan.com. The metro-manhattan domain has been registered for about 5 years and it redirects to the nyc-officespace-leader.com domain. The nyc-officespace-leader.com domain has a domain authority of 23 and a page authority of 32; the metro-manhattan domain has a domain authority of 7 and a page authority of 23. Is it possible to make this transition without losing domain authority and page rank? I would think that having two domains might look spammy to Google, so this change could be a positive in the long term. We do understand that the redirects for each page would need to be done carefully. Thanks, Alan

    | Kingalan1
    0
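
    Mechanically, a page-preserving move is a one-to-one 301 from every old URL to its new equivalent. A minimal Apache sketch for the old domain, assuming mod_rewrite and an unchanged URL structure on the new domain:

        # .htaccess on nyc-officespace-leader.com
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?nyc-officespace-leader\.com$ [NC]
        RewriteRule ^(.*)$ http://www.metro-manhattan.com/$1 [R=301,L]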

  • Hi, I have a bit of an issue... Around a year ago we launched a new company. This company was launched out of a trading style of another company owned by our parent group (the trading style no longer exists). We used a lot of the content from the old trading style website, carefully mapping page-to-page 301 redirects, using the change of address tool in webmaster tools and generally did a good job of it. The reason I know we did a good job is that although we lost some traffic in the month we rebranded, we didn't lose rankings. We have since gained traffic exponentially and have managed to increase our organic traffic by over 200% over the last year. All well and good. However, a mistake has recently occurred whereby the old trading style website domain was deleted from the server for a period of around 2-3 weeks. It has since been reinstated. Since then, although we haven't lost rankings for the keywords we track I can see in webmaster tools that a number of our pages have been deindexed (around 100+). It has been suggested that we put the old homepage back up, and include a link to the XML sitemap to get Google to recrawl the old URLs and reinstate our 301 redirects. I'm OK with this (up to a point - personally I don't think it's an elegant solution) however I always thought you didn't need a link to the xml sitemap from the website and that the crawlers should just find it? Our current plan is not to put the homepage up exactly as it was (I don't believe this would make good business sense given that the company no longer exists), but to make it live with an explanation that the website has moved to a different domain with a big old button pointing to the new site. I'm wondering if we also need a button to the xml sitemap or not? I know I can put a sitemap link in the robots file, but I wonder if that would be enough for Google to find it? Any insights would be greatly appreciated. Thank you, Amelia

    | CommT
    0
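
    On the narrow sitemap-discovery point: a Sitemap line in robots.txt is a standard pointer that crawlers pick up on their own, so no visible button or on-page link is required. A sketch for the old domain (domain and file name are placeholders):

        # robots.txt on the old trading-style domain
        User-agent: *
        Disallow:
        Sitemap: http://www.old-trading-style.example/sitemap.xml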

  • Here's a quick background of the site and issue. The site lost half of its traffic over 18 months ago, and it's believed to be a Panda penalty. Many, many items were already taken care of and crossed off the list, but here's something that was recently brought up. There are 30,000 pages indexed in Google, but only about 12,000 active products. Many of the indexed pages are out-of-stock items. A site visitor cannot find them by browsing the site unless he/she had bookmarked an item before, was given the link by a friend, read about it, etc. If they get to an old product because they had a link to it, they will see an out-of-stock graphic and will not be allowed to make the purchase. So, about a month ago, efforts were made to 301 old products to something similar where possible, or to 410 them. Google has not been removing them from the index. My question is: how do we make sure Google sees that these pages are no longer there and removes them from the index? Some of the items have links to them, which will help Google find them, but what about the items with zero external/internal links? Thanks in advance for your assistance.

    | ABK717
    0
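
    To make the removals unambiguous at the HTTP level, Apache's mod_alias covers both cases; a sketch with hypothetical product paths:

        # 301 where a close substitute exists
        Redirect 301 /products/old-widget /products/new-widget
        # 410 Gone where there is no substitute
        Redirect gone /products/discontinued-widget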

  • Hi, I'm in a bit of a quandary. I have this page: https://www.commercialtrust.co.uk/compare-products/ As you can see we have provided filters to only display Fixed rate, Tracker rate, Variable rate, High LTV and HMO products for users. At the moment our canonical tags all point to the main Comparison page, but in order for the search feature to work dynamic urls are created. So for example on the fixed rate page (https://www.commercialtrust.co.uk/compare-products/fixed-rates/) when a user puts in their search criteria the url ends up looking like this: https://www.commercialtrust.co.uk/compare-products/fixed-rates/?PrevTab=HMO&PVal=250000&Amt=100000&Tme=20&SearchId=5508 Now, my quandary is this - should I make the canonical tag for the filtered products (fixed, tracker etc) like this: https://www.commercialtrust.co.uk/compare-products/fixed-rates/ or should I keep it at https://www.commercialtrust.co.uk/compare-products/ ? The comparison page shows all products, ordered by the lowest rate and with a pre-set search, limited to 20 - so not all products will be displayed on the page - and some products (like the high LTV ones) are not displayed on the main comparison landing page anyway... Thanks, Amelia

    | CommT
    0
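
    For concreteness, the first option would put this tag in the head of every filtered fixed-rates URL (a sketch; given that the main comparison page doesn't even display some of these products, the closer-matching category page is arguably the safer canonical target):

        <!-- on /compare-products/fixed-rates/?PrevTab=HMO&PVal=250000&Amt=100000&Tme=20&SearchId=5508 -->
        <link rel="canonical" href="https://www.commercialtrust.co.uk/compare-products/fixed-rates/" />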

  • Hi all! We have an old WordPress-based site with great rankings and PR 7, www.europe-internship.com, which is going to be migrated to our new Django site, www.eurasmus.com (specifically eurasmus.com/en/europe-internships).
    The new one is a much more advanced version that we will keep developing. We have already been migrating the content across, and we plan to apply the 301s in the next few weeks to start passing the SEO value and traffic to the new site. We have all the URL structures checked and technically we are ready for it.
    So we are almost ready, and I have 2 questions: 1) The new site includes more services - accommodation, information and so on, not only internships. Should we point the most relevant URLs from our previous site to our home page to share the value, or just to the internships section? I am afraid that if the bounce rate goes up after the 301s we could lose some value... 2) Should we point all the URLs to the new site at the same time (home, vacancies, blog pages, etc.), or do it gradually to see how it goes until we have moved everything, including the home page? The old site still makes some money and I am not sure how quickly the SEO value will pass, so along the way we may lose a few thousand euros... We understand that, but we want to check what would be best in your opinion. Let me know what you think! Thank you in advance!

    | Eurasmus.com
    0

  • Hi, Late last year the company I work for launched two new websites that, at the time, we believed were completely separate from our main website. The two new websites were set up externally and were not well-planned from an SEO perspective (LOTS of duplicate content) - hence, they have struggled to rank on Google. Since the launch of the new websites we have also noticed that our main website (that previously ranked very well) has suffered a decline in visitation and search engine rank. We initially attributed this to a number of factors, including the state of the market, and ramped up our SEO efforts (seeing minor improvement). We have since realised that these two new websites have been set up as subdomains of our main website, with MOZ displaying the same domain authority and root domain backlink profile. My question is, do poor quality subdomains affect the ranking performance of a root domain? I have not yet managed to find a definitive answer. Please let me know if more information is required - I am quite new to the whole SEO concept. Thanks! Amy

    | paulissai
    0

  • Hello, Can someone give me advice on this specific situation? For now we have one website, www.website.com. Because of a specific business situation we want to move to the .ca version, but we also want to keep website.com for U.S. customers. Here's how I imagine doing this: 301 redirect www.website.com to website.ca. Since website.com currently redirects to www.website.com, I would remove that redirect and keep plain website.com as a standalone site (so this would be the new domain for the U.S.). Is this the right solution? Regards, Nenad

    | Uniline
    0

  • Can anyone share what they feel is the best strategy to follow for a single service site? Would you optimise and target the homepage for the primary service they offer or target a page one level lower and leave the homepage to target the Brand name? Links to any references or case studies would also be greatly appreciated, thank you!

    | Marketing_Today
    0

  • I'd love to know your thoughts about this particular issue: Vtex is a top-3 e-commerce system in Brazil, so the issue is huge. The system does not use 4xx response codes - if a URL should be an error page, it just redirects to a search page that returns a 200. As a result, Google's index contains a lot of "empty" pages (indexed error pages), and we can't use noindex on them. Examples:
    http://www.taniabulhoes.com.br/this-is-a-test
    http://www.taniabulhoes.com.br/thisisatest
    Any suggestions?

    | SeoMartin1
    0
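
    If the platform's templates can't be changed but a server or proxy in front of them can, the same signal can travel as an HTTP header rather than a tag; both equivalents sketched below, the second being the X-Robots-Tag response header:

        <!-- on the search / "no results" template -->
        <meta name="robots" content="noindex, follow">

        # equivalent HTTP response header
        X-Robots-Tag: noindex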

  • Imagine these 2 scenarios for an ecommerce listing. 1. A listing that only closes once stock runs out. 2. A listing that relists every 7 days (assuming stock hasn't run out), doing a 301 redirect to the latest version of that listing (imagine it relists several times). You might ask why on earth we would have the 2nd scenario, but we are an auction site where some listings can't be bid on. In other words, those Buy Now-only listings are also part of the auction model - they close after 7 days. For me it is a no-brainer that scenario 1 is better for SEO, and I have my ideas on why - page age, SERP CTR, and link equity not being diluted by 301 redirects changing every 7 days when the listing relists multiple times, etc. I was wondering if someone could articulate better than I possibly could why scenario 1 is better for SEO, and whether scenario 1 would actually rank better in the SERPs... would it? Many thanks! Cheers, Simon

    | sichristie
    0

  • Hi, We all know that the mobile update is coming on the 21st April and that if your site isn't mobile friendly in Google's eyes you will be demoted in the mobile rankings. Will this affect tablets? Most of our pages are mobile friendly, but there are a few which aren't. However, these are tablet friendly. I haven't heard Google mention tablet rankings. Thanks Andy

    | Andy-Halliday
    0

  • Hi everyone. My question concerns moving from an old to a new domain name without losing all previous SEO efforts. I am aware that a properly executed 301 redirect is the answer and the way to go, along with telling Google about it in Webmaster Tools. However, what is the situation if you do not own the old domain name anymore? If you have no means of getting back the old domain name and want to basically switch the already existing website over to the new domain name, will search engines penalise the "new site" as a duplicate, since the "old site" is still in the search engine rankings? I know that not being able to execute a proper 301 redirect and starting out with a new domain means a fresh start, but what is the best way to minimise the negative impact (if any)? Basically dropping the site's current content and starting out fresh in favour of the new domain name is not really an option. Even if you were to take the content from the old site and place it on another site, this would surely be seen as duplicate too. Does anyone think that Webmaster Tools/Google is savvy enough to spot the difference when the "old site" gets removed and the "new one" added instead (in Webmaster Tools)? I read something along the lines of having your host point the DNS from the old site to the new one. Could something like that be helpful? Thanks all in advance for your help and input!

    | Hermski
    0

  • I have subscribed to Majestic SEO for three years and am considering cancelling it, as it costs $600/year. Is this a false economy? Will I be losing essential data? Also, MOZ seems really deficient in the link tracking department. Its data seems plain wrong, showing far fewer domains pointing at my site than either Google Webmaster Tools or Majestic SEO. If I get rid of Majestic SEO, are there any free/low cost tools that could take its place? Thanks, Alan

    | Kingalan1
    0

  • Hi guys. I've been building a new site because I've seen a real SEO opportunity out there. I'm a mixing professional by trade, so I wanted to take advantage of SEO to help gain more work. Here's the site: www.signalchainstudios.co.uk I'm curious about domain age. The site is fairly well optimised for my keywords and has pretty good content on it (I think so, anyway), but it's nowhere to be seen in the SERPs at all. Is this just a domain age issue? I'd have thought it might be in the top 50, because my site's services are not hard to rank for at all! Also, what about traffic? Does Google want to see an 'active' site before it considers 'promoting' it up the ranks, or are backlinks and good content the main factors in the equation? Thanks in advance. I love this community to bits 🙂 Isaac.

    | isaac663
    1

  • Hi, to follow up on my previous post (http://moz.com/community/q/low-on-google-ranking-despite-error-free), I was wondering if someone can tell me whether we are being penalised by Google or not. Over the last 6 months we have seen a rise in organic visitors coming from Bing and Yahoo, but Google remains the same. Despite the advice given in the previous post, I just feel that something else must be wrong. Perhaps we need more inbound links with high PR? Socially, we are engaging pretty much 50-60% of our audience, yet sadly none of that link flow seems to count towards our organic ranking... Hopefully someone can have a look at our site www.mercadonline.es in more detail? Ask me in a PM for more info! Thank you Ivordg

    | ivordg
    0

  • I see no reason why PDFs couldn't be considered duplicate content, but I haven't seen any threads about it. We publish loads of product documentation provided by manufacturers, as well as white papers and case studies. These give our customers and prospects a better idea of our solutions and help them along their buying process. However, I'm not sure if it would be better to make them non-indexable to prevent duplicate content issues. Clearly we would prefer a solution where we benefit from the keywords in the documents. Does anyone have insight on how to deal with PDFs provided by third parties? Thanks in advance.

    | Gestisoft-Qc
    1
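
    If the duplicate-content risk is judged to outweigh the keyword value, PDFs can be kept downloadable but out of the index with the X-Robots-Tag response header, since a PDF cannot carry a meta robots tag. An Apache sketch, assuming mod_headers:

        <FilesMatch "\.pdf$">
          Header set X-Robots-Tag "noindex"
        </FilesMatch>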

  • OSE provides spam analysis for a website's link profile. Does Moz have a tool to check link quality before placing a link? How do you do spam link analysis before posting a link?

    | bondhoward
    1

  • Hey everyone. A while ago, I remember reading that Matt Cutts said that you can just disavow at the domain level, and that the Google Webmaster Tools team doesn't read the comments in disavow files (such as notes showing that webmasters had been reached out to). Is this ringing any bells? I'm trying to find this tidbit again. Thanks!
    Charles

    | Charles_Murdock
    0

  • Hi friends!! I have a huge question: which is the best tool for SEO? I am using a lot of tools, but I would like to know more ways to position my website at the top. I hope that you can help me! Regards, Carlos Zambrana

    | CarlosZambrana
    1

  • One of our web pages will not rank on Google. The website as a whole ranks fine, except just this one section... We have tested it and it looks fine... Google can crawl the page, no problem. There are no spurious redirects in place. The content is fine. There is no duplicate page content issue. The page has a dozen product images (photos), but the load time of the page is absolutely fine. We have submitted the page via Webmaster Tools and it's fine. It gets listed, but then a few hours later disappears! The site has not been penalised, as we get good rankings with other pages. Can anyone help? Does anyone know about this problem?

    | CayenneRed89
    0

  • We have a healthcare microsite that sits in a subfolder of a hospital site. The client wanted to keep their own domain and redirect it to the subfolder URL. Even with good on-page SEO, link building, etc., they're not organically ranking as well as we think they should be. I.e., they have http://our-business-name.com vs. http://hospital.org/our-business-name/ For best SEO value, are they better off having only their homepage on their own domain, not redirecting any interior pages, and displaying those as subfolder URLs? I.e., keep the homepage as http://our-business-name.com but use hospital URLs for interior pages, such as http://hospital.org/our-business-name/about/ Or is there some better way to handle this?

    | IT-dmd
    0

  • Hi, Our site https://soundbetter.com has been live for 2 years now, and as of yet we haven't been able to get our PageRank above 3/10. We have thousands of unique pages and plenty of original contextual content, we avoid duplicate content as best we can, follow Google's best practices for site structure, deal with any issues that come up in Webmaster Tools, have schema.org markup, avoid link spamming, have inbound links from authority sites (though OSE doesn't show most of them for some reason), get lots of social shares to our pages, and the domain has been owned by us for 12 years. Any thoughts on why we would still have a PR of 3? Thanks for helping

    | ShaqD
    0

  • I work for a Theater show listings and ticketing website. In our show listings pages (e.g. http://www.theatermania.com/broadway/this-is-our-youth_302998/) we split our content into separate tabs (overview, pricing and show dates, cast, and video). Are we shooting ourselves in the foot by separating the content? Are we better served with keeping it all in a single page? Thanks so much!

    | TheaterMania
    0

  • Hi Guys, I have a question relating to Z-Blocks in Magento. Our Magento store uses a lot of Z-Blocks, these are bits of content that are switched off and on depending on a customer’s user group. This allows us to target different offers and content to new customers (not logged in) and existing customers (logged in). Does anyone have any experience in how this impacts SEO? Thanks in advance!

    | CarlWint
    0

  • I've seen all the length recommendations, and I understand the reasoning is that longer meta descriptions will be cut off in the search results. But I've also noticed that Google will "move" the meta description if the search term the user is using appears in the cached version of the page. I have a case where Google is indexing the pages but not caching the content (at least not yet). So we see the meta description just fine in the Google results, but we can't see the content cache when checking the Google cached version. My question is: in this case, why would it be a bad idea to write a slightly lengthier (but still relevant) meta description, with the intent that one of the terms in that description could match the user's search terms and the description would "move" to highlight that term in the results?

    | navidash
    0

  • Morning Moz Fans: My URL is: http://goo.gl/Dhbjwj According to Moz, which we are tracking this URL with, somewhere between the 3rd and 10th of Feb the domain went from being fairly well indexed to being dropped back further than pages 6-7 for pretty much everything; even the company name was only registering at the bottom of page one. Around this time we were transferring the website from .php into WordPress, so we were creating new pages with the same names and all the same content, but we created the WordPress area in a subdomain of the website. Also around this time we had an issue with the blog area and had to take it down for 4-5 weeks due to some errors, which meant Google wouldn't have been able to crawl those pages properly, but the rest of the website was up and running. We also discovered recently that the company owns and uses this domain: http://goo.gl/5JvDUH So my question is: what do you think caused the problem? Has it been permanently penalised? Is there a way I can get Google to specifically look at it, and is there any more I can do? Thank you for your help

    | popcreativeltd
    0

  • I have a question!
    We are about to migrate from a mobile setup with separate mobile URLs (different from desktop) to a mobile version (non-adaptive) served on the same URLs as desktop. Should we put a 302 redirect in place and remove the canonical and alternate link tags from the code? Is that the way to do the migration?
    Thank You!!

    | romaro
    0
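
    For context, a separate-mobile-URLs setup normally carries bidirectional annotations like the sketch below (example.com is a placeholder). Consolidating desktop and mobile onto one URL means removing these annotations and pointing the retired mobile URLs at the shared URL with permanent redirects:

        <!-- on the desktop page -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://m.example.com/page">
        <!-- on the mobile page -->
        <link rel="canonical" href="http://www.example.com/page">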

  • Hi there. Thanks for reading my post. I am fairly new to SEO and don't know much coding. I purchased an OpenCart theme and am working with a developer to modify it to make it more user friendly. The website is responsive, and I modified the menu for the desktop, but now it doesn't have categories, just products - so it doesn't have URLs for categories, just filters. So the developer recommended adding the mobile menu, which has categories and subcategories, back to the desktop. I'm not sure if this is a kosher approach to SEO. Here is the link: As you can see, there is a menu in the top menu and a menu in the Main Menu. Thoughts? Will this be a problem for duplicate content? The Main Menu keywords are crucial and are what the website revolves around. New website

    | socratic-goat777
    0

  • Hey all, I have a community site where users upload photos and videos. It launched in 2003; back then it wasn't such a bad idea to use keywords/tags in the URLs, so I did that. All my content pages (individual photo/video) look like this: www.domain.com/12345-kw1-kw2-kw3-k4-k5 and so on, where 12345 is the unique content ID and the rest are keywords/tags added by the uploader. I would like to get rid of the keywords after the ID in the URL. My site is well coded, so this can easily be done by changing a simple function, so that my content page URLs become: www.domain.com/ID What is the best course of action? 301 the keyword URLs to the non-keyword version? Canonical? I really want to do this the proper way. Any advice is highly appreciated. Thanks in advance.

    | mlqsko
    0
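
    A 301 from each keyword URL to its ID-only form is the standard mechanism here; a canonical tag alone would leave the old URLs resolving as duplicates. A mod_rewrite sketch, assuming the IDs are purely numeric:

        RewriteEngine On
        # /12345-kw1-kw2-kw3 -> /12345
        RewriteRule ^(\d+)-[^/]+$ /$1 [R=301,L]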

  • Hi! I'm working on a site that got hit by a manual penalty some time ago. I got that removed, cleaned up a bunch of links and disavowed the rest. That was about six months ago. Rankings improved, but the big money terms still aren't doing great. I recently ran a Searchmetrics anchor text report though, and it said that direct match anchors still made up the largest part of the overall portfolio. However, when I started looking at individual links with direct anchors, nearly every one had been removed or disavowed. My question is, could an anchor text penalty be in place because these removed links have not been reindexed? If so, what are my options? We've waited for this to happen naturally, but it hasn't occurred after quite a few months. I could ping them - could this have any impact? Thanks!

    | Blink-SEO
    0

  • Hi all, I am led to believe that link juice does not pass through more than one 301 redirect, however what about a 301 followed by a canonical meta tag? Here is an example:
    subdomain.site.com/uk/page/ -> 301 -> www.site.com/uk/page/
    www.site.com/uk/page/ -> canonical -> www.site.com/page/
    Thanks,
    Chris

    | Further
    0

  • Hello Guys, I am using a Google Merchant XML feed which updates every morning at 12 am. We often run campaigns which start at 8 am, and because the product prices differ by then, Google rejects those products. So my query is: is there any way to set a campaign start and end time in the XML feed? Note - with the API we can update product prices at any time, but with the XML feed it can only be done at one time, which is 12 am. Right?

    | varo
    0
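
    One avenue worth checking, hedged because it depends on the feed specification version in use: the Shopping feed format includes sale-price attributes with an effective date range, which lets a campaign price be scheduled inside the once-a-day feed instead of pushed at campaign start. A sketch with placeholder values:

        <item>
          <g:price>120.00 EUR</g:price>
          <g:sale_price>99.00 EUR</g:sale_price>
          <g:sale_price_effective_date>2015-04-21T08:00+0100/2015-04-28T23:59+0100</g:sale_price_effective_date>
        </item>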

  • Hi Everyone, I'm currently doing quite a large backlink audit on my company's website and there's one thing that's bugging me. Our website used to be split into two domains for separate areas of the business, but we have since merged them into one domain and 301 redirected the old domain to the main one. But now both GWT and Majestic are telling me that I've got 12,000 backlinks from that domain? This domain didn't even have 12,000 pages when it was live, and I only did specific 301 redirects (i.e. for specific URLs, not an overall domain-level 301 redirect) for about 50 of the URLs, with all the rest being redirected to the homepage. So I'm quite confused about why it's showing up as so many backlinks - old redirects I've done don't usually show as a backlink at all. UPDATE: I've got some more info on the specific backlinks. But now my question is - is having this many backlinks/redirects from a single domain going to be viewed negatively in Google's eyes? I'm currently doing a reconsideration request and would look to fix this issue if having so many backlinks from a single domain is against Google's guidelines. Does anybody have any ideas? It's probably something very obvious. Thanks! Sam

    | Sandicliffe
    0

  • I am in the midst of a major redesign of my site, including revamping existing articles. I have a couple of hundred articles and I am reviewing all aspects of them, including titles, URLs, content, etc. I am putting together a process as I move each article across to the new site, and have SEO very much in mind. I'd appreciate any feedback on this. First off, let me be clear that I consider the quality of the content paramount; anything suggested below is considered "supporting" that content from an SEO perspective. But since I am moving this content across anyway, I may as well take the opportunity to clean things up. The existing articles don't have particularly good SEO-related attributes in terms of their titles, URLs, use of keywords and so on. So I plan to do the following for each article, with my questions under each step. For illustrative purposes (our site serves the wedding industry), I will use an article about how to involve children at a wedding.
    - Use the "Keyword Difficulty" feature in Moz Pro to research a specific keyword for each article. In the example case I used "involving children in our wedding". Honestly, I am not really sure what to do with this feature 🙂 I've read everything from "focus on the long tail" to "don't fear highly competitive keywords". So my current thinking is merely to use it as interesting information about the keyword I choose, rather than to make any specific decisions from it - i.e. make sure the keyword is relevant to the article as the first priority, and use the tool to check search volume. I'm not sure what to read into a zero for recent Bing searches - is that really an important factor? (I'm assuming the Google figure is not available from Google; it would be displayed here otherwise, I'm guessing.)
    - Use a title that uses these keywords. In this case, I simply went with "Involving children in our wedding". Same for the URL: /wedding-guests/involving-children-in-our-wedding. If I have a reasonable, short and human-friendly phrase like this (and I can manage that for virtually every article), is there any reason why the URL and the title should not be the same? In short, the title and URL are both a relatively concise "mini-sentence".
    - Make sure the meta description of the article is easy to read (for humans) and uses the keyword phrase.
    - Make sure that the theme (we are moving to WordPress) uses H1 for the page header/title and H2 for sections within the document.
    - Implement 301 redirects from the old URL (old site) to the new URL. This seems like a pretty obvious approach for articles where the URL has changed (which will be most of them). But what do I do with articles that I am going to remove? Should I redirect (301) to a related article, so at least the visitor ends up on a page that is generally relevant, or just let these "fall through" as non-existent pages (404)?
    As I say, I have 200+ articles to go through and I want to use this opportunity to clean things up. Anything leaping out as missing/problematic? Thanks in advance Mark

    | MarkWill
    0

  • I have listed my domain in several Ask the Community requests. These have resulted in links from the Ask the Community posts showing up in MOZ site explorer. So actual links have been detected. Are these links harmful to my link profile? The content is not at all related to commercial real estate which is the subject of our website. Thanks, Alan

    | Kingalan1
    0

  • GWT calls pages that have "noindex, follow" tags "access denied errors." How is it an "error" to say, "hey, don't include these in your index, but go ahead and crawl them." These pages are thin content/duplicate content/overly templated pages I inherited and the noindex, follow tags are an effort to not crap up Google's view of this site. The reason I ask is that GWT's detection of a rash of these access restricted errors coincides with a drop in organic traffic. Of course, coincidence is not necessarily cause. Should I worry about it and do something or not? Thanks... Darcy

    | 94501
    0

  • In years past I was told not to disavow links in Bing unless the site had an issue. This was driven home when a site we were working on disavowed its links in Google and recovered after a few months, then disavowed the same links in Bing and saw rankings drop 20% over the following months. The reasoning was that Bing looked more at the quantity of links and didn't analyze links the way Google does. So even though you might disavow links in Google, you might not want to disavow those same links in Bing. Does this still hold true in 2015? I want to get the community's opinion on this topic: should the same links be disavowed in Bing that are disavowed in Google? Why or why not?

    | K-WINTER
    1

  • Hi There, I found out about 6 months ago that I have been getting black hat SEO'd by another company. There are around 350 spammy domains pointing to my home page and product page. I have disavowed a lot of them. Is there anything else I can do? http://bareblends.com.au/ http://bareblends.com.au/the-optimum-9400-blender Thanks!

    | Oscarmj
    0

  • Besides crawling with a bot, what kind of info about site construction, HTML, etc. does Google get from users via alternate methods? Thanks... Darcy

    | 94501
    1

  • Hi there SEOs and other respected online marketing enthusiasts! For a short while now, Google Webmaster Tools has been flagging my hreflang return tags as missing. However, these tags are present in the HTML and there should be no issue whatsoever. I often see other bogus flags raised by Google, but this one is bothering us in particular. This is the code that we use for all 18 of our languages: All pages link to all other language variants. Is there something I'm missing? Or is it just an error on Google's side - and if so, how do I flag it as fixed?

    | pimarketing
    0
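
    The snippet referenced in the question did not survive the export, so here is a generic sketch instead (example.com and the language set are placeholders). The non-negotiable rule is that every language version lists all variants, including itself, with each pair confirmed from both sides:

        <link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
        <link rel="alternate" hreflang="de" href="https://www.example.com/de/page/" />
        <link rel="alternate" hreflang="es" href="https://www.example.com/es/page/" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />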

  • Hi all, I work in a space where we have many products (specifically components and transistors) that are almost identical minus one or two changes in voltage, die size, etc. I'm curious if others in the tech / manufacturing space have had to tackle this issue? There are only so many ways to describe 5-6 products that are identical save one feature. My gut tells me to offer up more head terms in the title tags to draw searchers and get specific in the meta descriptions, but I'm open to ideas.

    | evan_cree
    0

  • Hello, We have been notified in the weekly MOZ rankings that we no longer rank for our brand name, which is almost unthinkable as nobody would target it. When I googled it, there is a Chinese site ranking at number one and using our brand name and meta data. What is very strange is that although our internal pages rank in the secondary positions, our homepage does not appear at all for the brand name. Therefore, I am wondering if somehow we have had some sort of hack and they are somehow redirecting to their own site? The brand name is "uksoccershop". Has anyone ever encountered anything like this before or have any idea on what we should do? Thanks for your help.

    | simonukss
    0
