
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Is it legitimate to show different content to HTTP requests that have different referrers? Case A: a user views a page of the site with plenty of information about a brand, and clicks a link on that page to a product detail page for that brand; here I don't want to repeat the information about the brand itself. Case B: a user lands directly on the product detail page by clicking a SERP result; in this case I would like to show them a few paragraphs about the brand. Is this bad? Does anyone have experience doing it? My main concern is Google's crawler. It shouldn't be considered cloaking, because I am not differentiating on user-agent (bot vs. non-bot). But which referrer will Google use when it crawls the site? I have no idea; does anyone know? When going from one link to another on the website, does the Google crawler leave the referrer empty?

    | max.favilli
    0
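For reference, the kind of logic being described could be sketched like this. It is a minimal illustration, not a recommendation; the function name, host value, and crude substring check are all hypothetical placeholders:

```python
# Minimal sketch of referrer-dependent content selection.
# Requests arriving from a page on the same site skip the brand
# intro; everything else (an empty referrer included, which is what
# crawlers typically send) gets the full brand paragraphs.

def include_brand_intro(referrer: str, own_host: str = "example.com") -> bool:
    """Return True when the brand intro paragraphs should be shown."""
    if not referrer:
        # Direct visits and most crawler fetches carry no Referer header.
        return True
    # Crude check: treat any referrer mentioning our host as internal.
    return own_host not in referrer
```

Under this sketch, a crawler (empty referrer) and a SERP visitor both see the brand intro, while an internal click does not.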

  • First of all, I fully appreciate that I may be over-analysing this, so feel free to say if you think I'm going overboard on this one. I'm currently trying to optimise the URLs for a group of new pages that we recently launched. I would usually err on the side of leaving the URLs as they are, so that any incoming links are not diluted through a 301 redirect. In this case, however, there are very few links to these pages, so I don't think that changing URLs will harm them. My main question is short URLs vs. long URLs (I have already read Dr. Pete's post on this). Note: the URLs I have listed below are not the actual URLs, but very similar examples that I have created. The URLs currently exist in a format similar to: http://www.company.com/products/dlm/hire-ca My first thought was that we could put a few descriptive keywords in the URL, something like: http://www.company.com/products/debt-lifecycle-management/hire-collection-agents - I'm worried, though, that the URL will get too long for any pages sitting under this. As a compromise, I am considering: http://www.company.com/products/dlm/hire-collection-agents My feeling is that this last approach gives the best balance between having the keywords for the products and ensuring a good user experience. My only concern is whether the /dlm/ category page would suffer slightly, but it would have 'debt-lifecycle-management' in the title tag. Does this sound like a good approach to people? Or do you think I'm being a little obsessive about this? Any help would be appreciated 🙂

    | RG_SEO
    0

  • Hey All, I was just looking through some Google pages on best practices for meta descriptions and came across this little tidbit: "Include clearly tagged facts in the description. The meta description doesn't just have to be in sentence format; it's also a great place to include structured data about the page. For example, news or blog postings can list the author, date of publication, or byline information. This can give potential visitors very relevant information that might not be displayed in the snippet otherwise. Similarly, product pages might have the key bits of information—price, age, manufacturer—scattered throughout a page. A good meta description can bring all this data together. For example, the following meta description provides detailed information about a book." This is the first time I have seen a suggested use of structured data in meta descriptions. Does this totally replace a regular meta description, or will it work in conjunction with the regular meta description? If I provide both structured data and text, will the SERP display the text and the structured data the way they were previously displayed? Or will the 150-160 character limit take precedence and just cut off all info after that?

    | Whebb
    0

  • I am about to move my Thailand-focused travel website into a new, broader Asia-focused travel website. The Thailand site has had a sad history with Google (algorithmic, not penalties) so I don't want that history to carry over into the new site. At the same time though, I want to capture the traffic that Google is sending me right now and I would like my search positions on Bing and Yahoo to carry through if possible. Is there a way to make all that happen? At the moment I have migrated all the posts over to the new domain but I have it blocked to search engines. I am about to start redirecting post for post using meta-refresh redirects with a no-follow for safety. But at the point where I open the new site up to indexing, should I at the same time block the old site from being indexed to prevent duplicate content penalties? Also, is there a method I can use to selectively 301 redirect posts only if the referrer is Bing or Yahoo, but not Google, before the meta-refresh fires? Or alternatively, a way to meta-refresh redirect if the referrer is Google but 301 redirect otherwise? Or is there a way to "noindex, nofollow" the redirect only if the referrer is Google? Is there a danger of being penalised for doing any of these things? Late Edit: It occurs to me that if my penalties are algorithmic (e.g. due to bad backlinks), does 301 redirection even carry that issue through to the new website? Or is it left behind on the old site?

    | Gavin.Atkinson
    0

  • My site used to be entirely HTTPS. I switched months ago, so all links on the pages the public has access to are now HTTP only. But I see now that when I do a site:www.qjamba.com search, the results include many pages beginning with https (including the home page!), which is not what I want. I can redirect to http, but that doesn't remove https from the index, right? How do I solve this problem? Sample of results: Qjamba: Free Local and Online Coupons, coupon codes ... https://www.qjamba.com/ - One and Done savings. Printable coupons and coupon codes for thousands of local and online merchants. No signups, just click and save. Chicnova online coupons and shopping - Qjamba https://www.qjamba.com/online-savings/Chicnova - Online Coupons and Shopping Savings for Chicnova. Coupon codes for online discounts on Apparel & Accessories products. Singlehop online coupons and shopping - Qjamba https://www.qjamba.com/online-savings/singlehop - Online Coupons and Shopping Savings for Singlehop. Coupon codes for online discounts on Business & Industrial, Service products. Automotix online coupons and shopping - Qjamba https://www.qjamba.com/online-savings/automotix - Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products. Online Hockey Savings: Free Local Fast | Qjamba www.qjamba.com/online-shopping/hockey - Find big online savings at popular and specialty stores on Hockey, and more. Hitcase online coupons and shopping - Qjamba www.qjamba.com/online-savings/hitcase - Online Coupons and Shopping Savings for Hitcase. Coupon codes for online discounts on Electronics, Cameras & Optics products. Avanquest online coupons and shopping - Qjamba https://www.qjamba.com/online-savings/avanquest - Online Coupons and Shopping Savings for Avanquest. Coupon codes for online discounts on Software products.

    | friendoffood
    0
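For what it's worth, the usual mechanics for flushing the https versions out of the index is a site-wide 301 to the http equivalents, which requires the https side to keep serving (a valid certificate) so the redirects can actually be delivered. A hypothetical sketch, assuming Apache with mod_rewrite in an .htaccess file:

```apache
# Hypothetical .htaccess sketch (assumes Apache + mod_rewrite):
# 301-redirect every HTTPS request to its HTTP equivalent. Over time
# Google replaces the indexed https URLs with the http ones.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.qjamba.com/$1 [R=301,L]
```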

  • Given the way most auto dealership websites populate inventory pages, should you allow inventory to be indexed at all? The main benefit is more content. The problem is that it creates duplicate, or near-duplicate, content. It also creates a ton of crawl errors, since the turnover is so short and fast. I would love some help on this. Thanks!

    | Gauge123
    0

  • Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site, and which I think should be OK. Of course, that is normally NOT OK; I'm here to find out if Google is flexible enough to allow these situations: 1. My mobile-friendly site has users select a city, and then it displays the location-options div, which includes an explanation of why they may want the program to use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the link chosen. On the other hand, it is programmed so that a Google bot doesn't get a meaningless 'choose further' page; instead, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire-city results for the Google bot, but first gives the user the ability to choose GPS. 2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words 'gps-loc' in the URL, fetches the user's GPS location and returns results dependent on it. If Googlebot comes to that URL, there is no way the program can return the same results, because it couldn't obtain the same latitude and longitude as that user. So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted

    | friendoffood
    0

  • I have an e-commerce website that is template-based, and I have absolutely no control over it. Each product has quite a good ranking in Google. However, we are creating a new website using ASP.NET MVC, hosted on Azure, with a totally new design. Since I have no control over my old website, I cannot force the server to redirect each product page to the new website's product page. This is what I have done so far: I told my old website provider to point my domain (e.g. domainA.com) to a new nameserver at DynDNS, then I created a new zone and added an HTTP redirect service to the new domain (http://www.domainB.com) with a 301 redirect. I'm pretty sure this is not enough, since the URLs differ like this: Old: www.domainA.com/product/70/my-product-name New: www.domainB.com/product/1/my-new-product-name New route config: {product}/{id}/{name} As you can see, the structure is similar, but the product ID and name are different. Do I need to catch the incoming ID and name from the old website and 301 redirect again to the correct URL? If so, this will cause a double 301 redirect; would that be an SEO problem? Thank you in advance for your answer.

    | as14220808
    0
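A server-side sketch of the second hop described above; the ID mapping, function name, and host are hypothetical placeholders you'd build from an export of both catalogues. (Google has said it follows a small number of chained 301s, but collapsing to a single hop where possible is the safer pattern.)

```python
import re

# Hypothetical mapping from old product IDs to (new ID, new slug),
# e.g. built from an export of both catalogues.
OLD_TO_NEW = {"70": ("1", "my-new-product-name")}

def redirect_target(old_path: str, new_host: str = "www.domainB.com"):
    """Given an old-site path like /product/70/my-product-name,
    return the 301 target on the new site, or None if unknown."""
    m = re.match(r"^/product/(\d+)/[^/]+$", old_path)
    if not m or m.group(1) not in OLD_TO_NEW:
        return None
    new_id, new_slug = OLD_TO_NEW[m.group(1)]
    return f"http://{new_host}/product/{new_id}/{new_slug}"
```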

  • I have a domain (no subdomains) that serves up different dynamic content for mobile and desktop pages, each having the exact same page URL, a kind of semi-responsive design, and I will be using "Vary: User-Agent" to give Google a heads-up on this setup. However, some of the pages are only valid for mobile or only valid for desktop. When a page is valid only for mobile (call it mysite.com/mobile-page-only), Google Webmaster Tools gives me a soft 404 error under Desktop, saying that the page does not exist. Apparently it does that because my program is actually redirecting the user/crawler to the home page. From the info about soft 404 errors, Google appears to be saying that since the page "doesn't exist" I should give the user a 404 page, which I can customise with an option to go to the home page, links from a menu, etc. My concern is that if I tell the desktop bot that mysite.com/mobile-page-only is basically a 404 error (i.e. doesn't exist), it could mess up the mobile bot's indexing of that page, since it definitely DOES exist for mobile users. Does anyone here know for sure that Google will index a page for mobile that is a 404 Not Found for desktop, and vice versa? Obviously it is important not to remove something from an index in which it belongs, so whether Google carefully differentiates the two is a very important issue. Has anybody here dealt with this or seen anything from Google that addresses it? Might one be better off leaving it as a soft 404 error? EDIT: also, what about Bing and Yahoo? Can we assume they will handle it the same way? EDIT: a closely related question: in a case like mine, does Google need a separate sitemap for the valid mobile pages and the valid desktop pages, even though most links will be in both? I can't tell from reading several Q&As on this. Thanks, Ted

    | friendoffood
    0
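To make the discussion concrete, here is a minimal sketch of the choice being debated: answering a hard 404 to desktop requests for a mobile-only URL instead of redirecting to the home page. The path set and the deliberately crude device detection are hypothetical; real detection would be more robust:

```python
# Sketch of a URL that exists for mobile but not desktop.
MOBILE_ONLY_PATHS = {"/mobile-page-only"}

def response_status(path: str, user_agent: str) -> int:
    """200 for valid page/device combinations, honest 404 otherwise
    (instead of a soft-404 redirect to the home page)."""
    is_mobile = "Mobile" in user_agent or "Android" in user_agent
    if path in MOBILE_ONLY_PATHS and not is_mobile:
        return 404
    return 200
```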

  • My homepage (www.mach7t.com) is optimized for "enterprise imaging solutions", but only ranks #55 in Google. The rest of my subpages rank much better than that for their respective keywords, many on page 1. Any ideas why this might be?

    | CQMarketing
    0

  • Hi, I cannot seem to find good documentation about the use of hreflang on paginated pages when using rel="next" and rel="prev".
    Does anyone know where to find decent documentation? I could only find documentation about pagination and hreflang when using canonicals on the paginated pages. I have doubts about what the best option is. The way TripAdvisor does it:
    http://www.tripadvisor.nl/Hotels-g187139-oa390-Corsica-Hotels.html
    Each paginated page refers to its hreflang-equivalent paginated page. So should the hreflang refer to the specific paginated page, or should it refer to the "1st" page? In this case:
    http://www.tripadvisor.nl/Hotels-g187139-Corsica-Hotels.html Looking forward to your suggestions.

    | TjeerdvZ
    0
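For what it's worth, the commonly cited pattern treats pagination and hreflang as independent annotations: each paginated page declares hreflang pointing at the equivalent page of the same series in the other language, alongside its own rel prev/next. A hypothetical head section for page 2 of a Dutch listing (domains and URL scheme are placeholders):

```html
<!-- Hypothetical <head> for page 2 of a paginated Dutch listing;
     hreflang points at the *equivalent* paginated page, not page 1. -->
<link rel="prev" href="https://example.nl/hotels?page=1">
<link rel="next" href="https://example.nl/hotels?page=3">
<link rel="alternate" hreflang="nl" href="https://example.nl/hotels?page=2">
<link rel="alternate" hreflang="en" href="https://example.com/hotels?page=2">
```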

  • Hello Hopefully can get a few opinions on this. We've added some user reviews to our website for key products. We added these approximately 3-4 weeks ago. In the last week we've seen keyword rankings drop on the pages they've been added to. For example see: http://www.naturalworldsafaris.com/wildlife/primates.aspx This page ranked well for both gorilla safari and gorilla safaris but both terms have dropped considerably (12 to 20 checking Google UK on the Moz rank checker). Due to the formatting required for the Rich Snippets (and we have the user review stars in the SERPS) the term "Gorilla safari" is perhaps becoming a bit spammy on the page. Another example would be "Borneo holidays" (up and down in the SERPS between 12-18) on this page: http://www.naturalworldsafaris.com/destinations/far-east/borneo.aspx Do you feel that these fluctuations in keyword ranking could be to do with this? Thanks

    | KateWaite
    0

  • Hello All, We want to split up our sitemap. Currently it's almost 10K pages in one XML sitemap, but we want to break it into smaller chunks, splitting it by category, location, or both. Ideally 100 per sitemap, which is what I read is the best number to help improve indexation and SEO ranking. Any thoughts on this? Does anyone know of any good tools out there which can assist us in doing this? Another question: should we put all of our products (1,250) in one sitemap, or should this also be split up, say into products per category, etc.? Thanks, Pete

    | PeteC12
    0
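As a rough illustration of the mechanics, splitting a flat URL list into numbered sitemap files plus a sitemap index can be scripted. Everything here (chunk size, file names, domain) is an assumption; the protocol itself permits up to 50,000 URLs per file, so 100 is a choice, not a limit:

```python
# Split a list of URLs into sitemap chunks and emit a sitemap index.
from typing import Dict, List

def build_sitemaps(urls: List[str], chunk: int = 100) -> Dict[str, str]:
    """Return a dict of filename -> XML content: one <urlset> file per
    chunk of URLs, plus a <sitemapindex> listing them all."""
    files: Dict[str, str] = {}
    for i in range(0, len(urls), chunk):
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls[i:i + chunk])
        files[f"sitemap-{i // chunk + 1}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>"
        )
    index = "\n".join(
        f"  <sitemap><loc>https://example.com/{name}</loc></sitemap>"
        for name in files
    )
    files["sitemap-index.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index}\n</sitemapindex>"
    )
    return files
```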

  • Hi, I have two pages appearing in positions 11 and 12 for the keyword: 80 btl mortgage. These are: https://www.commercialtrust.co.uk/btl/landlord-advice/mortgages/btl-mortgage-80-ltv/ https://www.commercialtrust.co.uk/btl/product-types/80-buy-to-let-mortgages/ Both pages are good, provide useful information and I would not wish to remove one of them. However, I am concerned that the reason neither one of the pages is on page 1 is because the keywords targeted on both pages is essentially the same. Should I reoptimise one of them for other variations of 80 BTL mortgage keywords? (e.g. 80% LTV Buy to Let Mortgage, 80 Buy to Let Mortgage, etc etc) Or, is there another solution I haven't yet thought of? I welcome your insights! Thanks! Amelia

    | CommT
    0

  • Today I noticed that one of my colleagues was pointing rel canonical tags to a third-party domain on a few specific pages of a client's website. This was a standard rel canonical tag. Up to this point I haven't seen many webmasters point a rel canonical at a third-party domain. However, after doing some reading on the Google Webmaster Tools blog, I realized that cross-domain rel canonicals are indeed a viable strategy to avoid duplicate content. My question is this: should rel canonical tags be written the same way when dealing with internal duplicate content vs. external duplicate content? Would a rel=author tag be more appropriate when addressing third-party website duplicate content issues? Any feedback would be appreciated.

    | VanguardCommunications
    0
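For reference, a cross-domain canonical is written exactly like an internal one; only the href crosses domains (the URLs below are placeholders). rel=author served a different purpose entirely (authorship markup) and is not a duplicate-content tool:

```html
<!-- Placed on the duplicate page at site-b.com; href is a placeholder -->
<link rel="canonical" href="https://www.site-a.com/original-article/">
```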

  • A client has two different sites selling the same products with the same content, they would like to replatform onto Magento while redirecting those 2 sites to the new URL. The question is, besides monitoring the 301 redirects is there anything else to take into consideration when consolidating two sites into one new site?

    | RocketWeb
    0

  • Hi everyone, this question is a two parter: I am now working for a large website - over 500k monthly organic traffic. The site currently has both http and https urls in Google's index. The website has not formally converted to https. The https began with an error and has evolved unchecked over time. Both versions of the site (http & https) are registered in webmaster tools so I can clearly track and see that as time passes http indexation is decreasing and https has been increasing. The ratio is at about 3:1 in favor of https at this time.  Traffic over the last year has slowly dipped, however, over the last two months there has been a steady decline in overall visits registered through analytics. No single page appears to be the culprit, this decline is occurring across most pages of the website, pages which traditionally draw heavy traffic - including the home page.  Considering that Google is giving priority to https pages, could it be possible that the split is having a negative impact on traffic as rankings sway? Additionally, mobile activity for the site has steadily increased both from a traffic and a conversion standpoint. However that traffic has also dipped significantly over the last two months.  Looking at Google's mobile usability error's page I see a significant number of errors (over 1k). I know Google has been testing and changing mobile ranking factors, is it safe to posit that this could be having an impact on mobile traffic? The traffic declines are 9-10% MOM. Thank you. ~Geo

    | Geosem
    0

  • My client is generating eBay templates based on content he has on his eCommerce platform. I'm 100% sure this will cause duplicate content issues. My question is this, and I'm not sure where eBay policy stands on it: will adding the canonical tag to the template work if it's coming from a different page, i.e. eBay? Update: I'm not finding any information about this in eBay's policies: http://ocs.ebay.com/ws/eBayISAPI.dll?CustomerSupport&action=0&searchstring=canonical So it does look like I can have a rel="canonical" tag in custom eBay templates, but I'm concerned this could be considered "cheating", since rel="canonical" acts much like a 301; but as this says: http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html it's legitimately duplicate content. The question now: should I add it or not? UPDATE: it seems eBay templates are embedded in an iframe, but the snapshot on Google actually shows the template. This makes me wonder how they are handling iframes now. Looking at http://www.webmaster-toolkit.com/search-engine-simulator.shtml it does show the content inside the iframe. Interesting. Anyone else have feedback?

    | joseph.chambers
    1

  • For reasons I won't get into here, I need to move most of my site to a new domain (DOMAIN B) while keeping every single current detail on the old domain (DOMAIN A) as it is. Meaning, there will be 2 live websites that have mostly the same content, but I want the content to appear to search engines as though it now belongs to DOMAIN B. Weird situation. I know. I've run around in circles trying to figure out the best course of action. What do you think is the best way of going about this? Do I simply point DOMAIN A's canonical tags to the copied content on DOMAIN B and call it good? Should I ask sites that link to DOMAIN A to change their links to DOMAIN B, or start fresh and cut my losses? Should I still file a change of address with GWT, even though I'm not going to 301 redirect anything?

    | kdaniels
    0

  • So here is the situation. I'm working on a site that offers "Best Of" / Top 10 list-type content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list live. Ideally the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle this: Put a "View New List" banner on the old page Make sure all internal links point to the new page Rel=canonical tag on the old list pointing to the new list Does this seem like a reasonable way to handle this?

    | jim_shook
    0

  • We have a medium-sized site that lost more than 50% of its traffic in July 2013, just before the Panda rollout. After working with an SEO agency, we were advised to clean up various items, one of them being that the 10k+ URLs were all mixed case (i.e. www.example.com/Blue-Widget). A 301 redirect was set up thereafter, forcing all these URLs to go to a lowercase version (i.e. www.example.com/blue-widget). In addition, a canonical tag was placed on all of these pages in case any parameters or other characters were incorporated into a URL. I thought this was a good setup, but when running an SEO audit through a third-party tool, it shows a massive number of 301 redirects. Now I wonder if there should only be a canonical without the redirect, or if it's okay to have tens of thousands of 301 redirects on the site. We have not recovered from the traffic loss yet, and we are wondering if it's really more of a technical problem than a Google penalty. Guidance and advice from those experienced in the industry is appreciated.

    | ABK717
    0
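The lowercase normalisation itself is a one-hop rewrite. A sketch (function name hypothetical) that issues a single 301 and leaves already-lowercase URLs untouched, so canonical URLs never pass through a redirect:

```python
def lowercase_redirect(path: str):
    """Return (status, location): 301 to the lowercase path when the
    request contains uppercase characters, 200 on the path otherwise."""
    lowered = path.lower()
    if lowered != path:
        return 301, lowered
    return 200, path
```

With this pattern, the redirects only fire for legacy mixed-case requests; internal links should point at the lowercase form directly, so crawlers rarely encounter the 301 at all.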

  • One of my clients is using a scholarship to build links. We have a nofollow PR campaign getting ready to start and are doing some social marketing for the scholarship page on the site. We are also trying to get backlinks from highschools and colleges that link to scholarship opportunities. So far this has been a slow process. Does anybody have any advice for speeding any of this up? Has somebody ever done a campaign like this before? Is there some kind of database with financial aid contact info for a lot of schools? I contact a lot of schools and always tend to get put on the backburner.

    | Atomicx
    0

  • If an image file name and the HTML page name are the same, does that count as spammy content? E.g. watertreatment-plan.jpg and watertreatment-plan.html

    | Poojath
    0

  • Hi, has anyone noticed across the board drops in DA in specific industries lately? If you're tracking 5 companies in one industry it seems odd that all five would drop 2-3 DA points all at once don't you think? Just peculiar I think and would love some insight if there is some to be had / shared.

    | wearehappymedia
    0

  • Hi all, I'm in discussion with a client who wishes to introduce a 'refurbished' products section to their website. This section will effectively replicate the structure of the 'brand new' products section. Unusually, the key difference will be the fact that the 'refurbished' products section will feature significantly more products than the 'brand new' section, in the region of four times as many. As a guide, the website currently stocks approximately 200 products across 8 core product areas. We have recommended that the two sections be combined in order to prevent the creation of two separate product hierarchies, with 'brand new' / 'refurbished' products segmented via filter functionality. However, the client is set on having two separate product hierarchies, i.e. a 'refurbished' section within a completely separate directory. Just wanted to crowdsource opinion, in addition to gaining insight from anyone who has experience of a similar request. What solution did you implement? My feeling is that there is a high likelihood over time of the 'refurbished' section growing in authority and starting to outrank the 'brand new' products section. Not to mention a key missed opportunity to group and build authority / content within one product hierarchy. All thoughts and opinions much appreciated!

    | 26ryan
    0

  • Hi there, I've been working in SEO for more than five years, and I'm always telling clients about the 200+ factors that influence rankings, but sometimes I come across URLs or websites that haven't optimized their pages or built links and still appear first. This is the case for the keyword "Escorts en Tenerife" on google.es. If you search that keyword on google.es you'll find this URL: escortislacanarias.com... (I don't want to give them a link). My question is why the heck this URL ranks first on Google for that keyword when the URL isn't optimized, the page content isn't optimized, and it hasn't got many or valuable incoming links. Run an on-page grader on that URL for that keyword and it gets an F! So there is no correlation between better optimization and good rankings?

    | Tintanus
    0

  • Hi All, We have been streamlining our site and got rid of thousands of pages for redundant locations (basically these used to be virtual locations where we didn't have a depot, although we did deliver there, and most of them were duplicate/thin content, etc.). Most of them have little if any link value, and I didn't want to 301 all of them, as we already have quite a few 301s. We currently display a 404 page, but I want to improve on this. The current 404 page is http://goo.gl/rFRNMt I can get my developer to change it so that it is still a 404 page, but the user sees the relevant category page instead. It would look like this: http://goo.gl/Rc8YP8 We could also use JavaScript to show the location name, etc. Would that be okay, or would Google see it as cheating? Basically, I want to lower the bounce rate from these pages while remaining attractive enough for the user to continue on the site and not go away. If this is not a good idea, then any recommendations on improving our current 404 would be greatly appreciated. Thanks, Pete

    | PeteC12
    0

  • Hey everybody! I was wondering what the difference is between the H tags and "H Style". My first thought is that it's just the style guide, and not actually an HTML heading tag, but before I go around changing all these styles I want to make sure my computer isn't going to explode SEO juice. Thanks!

    | HashtagHustler
    0

  • Hi guys, I own a photographic website, www.hemeravisuals.co.uk, and when I created it I wasn't aware of the world of SEO, alt tags, labelling your images, etc.
    Would it be wise to re-upload my site's images (100 in total), since I cannot rename the files on my WordPress site, though it does allow me to add alt text, captions, etc.? Or should I just add what data I can to the images already on the site? Would it be worthwhile in terms of search and PageRank?

    | hemeravisuals
    0

  • Hi guys, I run my own photography website (www.hemeravisuals.co.uk) and am going through the process of optimizing my pages for SEO. I have one question: I have a few gallery pages with no text. Do I still have to optimize these? Would it rank my site lower if they weren't optimized? And how can I do this successfully with so little text on these pages? (I have in-depth text on these subjects on my services & pricing pages.) Kind regards, Cam

    | hemeravisuals
    0

  • Hello, We are interested in finding someone experienced in SEO who is willing to look at the work I am doing for my employer, on an ongoing basis. This would help me ensure that I am on the right track and that we are doing the right thing. Even though I have some knowledge of SEO, I am still a novice in the field. I also have a background in marketing and management. I believe that having someone to mentor me would help me grow professionally, as I feel very passionate about SEO, but also help me deliver good work to my employer. Someone's experience, reviews, and suggestions about our campaigns could help, and I would not expect someone to give their time freely; therefore, if required, we could talk about a financial arrangement. Thank you for your time, and looking forward to your replies. Monica

    | monicapopa
    1

  • Is there a consensus in the SEO world around the best practice on how to treat the multiple auto-generated links for a domain? With a lot of the link profiles we have been analyzing nearly 70% volume of the backlinks relate to these auto generated links (e.g. similarweb.com, informer.com, webstatsdomain.org etc) I can see arguments for disavowing them (low-quality links) as well as keeping them (skew anchor text distribution towards URL mentions, natural link profile) but would be interested if people have run experiments or prefer strongly one way or the other.

    | petersocapro
    1
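If anyone does opt to disavow such domains, the file format is at least straightforward. A hypothetical fragment using domain-level rules (the hosts listed are just the stats aggregators mentioned above):

```text
# disavow.txt - uploaded via Google's disavow links tool.
# Lines starting with '#' are comments; a "domain:" rule
# covers every URL on that host.
domain:similarweb.com
domain:informer.com
domain:webstatsdomain.org
```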

  • Hi all, Does the hosting of a website affect your SEO? We are on dynamic hosting currently; taking into account your knowledge and expertise, do you believe that this can affect SEO in any way? Thank you for your time. Good day. Monica

    | monicapopa
    2

  • Hello! We have a website that is built using Asp.net. My colleague and I are wondering whether or not changing the framework from Asp.net to php or html would have any negative impact on current rankings. My colleague was told by an SEO company that doing this would have a big negative effect, but we just can't see why that would be. The URLs of the site do not have an .asp extension, so we don't feel there would be any issues with 404s after the migration. The content, meta data and URL structure would remain the same. We posted this question in the Webmaster Central Forum and were told by a top contributor that it wouldn't have any negative impact, but we wanted a second opinion here. Thanks!

    | BBEXNinja
    0

  • Good Morning! I have a handful of pages that are not ranking very well, if at all. They are not driving any traffic, and are realistically just sorta "there". I have already determined I will not be bringing them over to our new web redesign. My question, could it be in our best interest to try and save these pages with ZERO traction and optimize them? Re-purpose them? Or does having them on our site currently muddy up our other pages? Any help is greatly appreciated! Thanks!

    | HashtagHustler
    0

  • One of my clients is a glazing company. We found out that all his local pages (city-related pages, with titles like 'glazing new york') show an old post date in Google, for example "1 June - glazing etc.". I understand that the influence of freshness depends on the topic, but isn't it bad when dates are too far in the past? Sorry for my English - we're a European company 🙂

    | remkoallertz
    0

  • I manage a website that I took over 6 months ago - the site was sitting happily on page one of google so I haven't had to do much to keep it there - other than a few onsite improvements. However, last week the site dropped off the SERPs. The site is http://www.pro-techairconditioning.co.uk/content/home.html Could someone please suggest reasons for this and ways to solve the problem? Thanks

    | SWD.Advertising
    0

  • Hey! I am working on a Penguin-hit website. It still ranks for all brand keywords, and blog articles are still being returned in Google SERPs, but the website is showing up for only 3 or 4 money keywords. It is clearly a Penguin hit, as it ranked on page 1 for all money keywords before the latest update (3.0). We already did a link cleanup and disavowed all bad backlinks. Still, the recovery process could take over 2 years based on previous experience, and in 2 years the site will suffer a slow death. Solution: we own the .com version of the domain; the site is currently served on the .net. We bought the .com version about 6 years ago; it is clean and NOT redirected to the .net (actual site). We are thinking about moving the whole website to the .com version to start over. However, we need to make sure Google doesn't connect the two sites (no PageRank flow). Of course Google will notice it is the same content, but there won't be any PageRank flowing from the old site to the new one. For this, we thought about the following steps: Block Googlebot (and only Googlebot) for the .net version via robots.txt. Wait until Google removes all URLs from the index. Move the content to the .com version. Set a 301 redirect from .net to .com (without EVER removing the block on Googlebot). Thoughts? Has anyone gone through this before? Other ideas? Thanks!

    | FedeEinhorn
    0
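A minimal robots.txt sketch of the blocking step described in the post above — disallowing only Googlebot on the .net version while leaving every other crawler untouched:

```
# robots.txt on the .net site: block Googlebot only
User-agent: Googlebot
Disallow: /

# every other crawler may continue as normal
User-agent: *
Disallow:
```

Note that a robots.txt disallow stops crawling rather than guaranteeing removal from the index, so the "wait until Google removes all URLs" step can take a long time.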

  • Hi, If a domain has been parked for more than 12 years and has never been used for a project so far, does this have an impact on SEO, or is it like having a fresh new domain? Sebi

    | TheHecksler
    0

  • We have been diligently managing our index size in Google for our sites and are returning a 410 status code for pages that we no longer consider "up-to-date" but that still carry value for users to access, so that Google removes them from our index and keeps it lean. However, we have been receiving GWT warnings across sites because of the 410 status codes Google is encountering, which makes us nervous that Google could interpret this approach as a lack of quality on our site. Does anyone have a view on whether the 410 approach is right for the given example, or whether we should consider simply using 301s or another status code to keep our GWT errors clean? Further notes: there is hardly ever any link juice being sent to those pages, so it is not like we are missing out on that; the pages for which we return 410 are also marked as noindex and nofollow.

    | petersocapro
    0
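For reference, returning a 410 for a retired section is a one-line rule on Apache. A sketch assuming mod_rewrite and a hypothetical /retired/ path (the path is not from the post):

```
# .htaccess sketch: answer "410 Gone" for anything under /retired/
RewriteEngine On
RewriteRule ^retired/ - [G,L]   # the [G] flag sends 410 Gone
```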

  • Hi there guys, I have a question about redirection. My boss has just bought a new domain name and he wants it to redirect to our current site when people are looking for specific products. www.example.com is our current website; www.productname.com is the new domain. So the new domain would be redirected to example.com. Would that be considered against Google's policies? Thanks

    | PremioOscar
    0
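A domain-level redirect of the kind described above is typically done at the server. A sketch assuming Apache, using the poster's placeholder domains:

```
# .htaccess on www.productname.com: permanently redirect
# every request to the same path on the main site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?productname\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 (permanent) redirect is the standard way to point an extra domain at a main site.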

  • Hello, I know that one of the 'technical requirements' to get into Google News is that the URLs have unique numbers at the end, BUT that requirement can be circumvented if you have a Google News sitemap. I've purchased the Yoast Google News sitemap plugin (https://yoast.com/wordpress/plugins/news-seo/) BUT just found out that you cannot submit a Google News sitemap until you are accepted into Google News. Thus, my question is: do you need to add the digits to the URLs temporarily until you get in and can submit a Google News sitemap, or is it OK to apply without them and take care of the sitemap after you get in? If anyone has any other tips about getting into Google News, that would be great! Thanks!

    | stacksnew
    0

  • Hi all, When I search for keywords concerning "little wannahaves", the meta description in attachment 1 appears. This is, however, not the meta description I entered. When I search for "site:littewannahaves.nl", the right meta description appears; see attachment 2. Does anyone know why these two differ and how I can fix this? According to Webmaster Tools there should not be any error. Thanks in advance! (Attachments: P3FMNzP.png, nkDXqRc.png)

    | U-Digital
    0

  • I am trying to understand how to build good backlinks. I've read or been told that over-optimized anchor text is bad (understood), links from websites on unrelated subjects are bad (understood), paid-for links are bad (?), submitting an article just to get the link is bad (?), and link farms/directories are bad (?). This is actually really confusing to me! Please bear with my silly questions; I am an engineer with an MBA and a stay-at-home/homeschooling mom who helps run my husband's business. My brain gets full fast! Ha! I thought maybe if I post some of the links we have built, someone could look at them and tell me if they are good or bad. That way I can go forward with a better understanding.

    Articles... I thought that writing articles was a GOOD thing! Now I see that submitting articles isn't? So what about these two articles, where my husband, as an expert in his field, wrote an article for a popular mag/blog: http://coastalanglermag.com/red-snapper-season/ (they did the hyperlink wrong and I am trying to get the publisher to fix it) http://www.bdoutdoors.com/article/shark-encounter-capt.-gregg-rapp/ (this is a pretty funny story if you read it)

    Blog stories... what about when someone copies your blog post as a story on their website, or someone scrapes your content? http://jeremyrymill.tumblr.com/ http://www.bdoutdoors.com/article/heart-pressure/ http://www.sportfishermen.com/board/showthread.php?t=2627767&p=3294972 (we post fishing reports and they get copied)

    Unrelated directories... http://www.mobileresources.net/mobi/Glowing_Deep_Sea_Fish/ (I didn't create this!!! I guess they just crawl and link?)

    Related "farm" directories: http://www.sportsmansresource.com/ffishtarpon.htm

    Other directories/citations such as yellow pages, city search, etc. These are okay, right? Local hotels and whatnot... concierges link to me, I don't even ask them to: http://laquintacocoabeach.com/things-to-do/index.cfm

    Paid - I pay for Moz Local! Isn't that paying??? AND I paid for a whois listing on enom... is that bad too?

    NOW FOR THE BAD... these are bad, right? http://darman.niniyoo.com/Sea-Leveler-Sport-Fishing-Charters-Inc-FL.html (I have no idea how this got there) http://teensnow.comwww.whoisbusinesslistings.com/Other/10/23490/Sea-Leveler-Sport-Fishing-Charters.html (what the heck is this??? How can they even append a website on a website?) http://forums.gq.com.au/member.php?u=1813973 (I think this was created on my behalf by my part-time SEO)

    If you have read all the way down to here, my sincerest thanks! Thank you for your time!

    | CalicoKitty2000
    0

  • We are changing our homepage (and gradually the rest of the site) to AngularJS.
    In order not to lose anything in terms of SEO, we are implementing hashbangs plus escaped-fragment snapshots. Are there any other SEO considerations you think we should keep in mind, and/or additional elements that we could add to the page to improve it in terms of SEO?

    | theLotter
    0
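For context on the hashbang approach mentioned above: under Google's AJAX crawling scheme (since deprecated), the crawler rewrites a `#!` URL into a `_escaped_fragment_` query parameter and expects a pre-rendered HTML snapshot at that address. A minimal sketch of the mapping, using a hypothetical URL:

```python
from urllib.parse import quote

def hashbang_to_crawler_url(url: str) -> str:
    """Map a #! URL to the _escaped_fragment_ URL the crawler requests."""
    if "#!" not in url:
        return url  # no hashbang: the crawler fetches the URL as-is
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

print(hashbang_to_crawler_url("http://example.com/#!/products"))
# -> http://example.com/?_escaped_fragment_=%2Fproducts
```

The server must answer the rewritten URL with the fully rendered HTML of the `#!` state for the snapshot approach to work.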

  • Hi - our website's sitemap is pretty huge, and I'm trying to generate it with the hreflang information in it, because we have 11 different language sites all under the .com. I used the Media Flow generator for this purpose, but it returned a lot of entries with a blank tag. Our U.S. website by far has the most pages, so an example of what I'm getting is: Does this look correct? It doesn't look right to me, but I'm unsure.

    | Jenny1
    0
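For comparison with the generator's output described above: in a well-formed hreflang sitemap, every `<url>` lists its own `<loc>` plus one `<xhtml:link rel="alternate" hreflang="…">` element per language version, itself included, and hreflang must never be blank. A sketch that builds one such entry — the URLs and language codes are invented examples:

```python
# Hypothetical page available in two languages (invented URLs).
ALTERNATES = {
    "en-us": "http://www.example.com/page.html",
    "ar": "http://www.example.com/ar/page.html",
}

def url_entry(loc: str, alternates: dict) -> str:
    """Build one sitemap <url> element with hreflang alternates."""
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        for lang, href in alternates.items()
    )
    return f"  <url>\n    <loc>{loc}</loc>\n{links}\n  </url>"

entry = url_entry(ALTERNATES["en-us"], ALTERNATES)
print(entry)
```

The enclosing `<urlset>` also needs `xmlns:xhtml="http://www.w3.org/1999/xhtml"` declared for the `xhtml:link` elements to validate.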

  • Is there anyone here whom I can pay to give me a deep analysis of my website and my competitors, with recommendations on what to do? I am a small business and I cannot afford expensive monthly SEO fees. I can probably afford a one-time consulting fee, and then I can do the work myself. Or maybe I can pay à la carte for some of the fixes. I understand this may not be something SEOs want to do, since they make their money off doing the work and may not want to share trade secrets. I just thought I would throw that question out there. I've been working on trying to SEO my site for a year now... I was improving and happy with my progress until October, when I lost 30 positions across my tracked keywords. I have no idea why. I'm kind of at my wits' end! 😞

    | CalicoKitty2000
    0

  • We have two versions of our website, English and Arabic. The Arabic version is in a sub-directory, www.oncarx.com/ar (not a subdomain). When we started SEO, we found that most of our articles were plagiarized and contained non-original content. We are almost done with the optimization of our Arabic articles. The English site contains too many articles, and they are not optimized for SEO. To get good results and speed up the process, which of the following ideas is better? 1) We separate the English and Arabic sites. 2) We remove most of the articles from the English site. If there is any other good option, please let us know.

    | sagha
    0

  • Hello, For some reason, Google is giving us sitelinks for the wrong spelling of our domain. Our site is stackstreet(.com) and our company is named 'StackStreet'. Instead of showing sitelinks for the search 'StackStreet', Google is showing them for 'Stack Street' (with a space). Any ideas? This spelling does not exist anywhere within our source code. Thanks!

    | stackstreet
    0

  • Hi All, Would you recommend displaying charts and graphs as images or as HTML5 (Highcharts etc.)? Thanks

    | BeytzNet
    0
