
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello guys! I have a situation with a website and I need some opinions. Today the structure of my site is as follows (I have had this architecture for many years):
    Main country home (www.mysite.com.tld)
        Product_1 home (www.mysite.com.tld/product1/)
            Product_1 articles: www.mysite.com.tld/product1/product1_art1, www.mysite.com.tld/product1/product1_art2, www.mysite.com.tld/product1/product1_artx
        Product_2 home (www.mysite.com.tld/product2/)
            Product_2 articles: www.mysite.com.tld/product2/product2_art1, www.mysite.com.tld/product2/product2_art2, www.mysite.com.tld/product2/product2_artx
    I have several TLDs, each with its own main home and products. We are thinking of modifying this structure and starting to use subdomains for each product (the IT guys need this approach because it makes distributing the server load simpler). I am not very comfortable with subdomains, and big changes like this can always cause problems (even if the SEO migration itself is handled well, issues such as ranking drops can still appear). For technical reasons, though, the proposed solution requires mixing directories and subdomains for each product, leaving the structure like this:
    Main country home (www.mysite.com.tld)
        Product_1 home (www.mysite.com.tld/product1/)
            Product_1 articles: product1.mysite.com.tld/product1_art1, product1.mysite.com.tld/product1_art2, product1.mysite.com.tld/product1_artx
        Product_2 home (www.mysite.com.tld/product2/)
            Product_2 articles: product2.mysite.com.tld/product2_art1, product2.mysite.com.tld/product2_art2, product2.mysite.com.tld/product2_artx
    So each product home will stay in a directory, but that product's article pages will live on a subdomain. What do you think about this solution? Assuming the SEO migration itself is done properly (301s, etc.; a redirect sketch follows below), could this still bring us ranking difficulties, or can the change be made without any particular concern? Thanks very much! Agustin

    | SEOTeamDespegar
    0
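
    A minimal sketch of the kind of 301 mapping such a migration would need, assuming Apache and the placeholder URL patterns from the question (the exact rules depend on the real article paths):

        # Hypothetical .htaccess rules on www.mysite.com.tld sending old article
        # paths to the new product subdomains; patterns are illustrative only.
        RewriteEngine On
        RewriteRule ^product1/(product1_.+)$ http://product1.mysite.com.tld/$1 [R=301,L]
        RewriteRule ^product2/(product2_.+)$ http://product2.mysite.com.tld/$1 [R=301,L]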

  • Does anyone know whether using a wide responsive layout that brings content well above the fold on big screens (but still pushes it down on small screens and mobile devices) is a good option? We have an AdSense site that just got destroyed, and I'm assuming it's the new Google algorithm that targets sites with too many large ads above the fold.

    | iAnalyst.com
    0

  • Hi, The site I am working on currently uses numerous pages for search terms with similar keywords: vehicle wrapping / vehicle wraps / car wrapping / car wraps / van wrapping / van wraps, etc. Obviously I want to consolidate these into one high-authority page covering all of the terms. At present the "car wraps" page is ranking for quite a few of them. Am I best to stick with that page, or should I choose the highest-volume term, "car wrapping", and pass the dribbles of link juice from the rest (including "car wraps") onto it? This is aimed at a local demographic, so the local terms will be worked in too, unless you think Google Places pages will work in our favour? Many thanks,

    | Lee4dcm
    0

  • My website's FAQ section has a lot of detailed answers, and I want to republish most of them individually on my blog. Example: I may have 30 FAQs and want to post 28 of them as individual blog posts, since they could bring in good additional search traffic. Question: how do I deal with duplicate content issues? Do I include a canonical? The FAQs all sit on the same URL, not separate URLs, which means each blog post would only represent a small percentage of the entire FAQ section, though each blog post would be a 100% copy of one FAQ. (See the canonical sketch below.)

    | khi5
    1
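
    For reference, a canonical pointing a republished blog post back at the FAQ page would look like the following; the URLs are placeholders, and a cross-page canonical of this kind is a hint to Google rather than a directive:

        <!-- In the <head> of the republished blog post (hypothetical URL) -->
        <link rel="canonical" href="http://www.example.com/faq/" />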

  • "Noindex" is a suggested pagination technique here: http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284, and everyone seems to agree that you shouldn't canonicalize all pages in a series to the first page, but I'd love if someone can explain why "noindex" is better than a canonical?

    | nicole.healthline
    0
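
    For clarity, the noindex approach referenced in that article places a robots meta tag like this on pages 2+ of the series; "follow" keeps the links on those pages crawlable (a minimal sketch):

        <!-- On page 2, 3, ... of the paginated series -->
        <meta name="robots" content="noindex, follow">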

  • Hello, A while ago (Sep. 19, 2013) we rolled out a new URL structure for the product pages within our website (with all the needed 301 redirects in place and internal links & sitemaps updated), but our new URLs lost the rankings of the old ones and with that we experienced a big traffic drop (and since September I can't see any sign of recovery).
    Here are three examples of old URLs and their corresponding new URLs:
    http://www.nobelcom.com/phone-cards/calling-Mexico-from-United-States-1-182.html -> http://www.nobelcom.com/Mexico-phone-cards-182.html
    http://www.nobelcom.com/es/phone-cards/calling-Mexico-from-United-States-1-182.html -> http://www.nobelcom.com/es/Mexico-tarjetas-telefonicas-182.html
    http://www.nobelcom.com/phone-cards/calling-Angola-Cell-from-Canada-55-407.html -> http://www.nobelcom.com/Angola-Cell-phone-cards/from-Canada-55-407.html
    We followed every SEO/usability rule and have no clue why this happened. Any idea? Cheers,
    S.

    | Silviu
    0

  • Hi, We are currently in the process of redeveloping our ecommerce website and I am wondering whether we should use a CDN (content delivery network) for our product images. My category pages currently show approximately 21 product images per page; the page speed is okay but could be better, and the page size is rather large, anything between 600 KB and 1 MB. We already optimise the images in Photoshop, and we also minify etc. to get the pages to load as fast as possible, but I think the only thing left is a CDN, and I have heard mixed reports about using one. We are also building a mobile responsive version of the site, and I know that speed will be king with Google and how it reflects on rankings. While I can see that a CDN will improve image load speed, I am guessing there is a negative SEO impact as well, since the images will be stored in the cloud as opposed to on my own site/database? Does anyone know how best to implement a CDN without impacting SEO, or know of any good SEO/implementation articles on this? Maybe I should leave some images on my category pages so I can still use the image alt tags, and have the remaining images on the CDN? (See the sketch below.) Many thanks Sarah

    | SarahCollins
    0
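
    Worth noting for the alt-tag concern above: the alt attribute lives in the category page's HTML, not in the image file, so it stays on your page even when the file itself is served from a CDN hostname. A minimal illustration with placeholder URLs:

        <!-- Image file hosted on a CDN, alt text still in the category page's HTML -->
        <img src="http://cdn.example.com/images/product-1234-front.jpg" alt="Blue widget, front view">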

  • My company is looking to hire an intern, and a university (".edu" domain) has included the job description on an "Available Internships" page that also lists 100 other companies looking for interns. Our profile includes a link to our homepage, like all the other listings. Question: will search engines give zero value to such a link because they can tell it is an internship listings page, or is there a good chance it will actually be quite a valuable link?

    | khi5
    0

  • We've been outsourcing our link building to India for the past 3 years and the results were pretty good up until the beginning of this year. What they were essentially doing was putting links into directories, a few per month, and posting a few articles per month. Out of our top 10 keywords, 8 got into the top 10. Then something happened around Jan 1 last year: our rankings started dropping, falling out of the top 50 before settling around 20-30ish. We have disavowed most of the low-quality links since then. Also, very odd: all the top-ranking competitors fell as well (including us) and were replaced by less "specialized" companies that sell a broad range of products (for example, all parts of the car, rather than someone who just focuses on mufflers). There are other differences too, but again I can't put my finger on them. I'd like to find someone who can do a detailed audit of our site and our competitors, explain what caused the drop, and explain why the sites now in the top positions rank so high. I really don't have time to do an audit myself. Our site is American Hospitality Furniture dot com. Any feedback would be appreciated. Thanks in advance.

    | AHH888
    0

  • Hello, With WordPress MultiSite, does Domain Mapping negatively impact search rankings? I am wondering if the search engines can tell if the Domain is part of a MultiSite Network. Or does it just see the site as a regular website? I understand the issue of IP Address and C Blocks but I'm wondering if the search engines will treat a Mapped Domain Name as it would any other website that is on a shared hosting account. Thanks

    | bronxpad
    0

  • Hello, We are currently adding a new section of content on our site related to Marketing and more specifically 'Digital Marketing' (research reports, trend studies, etc). Over time (several months, or 1-3 years) we will add more 'general' marketing content. My question is which of the following URL structures makes more sense from an SEO perspective (and how best to quantify the benefit of one over another): www.mysite.com/marketing/digital/research/... www.mysite.com/digital-marketing/research/.. Thanks, Mike

    | mike-gart
    0

  • Hello All, I am undertaking the daunting task of a link removal campaign. I've got a pretty good plan for my workflow in terms of doing the backlink research, gathering contact information, and sending the email requests. Where I'm a bit stuck is in tracking the links that actually get removed. Obviously if someone replies to my email telling me they removed it, that makes it pretty clear. However, there may be cases where someone removes the link but does not respond. I know Moz has a ton of link tools (which I'm still getting familiar with). Is there a report or something I can generate that would show me links that existed previously but have now been removed? If Moz cannot do it, does anyone have a recommendation for another tool that can track links and tell me whether or not they have been removed? Thanks!

    | Lukin
    0

  • I have recently launched a website which is using a free sitemap generator (http://web-site-map.com/). It's a large travel agency site (www.yougoadventure.com) with predominantly dynamically generated content: users can add their products as and when, and be listed automatically. The guy doing the programming for the site says the sitemap generator is not up to the job and that I should be ranking far better for certain search terms than the site is now. He reckons it doesn't provide lastmod info and that the sitemap should be resubmitted every time a new directory is added or a change is made. He seems to think I need to spend £400-£500 for him to custom-build a sitemap. Surely there's a cheaper option out there for a sitemap that can be regenerated daily, or that 'pings' Google every time an addition is made to the site or a product is added? Sorry for the non-tech speak; I've got my web designer telling me one thing and the programmer another, so I'm just left trawling through Q&As. (See the sketch below for what lastmod looks like.) Thanks

    | Curran
    0
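
    For reference, the lastmod the programmer mentions is just a per-URL date inside the sitemap XML, and many off-the-shelf generators and CMS plugins can populate it; a minimal sketch with a placeholder path:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.yougoadventure.com/some-new-product/</loc>
            <lastmod>2014-02-01</lastmod>
          </url>
        </urlset>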

  • How do you get these to appear - http://imgur.com/xV5LA6E Does a website have any control over what appears?

    | EcommerceSite
    0

  • Hi all, we have product category pages on our ecommerce web site and we also produce blog content (such as buyers' guides, setup guides etc.) to help with ranking and give our site some good-quality, unique content. However, we sometimes find that the buyers' guide / blog content gets ranked by Google ahead of our product category page. I'm hoping that if I give an example or two, someone smart out there may be able to point me in the right direction as to how we can avoid this and get the product category page ranked instead. You will see from my examples that we are linking internally, using the keywords from the buyers' guides, to the product category pages in order to show Google the most important page for these keywords, and that we are trying to structure the product category pages as well as possible to make each one the most optimized page for its term. Example: Keyword "twin dvd player"... product category page: http://www.3wisemonkeys.co.uk/dvd/portable-dvd-player-car/twin-dvd-player/ ... blog page actually getting ranked for this keyword: http://www.3wisemonkeys.co.uk/advice-center/dual-screen-and-twin-dvd-player-explained/ Keyword "site radio"... product category page: http://www.3wisemonkeys.co.uk/audio/radio/site-radio/ ... blog buyers' guide page actually getting ranked for the keyword: http://www.3wisemonkeys.co.uk/advice-center/Site-radio-buying-guide/ Any help / pointers appreciated. Thanks.

    | jasef
    0

  • Hello everyone, Due to the Penguin update my site unfortunately took a bit of a hit. A little while ago I submitted all of our questionable/bad links to the disavow tool; however, I still wanted to go back and delete any and all problematic links that are still out there. I've looked into many services, but I haven't been too impressed. Removeem - the email addresses they provided weren't always valid, and their email tool didn't always deploy correctly; it meant a lot of cross-referencing and was not saving me any time. Link Detox - the free trial was a bust. They show you 10 links on the free trial, but for me 9 of the 10 were all the same, so I couldn't get a good feel for their system. Rmoov - their tool is one where you upload your own links and they help manage everything, but they DON'T allow you to email through their system, so I'm not sure how this helps my process if I have to do everything manually anyway. A lot of services I see also take a full-service approach and charge based on how many links they remove, which can get quite costly. I have also contacted:
    Link Delete - no response to multiple email requests
    Linkquidator - no response
    Infatex - no response
    My questions to all of you are: Is there any company out there that you recommend that provides a self-service tool (online or desktop)? Is this even an avenue I should explore, or should I compile my own list (as third-party algorithms are not always accurate) and reach out to sites manually? Is disavowing good enough, and am I just spinning my wheels trying to get them all removed? Thanks!

    | Lukin
    0

  • Hi All, I want to discuss a strategy I applied on my website to dilute the value of the toxic links coming into it.
    Issue: poor, spammy links were created to the home page and some inner pages. Google considered those links unnatural and took manual action, and all rankings disappeared.
    Strategy:
    Deleted all landing pages that were heavily linked from the spam websites.
    Created new landing pages with some modifications and new content.
    Because the home page (index.html) was also penalized by Google, I made changes to index.html and put a noindex, nofollow tag on it so that the bad links' value couldn't pass from index.html to the other inner pages (the newly created pages and the pages that were not over-optimized).
    Created a new index.php page and gave users the option to enter the website from index.html (the default home page) through to index.php.
    Blocked all the bad URLs (unnatural links) through the .htaccess file, so that when a user or Googlebot comes through one of those blocked URLs, the server responds with 403 (Access Denied).
    The domain for which I ran the above experiment is http://www.thebaildepot.com/index.php
    Now, I have doubts about the points below (a sketch of the .htaccess blocking follows this question):
    Will blocking unnatural links (403, Access Denied) via the .htaccess file really work?
    Is noindex, nofollow on the default page, with the option for users to navigate to the newly created index.php, a sound approach?
    I ran this experiment around 10 days ago and the rankings still haven't returned to Google's top 100.

    | RuchiPardal
    0
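
    One common reading of "blocking visits that come through those URLs" is referrer-based blocking, which under Apache mod_rewrite looks roughly like the sketch below (the spam domain is a placeholder). Whether this helps with an unnatural-links penalty is a separate question, since the links themselves still exist on the other sites.

        # Hypothetical rules returning 403 to visits referred by a known spam domain
        RewriteEngine On
        RewriteCond %{HTTP_REFERER} spam-domain\.example [NC]
        RewriteRule .* - [F,L]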

  • I implemented rel=canonical on our pages, which helped a lot, but my latest Moz crawl is still showing lots of duplicate page titles (2,000+). There are other ways to get to these pages (depending on which feature you clicked, the page will have a different URL) but they have the same page title. Does having rel=canonical in place fix the duplicate page title problem, or do I need to change something else? I was under the impression that the canonical tag would address this by telling the crawler which URL was the URL, and that the crawler would only use that one for the page title. (See the sketch below.)

    | askotzko
    0
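
    For context, a canonical on parameterized URLs is a consolidation hint for the index; the alternate URLs can still be crawled, and a crawl tool can still report their (identical) titles. A minimal sketch with placeholder URLs:

        <!-- On every parameter/feature URL that renders the same underlying page -->
        <link rel="canonical" href="http://www.example.com/widgets/" />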

  • Hi Mozzers, Just recently we acquired a domain (www.nhacaribbean.com) for marketing purposes. Our technical staff used a frame forward to redirect the domain to the landing page http://www.nha.nl/alles-over-nha/Caribbean.aspx, which is only linked in the sitemap (not in the navigational structure of the site). Now, I'd personally just redirect the domain with a 301 (see the sketch below), but our CEO really wants to keep the domain www.nhacaribbean.com visible in the URL bar. My question is: could this (potentially) really hurt rankings for our website one way or the other? I'd love to hear from you guys. Thanks in advance.

    | NHA_DistanceLearning
    0
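
    For comparison with the frame forward, the 301 alternative mentioned above would look roughly like this, assuming the vanity domain is served by Apache (a sketch only):

        # Hypothetical rule on nhacaribbean.com sending every request to the landing page
        RewriteEngine On
        RewriteRule ^ http://www.nha.nl/alles-over-nha/Caribbean.aspx [R=301,L]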

  • We have an article that ranks #1 in Google SERP for the keyword we want it to rank for. We decided to revise the article because although it's performing well, we knew it could be better and more informative for the user. Now that we've revised the content, we're wondering: Should we update the article author (and the G+ authorship markup) to reflect that the revisor authored the content, or keep the original author listed? Can changing G+ authorship on an article impact its search ranking, or is that an issue that's a few Google algorithm updates down the road?

    | pasware
    0

  • Hello, My website is in English by default, with Spanish in a subfolder. Because of my Joomla platform, Google is listing hundreds of soft-404 links for French, Chinese, German, etc. subfolders. Again, I never created these country subfolder URLs, but Google is crawling them. Is it best to just "Disallow" these subfolders as in the example below, then "mark as fixed" in the crawl errors section of Google Webmaster Tools?
    User-agent: *
    Disallow: /de/
    Disallow: /fr/
    Disallow: /cn/
    Thank you, Shawn

    | Shawn124
    0

  • Hello, I have a hosting company that partnered up with a blogger template developer, allowing users to download blog templates that placed my footer links sitewide on their websites. Sitewide links, I know, are frowned upon, which is why I went through a rigorous link audit months ago and emailed every webmaster with a "WEBSITENAME.blogspot.com" site three times each to ask them to remove the links. I'm at a point where I have 1,000 sub-users left on the "blogspot.com" domain; I used to have 3,000! Question: When I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1,000 subdomains of "blogspot.com" individually and show Google proof that I emailed each of them, or is it wiser to include just the one domain name (www.blogspot.com) so Google sees just ONE big mistake instead of 1,000? (See the disavow sketch below.) This has been on my mind for a year now and I'm open to hearing your intelligent responses.

    | Shawn124
    0
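
    For reference, both options expressed in disavow-file format (the subdomains are placeholders; note that a single domain: rule for blogspot.com would also cover every other blogspot.com subdomain, including ones that never linked to you):

        # Option 1: list each linking subdomain individually
        domain:user-one.blogspot.com
        domain:user-two.blogspot.com

        # Option 2: one rule covering blogspot.com and all of its subdomains
        domain:blogspot.com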

  • Hi All, I have a client with this website: http://www.e-rustica.com/casas-rusticas It's a Spanish realtor for special (rustic) houses. We want it to rank well for "casas rústicas", which is the correct spelling, and also for "casas rusticas", which is how a lot of people write it. Do you know whether Google treats these two keywords as the same? Even though we've done SEO for "casas rústicas", the site ranks much better for "casas rusticas". Regards,

    | lbenzo_aficiona
    0

  • Dear MOZ Community: In an effort to improve the user interface of our business website (a New York City commercial real estate agency), my designer eliminated a standardized footer containing links to about 20 pages. The new design keeps this footer on the home page, but all other pages (about 600) lose it. The new design does a very good job of eliminating non-essential items; most of the changes remove or shrink unnecessary design elements, and the footer removal is the only change that really affects the link structure. The new design is not launched yet, and I am hoping to receive some good advice from the MOZ community before proceeding. My concern is that removing these links could have an adverse or unpredictable effect on ranking. Last summer we launched a completely redesigned version of the site and our rankings collapsed for 3 months. However, unlike that previous upgrade, this modification does not change URL names, tags, text or any other major element; the only major change is the footer removal. Some of the footer pages provide good (though not critical) info for visitors. Note the footer will still appear on the home page but will be removed from the interior pages. Are we risking any detrimental ranking effect by removing this footer? Can we compensate by adding text links to these pages where the footer links are removed? It seems irregular to have a home page footer but no footer on the other pages. Are we inviting any downgrade, penalty or adverse SEO effect by implementing this? I very much like the new design but do not want to risk a fall in rank and traffic. Thanks for your input!!!
    Alan

    | Kingalan1
    0

  • Hi there, We run an e-commerce website and we are aware of our duplicate page content/title problems. We know about the "rel canonical" tag and the "noindex" tag, but I am more interested in the latter. We use a CMS called Magento, and Magento has an extension that allows you to apply the "nofollow" and "noindex" tags to products. Google has indexed many of our pages, and I wanted to know whether applying the "noindex" tag to duplicate pages will instruct Google to remove the duplicate URLs it has already indexed. I know the tag tells Google not to index a page, but what happens if I apply it to a product that is already indexed? (See the sketch below.)

    | iBags
    0
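
    A sketch of the tag such an extension typically adds; once Google recrawls an already-indexed product URL and sees it, the URL is dropped from the index, though on Google's own recrawl schedule rather than instantly:

        <!-- In the <head> of each duplicate product URL -->
        <meta name="robots" content="noindex, follow">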

  • How quickly should I add products and categories to this new domain? We are going to start promoting it with Google AdWords and Facebook. I am worried about the tens of thousands of product pages. Kindly guide me.

    | innovatebizz
    0

  • If I have a page in English which exists on 100 other websites, then my website has duplicate content. What if I use Google Translate to translate the page from English to Japanese? As the only website doing this translation, will my page get credit for producing original content? Or will Google view my page as duplicate content because it can tell the page is translated from an original English page that runs on 100+ different websites, since Google Translate is Google's own software?

    | khi5
    0

  • Is it dangerous to have your H1 tag and your title be exactly the same thing? My thought was that it may not be the best use of space, but that it couldn't cause harm. What do you think?

    | MarieHaynes
    7

  • On most pages of my site I have a Quick Links section, which contains three cross-sell links to other products, a newsletter sign-up link, a link to the blog, and four image links to surveys, newsletters, feedback etc. Will these links hurt the flow of SEO juice between pages; should the number of internal links be kept to a minimum? My site is www.over50choices.co.uk if that helps. Thanks
    Ash

    | AshShep1
    0

  • When I view a Google cache of our site, I see two menus. Our developers say that's because the second one is the mobile menu; is that correct? When I look at other sites with mobile rendering, they only have one menu visible. Also, GWT shows the number of internal links per page at at least twice what it should be; are the two things connected? Secondly, when I run a spider test through http://tools.seobook.com/general/spider-test/ it shows all the "behind the scenes" text, e.g. font names, portals, sliders, margins; "font size px" is shown 17 times with a density of 2.15%. Surely this isn't correct, as Google will think these are my keywords!? My site is www.over50choices.co.uk Thanks Ash

    | AshShep1
    0

  • I believe our site is being penalized/held back in rankings, and I think this is why... We placed an advert on a website which they didn't make "nofollow", so we had hundreds of site-wide links coming into our site. We asked them to remove the advert, which they did. This was 4 months ago, and the links are still showing in GWMT. We have looked into the pages that GWMT says still link to us, but a number of these pages aren't being indexed by Google, and others aren't being cached. Is it possible that because Google can't find these pages, it can't tell that our link has been removed? And/or are we being penalized for this? Many thanks

    | jj3434
    1

  • Our site, Starcitylimo.com, got destroyed by a pharma-style hack of its WordPress install. After having to rebuild the site from scratch to remove the infection, we found that hundreds or even thousands of pharma links from overseas sites point to the home page (so we can't just 404 the pages). Contacting the sites for removal does nothing. Adding them to the disavow file 4 months ago did nothing. The owner is now on page 5 for every keyword he was previously position 5 or better for. Is this one of those situations where it's time to move on to a new domain?

    | iAnalyst.com
    0

  • Hello all, This is my first question in the Moz forum; I hope I will get some concrete answers 🙂 I am looking for suggestions on implementing hreflang="x-default" properly on our site. Any previous experience, or a link to a specific resource or example, would be very helpful. I have found many examples of implementing hreflang on the homepage, but nothing on non-homepage URLs within a site. The below would be the code for the "homepage" for /uk/; here /en-INT/ is a global English site not targeted at any country, unlike en-MY, en-SG, en-AU etc. Is this the correct approach? Now, in the case of non-homepage URLs, should the respective en-INT URL be the "x-default", or should the "x-default" not exist at all? For example, would the below be the correct coding? (A generic illustration follows below.) Many thanks Avi

    | Delonghi_Group
    0
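
    Purely as a generic illustration of the pattern being asked about (not the asker's original code), a non-homepage URL's annotations might look like this, with the untargeted /en-INT/ version doubling as x-default (domain, paths and locales are placeholders):

        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/some-page/" />
        <link rel="alternate" hreflang="en-au" href="http://www.example.com/en-AU/some-page/" />
        <link rel="alternate" hreflang="en-sg" href="http://www.example.com/en-SG/some-page/" />
        <link rel="alternate" hreflang="x-default" href="http://www.example.com/en-INT/some-page/" />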

  • Hi mozzers, I have a client whose site recorded 7 errors when generating an XML sitemap. One of the errors appears to be coming from partial URLs, and apparently I would need to exclude them from the sitemap. What are they exactly, and why would they cause an error in the sitemap? Thanks!

    | Ideas-Money-Art
    0

  • Hi, I wanted to know whether finding queries in GWMT that are under-performing (not maximizing their potential), either due to
    A. a good position but low CTR, or
    B. a low-to-medium position,
    and then changing on-page items (titles and meta descriptions for low CTR; added content for a low position) is a good strategy, or whether it is risky and may harm me further (but what do I have to lose...)?

    | BeytzNet
    0

  • Hi Folks, I was having a discussion with a friend and colleague of mine yesterday about the pros and cons of targeting keyword phrases that have very little, if any, search volume. I was of the opinion that if keyword phrases (whether local or not) have no search volume as indicated by Google's Keyword Planner tool, then they have little if any value. Would this be a correct assumption? Or is there merit in targeting these phrases in order to build up a picture of a site's overall subject matter and to help it rank in local search? For example, say there is a phrase like 'second hand clothing slough' (just a random phrase) which has no search volume, but 'second hand clothing' has 2,400 searches a month. Would it be worth targeting the phrase with no volume to build a better local profile, so that if someone in Slough searches for 'second hand clothing' the site shows up for that keyword? Thanks in advance guys! Gareth

    | PurpleGriffon
    0

  • I am looking into purchasing an existing ecommerce site with high SEO rankings. However, the site is old and beyond outdated. The shopping cart would need to be updated (say, to Magento), and an updated shopping cart will force a change in the URLs (from example.com/category.asp to example.com/category). Will the SEO value be lost? Does anyone have any experience with this issue? (See the redirect sketch below.) There is a significant amount of money at stake for me, so I would really appreciate hearing your experiences in this matter.

    | Ywsw
    0
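
    How much value carries over depends heavily on 301-redirecting each old URL to its new equivalent after the replatform; a sketch of the kind of rule involved, assuming Apache and the URL pattern given in the question:

        # Hypothetical rule: example.com/category.asp -> example.com/category
        RewriteEngine On
        RewriteRule ^([A-Za-z0-9-]+)\.asp$ /$1 [R=301,L]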

  • Hi, I have two pages ranking for the same keyword phrase. Unfortunately, the wrong page is ranking higher, and the other page only ranks when you include the omitted results. When a page only shows in the omitted results, is that because the content is too similar in Google's eyes? Could there be any other possible reason? The content really shouldn't be flagged as duplicate, but if that is the only reason, I can change it around some more. I'm just trying to figure out the root cause before I start messing with anything. Here are the two links, if that's necessary: http://www.kempruge.com/personal-injury/ http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/ Best, Ruben

    | KempRugeLawGroup
    0

  • Hi Moz crew. I have two sites (one is a client's and one is mine). They are both WordPress sites and both are hosted on WP Engine. They have both been set up for a long time and are "on-page" optimized. Pages from each site are indexed, but Google is not indexing the homepage for either site. Just to be clear, I can set up and work on a WordPress site, but I am not a programmer. Both sites seem to be fine according to my Moz dashboard. I have Webmaster Tools set up for each, and as far as I can tell (definitely not an expert in Webmaster Tools) they are okay. I have done the obvious and checked that the box preventing Google from crawling is not ticked, and I believe I have set up the proper redirects and canonicals. Thanks in advance! Brent

    | EchelonSEO
    0

  • This is real estate MLS listings related. I have a page "B" with lots of unique content (MLS thumbnails mixed with overview guide writing, pictures etc.) which outranks page "A", which simply shows MLS thumbnails with a map feature included. I am linking from "B" to "A" with the anchor "KEYWORD for sale" to indicate to search engines that "A" is the page I want to rank, even though "B" has more unique content. It hasn't worked so far.
    Questions: Should I avoid linking from "B" to "A", as that could affect how well "B" ranks? Should I leave this setup and hope that over time search engines will give "A" a chance to rank? Should I include some unique content on "A", mostly hidden behind a "Read more" link? (I don't foresee many users clicking "Read more", as they are really just looking for the properties for sale and rarely care about written material when searching for "KEYWORD for sale".) Should I "noindex, follow" A, since it has little to no unique content, and could that improve B's chance of ranking? And when I write blog posts that include "KEYWORD for sale", should I link to "A" (best for users) or to "B", since that page has more potential to rank really well and is still fairly good for users? "B" ranking is not creating a large bounce rate; it's just that "A" is even better for users. Thank you,
    Kristian

    | khi5
    0

  • I have a couple of sites that have these, and I have done a lot of work to get the links removed, but there seems to be very little if any benefit from doing so. In fact, sites where we have done nothing after these penalties seem to be doing better than ones where we have done link removal and filed a reconsideration request. Google says: "If you don't control the links pointing to your site, no action is required on your part. From Google's perspective, the links already won't count in ranking. However, if possible, you may wish to remove any artificial links to your site and, if you're able to get the artificial links removed, submit a reconsideration request. If we determine that the links to your site are no longer in violation of our guidelines, we'll revoke the manual action." I would guess a lot of people with this penalty don't even know they have it, and it sounds like leaving it alone really doesn't hurt your site. It seems to me that simply ignoring this and building better links and higher-quality content should help improve your site's rankings more than worrying about trying to get all these links removed/disavowed. What are your thoughts? Is it worth trying to get this manual action removed?

    | netviper
    0

  • I am adding schema.org markup for some clients and I am running into an issue. Specifically, the site I am working on uses a child theme for Genesis 2.0, and Genesis 2.0 adds schema.org markup by default to all pages. If I want to change the markup on the home page to LocalBusiness, should I remove ALL the other schema.org markup for the website (marking nav as nav, header as header, etc.)? Or can I leave the WebPage markup in place and just add the LocalBusiness markup as well? (See the sketch below.) When I look at it in the testing tool in GWMT, it just shows the WebPage markup, not the LocalBusiness markup. You can take a look here: http://blueskyrestoration.com. Thanks in advance for your help!

    | farlandlee
    0
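
    Genesis 2.0 adds its markup as microdata attributes (itemscope/itemtype) on the page's structural elements, and a LocalBusiness block can generally sit alongside that rather than replacing it. A minimal, hypothetical microdata sketch with placeholder business details:

        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Business Name</span>
          <span itemprop="telephone">(555) 555-1234</span>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="streetAddress">123 Example St</span>,
            <span itemprop="addressLocality">City</span>,
            <span itemprop="addressRegion">ST</span>
          </div>
        </div>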

  • Hi, We made some drastic changes removing links (mainly resulted from one domain) and wondered when we should expect a change in the incoming links report of Google's WMT...? Thanks

    | BeytzNet
    0

  • Hello Moooooooooooooz! I could not sleep yesterday because of an SEO nightmare! So I came up with the following question: "Is it better to release regular new articles or to update the existing ones?" Let me explain. Our company releases regular price lists (every month a new price list becomes available for a month, for the same brands, e.g. a January price list for brand A, etc.). Right now those price-list pages are ranking well on Google. So I wondered which would be better:
    1. Make the existing price-list article stronger: "Our company - Brand A pricelist" (title), blog/offer/brand-A-pricelist.html (URL); every month I update the text, so I have just one article/link to work on.
    2. Create new content for each price list: "Our company - Brand A pricelist - January 2014" (title), blog/offer/brand-A-pricelist-january.html (URL); so Google keeps indexing fresh new content.
    3. Work within an extra category: "Our company - Brand A pricelist - January 2014" (title), blog/offer/brand-A/pricelist-january.html (URL); so I build up one path, blog/offer/brand-A, where Google finds lots of new relevant content.
    I know Matt Cutts said it's good to update an old article, but this case is a bit different. Has anyone experimented with the same thing? Thanks a lot!

    | AymanH
    0

  • For those of you with algorithmic penalties: how long after making changes did you actually see an improvement, or have you ever? I have several sites that tanked after Penguin 2.1, and after doing link removal, disavow files, building newer, higher-quality links and adding content, I am STILL not seeing any change in rankings after several months. I have heard from some people that it can take up to 6 months for Google to even crawl a disavow file. I have also heard that no matter what you do, it won't matter until Google runs another update. I feel like we have made a lot of changes in the right direction, but I don't want to go overboard if nothing is going to matter until another Google update happens. What are your experiences?

    | netviper
    0

  • Bruce Clay and others did some research and found that the first link on the page is the most important one and is the link that gets credited; any other links on the page to the same URL count for nothing. Is this still true? And if so, on an ecommerce site with category links in the top navigation (which sits high in the code), is it pointless to link to categories in the content of the page, because the category is already linked to on that page? Thank you, Tyler

    | tylerfraser
    0

  • Is it OK to add a keyword alongside the business name in Google Places (without having it in the domain name) when we create a Google listing for a new business? Please view the attached image. Source - http://moz.com/blog/top-20-local-search-ranking-factors-an-illustrated-guide (13. Product/service keyword in business title) dS7FWJL.png

    | Dan_Brown1
    0

  • Quick question with probably a straightforward answer... We created a new page on our site 4 days ago, it was in fact a mini-site page though I don't think that makes a difference... To date, the page is not indexed and when I use 'Fetch as Google' in WT I get a 'Not Found' fetch status... I have also used the'Submit URL' in WT which seemed to work ok... We have even resorted to 'pinging' using Pinglar and Ping-O-Matic though we have done this cautiously! I know social media is probably the answer but we have been trying to hold back on that tactic as the page relates to a product that hasn't quite launched yet and we do not want to cause any issues with the vendor! That said, I think we might have to look at sharing the page socially unless anyone has any other ideas? Many thanks Andy

    | TomKing
    0

  • I have ecommerce sites that only serve the US and Canada. Is there a way to prevent a site from appearing in Google results in foreign countries? The reason I ask is that we also have a lot of informational pages that folks in other countries are visiting and then leaving right after reading. This is making our overall bounce rate very high (64%); when we segment the GA data to look at just our US visitors, the bounce rate drops a lot (to 48%). Thanks!

    | GregB123
    0

  • Please let me know how to add Geo Meta Tags, Dublin Core and Microformats, and what to include in them. (See the sketch below for geo meta tags.)

    | Dan_Brown1
    0
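
    As a reference for the first part of the question, geo meta tags and Dublin Core are plain meta elements in the page head; a sketch with placeholder values for a New York location (major search engines are generally reported to ignore geo meta tags, so treat them as optional extras):

        <meta name="geo.region" content="US-NY">
        <meta name="geo.placename" content="New York">
        <meta name="geo.position" content="40.7128;-74.0060">
        <meta name="ICBM" content="40.7128, -74.0060">
        <meta name="DC.title" content="Page title here">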

