
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Can anyone point me to the best way to implement 301 redirects on a Ruby on Rails website?

    | brianvest
    0
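
    For illustration, a minimal Rails sketch of the two usual approaches (the "/old-page" and "/new-page" paths are placeholders, not from the question); route-level redirects return a 301 by default in recent Rails versions:

```ruby
# config/routes.rb - a hedged sketch; paths below are placeholders
Rails.application.routes.draw do
  # Single route: 301 Moved Permanently
  get "/old-page", to: redirect("/new-page", status: 301)

  # Pattern-based redirect for a renamed section, interpolating the slug
  get "/old-section/:slug", to: redirect("/new-section/%{slug}", status: 301)
end
```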

  • I want to duplicate the main site.com and serve Asia from a different datacenter. The content is the same, but the domain will be site.asia. How do I properly tag this to avoid duplicate content?

    | Pandjarov
    0
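
    For illustration only, a hedged sketch of hreflang annotations, one common approach for regional mirrors of the same content; the /some-page path and the en-sg region code are placeholders (there is no pan-Asia hreflang code), not details from the question:

```html
<!-- On https://site.com/some-page (placeholder path) -->
<link rel="canonical" href="https://site.com/some-page" />
<link rel="alternate" hreflang="en" href="https://site.com/some-page" />
<link rel="alternate" hreflang="en-sg" href="https://site.asia/some-page" />
<link rel="alternate" hreflang="x-default" href="https://site.com/some-page" />

<!-- The mirrored page on https://site.asia/some-page would carry the same
     hreflang set plus its own self-referencing canonical. -->
```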

  • Our website is in the midst of a massive content enrichment project - we're moving from mostly catalog content to optimized web content. Our catalog and copy teams are hoping to include more product comparisons on the web (e.g. "unlike composite basketballs, rubber ones are more X..."), which can certainly provide useful information to our shoppers! However, from an SEO standpoint, we seem to have confused search engines when doing this in the past (i.e. the example above is currently ranked for a "composite basketball" term, not a rubber one). So... What is the best way to provide useful product comparisons without confusing search engines?

    | laurenf
    0

  • I have an e-commerce client who sells shoes. There is a main page for "Kids" shoes, and then right under it on the top-navigation bar there is a link to "Boys Shoes" and "Girls Shoes." All 3 of these links are on the same level - 1 click off the home page. (And linked to from every page on the website via the top nav bar). All 3 are perfectly optimized for their targeted term. However, when you search for "boys shoes" or "girls shoes" + the brand, the "Kids" page is the one that shows up in the #1 position. There are sitelinks beneath the listing pointing to "Girls" and "Boys." All the other results in Google are resellers of the "brand + girls" or "brand + boys" shoes. So our listing is the only one that's "brand + kids shoes." Our "boys" shoes page and "girls" shoes page don't even rank on the 1st page for "brand + boys shoes" or "brand + girls shoes." The only real difference is that "kids shoes" contains both girls and boys shoes on the page, and then "boys" obviously contains boys' shoes only, "girls" contains girls' shoes only. So in that sense there is more content on the "kids" page. So my question is - WHY is the kids page outranking the boys/girls page? How can we make the boys/girls pages be the ones that show up when people specifically search for boys/girls shoes?

    | FPD_NYC
    0

  • I'm looking for advice on how to handle my product description pages for my website vinylabs.com. The website sells vinyl wrap for cars, and each color of vinyl (89 variations) has its own product page. The product descriptions will all be identical except for the color description and code. All of our competitors have an identical layout, different pages for each color, and it fits the product, so I don't want to depart from featuring each color as its own page. Here is my dilemma. I don't want to get penalized for duplicate content; however, I do want individual color codes to be searchable on Google. For example, if you google 3M vinyl wrap M203 you'll get individual pages from the manufacturer and our competitors featuring just that color. I want our website to show up as well. I was thinking about creating a single page that has selectable colors and sizes and then using the canonical tag to point all of my individual color code pages to that single page. However, won't that hurt the ability of my individual color code pages to show in search? None of my competitors are using the canonical tag to redirect to a different page. Any advice welcome! Thank you for your time.

    | vinylabs
    1
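
    A hedged sketch of the two options being weighed (the URLs below are hypothetical): a self-referencing canonical keeps each color page eligible to rank for its color code, whereas canonicalizing every color to one selector page generally consolidates those pages out of the results.

```html
<!-- Option used by most competitors: each color page canonicals to itself -->
<!-- On a hypothetical page https://vinylabs.com/3m-vinyl-wrap-m203 -->
<link rel="canonical" href="https://vinylabs.com/3m-vinyl-wrap-m203" />

<!-- The alternative being considered: every color page points at one selector page,
     which would typically drop the individual color URLs from search -->
<link rel="canonical" href="https://vinylabs.com/3m-vinyl-wrap" />
```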

  • We have 2 domains: revolve.com and fwrd.com (unrelated to each other, but hosted on the same server). If you do a site search for revolve.com but enter a designer brand that is only carried on FWRD (not on Revolve), the domain "revolve.com" pops up in the SERP, which is redirected to FWRD.com. Ex. https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Awww.revolve.com isabel marant Why is Google indexing the revolve.com pages, which don't actually exist? Thanks.

    | ggpaul562
    0

  • For a site that does NOT want a separate subdomain, directory, or TLD for a country/language, would the directly translated page (static) content/meta be duplicate? (NOT considering a translation of the term/acronym, which could exist in another language) i.e. /SEO-city-state in English vs. /SEO-city-state in Spanish - in this example, a term/acronym that is the same in any language. Outside of duplicate content, are there other potential ranking conflicts you can think of?

    | bozzie311
    0

  • Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of it was in the initial backup, i.e. before 1st-June-2012. So, by removing all mixed content prior to that date, we can have pure articles starting June 1st, 2012! Therefore, my dynamic sitemap now contains only articles with a release date between 1st-June-2012 and now. Any article with a release date prior to 1st-June-2012 returns a custom 404 page with a "noindex" metatag, instead of the actual content of the article. The question is how I can remove from the Google index, as fast as possible, all this junk that is no longer on the site but still appears in Google results. I know that for individual URLs I need to request removal from this link:
    https://www.google.com/webmasters/tools/removals The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl the sitemap and see all the 404s? I believe this is very wrong. As far as I know, this will cause problems because search engines will try to access non-existent content that is declared as existent by the sitemap, and return errors in Webmaster Tools. Should I submit a DELETED ITEMS SITEMAP using the <expires> tag? I think this is for custom search engines only, and not for the generic Google search engine.
    https://developers.google.com/custom-search/docs/indexing#on-demand-indexing The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles) are of the form:
    http://www.example.com/docid=123456 So, how can I bulk-remove all this junk from the Google index... relatively fast?

    | ioannisa
    0

  • Hi there, The scenario is this: we have been working on a rebrand and have changed the company name. So, we want to redirect www.old-name.com to www.new-name.com. However, the parent company is retaining the old brand name for corporate purposes. So, in an ideal world, we'd be able to keep www.old-name.com active - but clearly that would sacrifice all of the authority built up over the years, so we do have to redirect the main www. subdomain in its entirety. However - one suggested solution is to redirect www.old-domain.com to www.new-domain.com... but then create a new corporate subdomain: for example, business.old-domain.com. business.old-domain.com will not be competing with the new site on any service/product related terms; it will only need to appear in SERPs for the company name. I'd appreciate some thoughts on this, as I've not done this before or found any examples of anyone that has. Is that a massive risk in terms of sending a confusing message to Google? Thanks for your help

    | edlondon
    0

  • I've been trying to figure out why my site www.stephita.com has lost its Google ranking over the past few years. I had originally thought it was due to the Panda updates, but now I'm concerned it might be because of the Penguin update. It's hard for me to pinpoint, as I haven't been actively looking at my traffic stats in recent years. So here's what I just noticed. In Google Search Console - Links to your Site, I discovered there are 301 linking domains, of which over 75% seem to be spammy. I didn't actively create those links. I'm using the Moz Open Site Explorer tool to audit my site, and I noticed there is a smaller set of LINKING DOMAINS, at about 70 right now. Is there a reason why Moz wouldn't necessarily find all 300 domains? What's the BEST way to clean this up? I saw there's a DISAVOW option in Google Search Console, but it states it's not the best way, as I should be contacting the webmasters of all the domains - and I assume it's impossible to get a real person on the other end to REMOVE these link references. HELP! 🙂 What should I do?

    | TysonWong
    0
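
    If disavow does end up being the route, the file Google Search Console accepts is plain text: one domain or URL per line, with # comments. A hedged sketch with placeholder domains only:

```text
# disavow.txt - placeholder entries, uploaded via the Search Console disavow tool
# Domains we contacted but could not get links removed from
domain:spammy-directory-example.com
domain:link-farm-example.net

# A single specific URL
http://blog-spam-example.org/bad-page.html
```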

  • Hi all, I represent a hosting company which has thousands of domain names that are parked for clients until they start using them. Currently we present the client and visitors with information about the situation at the top of the page, and we have placed information about all the main products at the bottom of the page. You can see an example here:
    http://prodesign.no/ Would you recommend utilizing these pages in a better way than how we are doing it today (SEO-wise, towards our own website)? We have the ability to instantly change all of these pages at once, and we are also able to present different pages for every single parked domain name if we want to. Best regards,
    Jon

    | proisp-no
    0

  • With the deprecation of Freebase, we're moving some of our data to Wikidata. One of the identifiers (and signals for a Knowledge Graph placement) is your Crunchbase Organization ID. However, I can't find any reference to this number on our company's Crunchbase profile. There's an application ID in the source code, but it seems to be a different number length than other Org ID examples I've seen. Anybody have experience and know where I can find this?

    | MattCommonBond
    0

  • Hello! I am located in NYC and am actively searching for a company or individual to take control of my SEO. I have attempted to do this on my own, and it is just too time consuming and technical for me to handle while also running my business. I have been burned by SEO companies in the past, so I am looking for someone reputable that can be trusted. I am in the service industry in an extremely competitive market. I am hoping I can find some recommendations on here for who might be able to help me grow my online presence. My websites currently rank for several keywords as well as in local packs. I am looking to expand on that and maintain current standings. Please help!

    | scohen86
    2

  • Over lunch with our head of development, we were discussing the way CloudFlare and other CDNs help prevent DDoS attacks, etc., and I began to wonder about the IP address vs. the reverse proxy IP address. Previously, we would look for commonalities in IP addresses as a signal search engines might use to modify the value given to links, and most link software showed this. For Ahrefs, I know they still show common IPs using the C block as the reference point. I began to get curious about what the real IP was when our head of dev said that is the IP from CloudFlare... So, I ran a site in Ahrefs and we got an older site we had developed years ago that showed up as follows: Actos-lawsuit.org 104.28.13.57 and again as 104.28.12.57 (a duplicate C block means the first three sets of numbers are the same, and obviously this has a .12 and a .13, so not a duplicate). Then we looked at our host to see what IP was shown there: 104.239.226.120. So this really raises the question: is C block data, or even IP address data, still relevant with regard to links? What do the search engines see when they look for the IP address now? Yes, I have an opinion, but would love to hear yours first!

    | RobertFisher
    0

  • I have an ecommerce site that is fully built out with thousands of products. I own many industry-related domains for the products that I sell. Many of these domains are sitting unused. I started to think that it would be beneficial if I 301 redirected (at the registrar level) these domains to their SPECIFIC subcategories on my main money site. For example, I sell sporting goods and my main website is buysportinggoods.com. I also own the following domains: basketballoutlet.com & baseballequipmentstore.com & footballpads.com. Would it be wise or foolish (and potentially cause a Google penalty) if I did the following: Point basketballoutlet.com to buysportinggoods.com/basketballs Point baseballequipmentstore.com to buysportinggoods.com/baseball Point footballpads.com to buysportinggoods.com/football Please let me know your thoughts or experiences with similar situations. Thanks!

    | Prime85
    0
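
    Purely as illustration of the mechanics (not a recommendation either way): if one of the extra domains points at an Apache host you control rather than relying on simple registrar forwarding, the 301 might be a sketch like this, using the basketballoutlet.com example from the question:

```apache
# .htaccess / vhost sketch on the server answering for basketballoutlet.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?basketballoutlet\.com$ [NC]
RewriteRule ^(.*)$ https://buysportinggoods.com/basketballs [R=301,L]
```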

  • Bazaarvoice provides a pretty easy-to-use product review solution for websites (especially sites on Magento): https://www.magentocommerce.com/magento-connect/bazaarvoice-conversations-1.html If your product has over a certain number of reviews/questions, the plugin cuts off the number of reviews/questions that appear on the page. To see the reviews/questions that are cut off, you have to click the plugin's next or back function. The next/back buttons' URLs have a parameter of "bvstate....." I have noticed Google is indexing this "bvstate..." URL for hundreds of sites, even with the proper rel canonical tag in place. Here is an example with Microsoft: http://webcache.googleusercontent.com/search?q=cache:zcxT7MRHHREJ:www.microsoftstore.com/store/msusa/en_US/pdp/Surface-Book/productID.325716000%3Fbvstate%3Dpg:8/ct:r+&cd=2&hl=en&ct=clnk&gl=us My website is seeing hundreds of these "bvstate" URLs being indexed even though we have a proper rel canonical tag in place. It seems that Google is ignoring the canonical tag. In Webmaster Console, the main source of my duplicate titles/metas in the HTML improvements section is the "bvstate" URLs. I don't necessarily want to block "bvstate" in the robots.txt, as it will prohibit Google from seeing the reviews that were cut off. The same goes for prohibiting Google from crawling "bvstate" in the Parameters section of Webmaster Console. Should I just keep my fingers crossed that Google honors the rel canonical tag? Home Depot is another site that has this same issue: http://webcache.googleusercontent.com/search?q=cache:k0MBLFcu2PoJ:www.homedepot.com/p/DUROCK-Next-Gen-1-2-in-x-3-ft-x-5-ft-Cement-Board-172965/202263276%23!bvstate%3Dct:r/pg:2/st:p/id:202263276+&cd=1&hl=en&ct=clnk&gl=us

    | redgatst
    1

  • Hello, We've currently got 9,500 products live on our site, with ~2,000 in the category that we're adding the new products to. All of the products we're adding come from another site that we own; we're trying to expand the range on our site (the 9,500-product site has a lot more visitors than the 4,000-product site). However, I believe all these imported products are at least duplicates from the 4,000-product site, and the first ones I have seen (500) are manufacturer duplicates. What issues are we potentially going to run into? Just for extra information: we have no control over canonical/noindex/robots etc.

    | ThomasHarvey
    0

  • Please check the URLs below and let me know which algorithm hit my website. I collected the data from 3 different tools. http://i.imgur.com/ImljFmO.png http://i.imgur.com/aWxqOdj.png http://i.imgur.com/PqhhruN.png

    | Michael.Leonard
    0

  • Lately we have been applying structured data to the main content body of our client's websites. Our lead developer had a good question about HTML, however. In JSON-LD, what is the proper way to embed content from a data field that has HTML markup (i.e. p, ul, li, br tags) into mainContentOfPage? Should the HTML be stripped out or escaped somehow? I know that applying schema to the main body content is helpful for the Googlebot. However, should we keep the HTML? Any recommendations or best practices would be appreciated. Thanks!

    | RosemaryB
    0
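
    A hedged sketch, not an official Google recommendation: since JSON-LD string values are never parsed as HTML, one common approach is to strip the markup down to plain text and let normal JSON escaping handle quotes and line breaks. The page name and body text below are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Example service page",
  "mainContentOfPage": {
    "@type": "WebPageElement",
    "text": "Plain-text version of the body copy, with the p, ul and li markup stripped out. Double quotes must be backslash-escaped and newlines written as \\n to keep the JSON valid."
  }
}
```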

  • Hello Moz Community, Very recently I've started working with a national branch of a global company. The way they have their website set up is that you must pick one of the national branches from the main home page. From there, you'll get content related to that country (I'm in Canada, so naturally, I'm working on the Canadian side). The question I have is more about URL structure. Because I can only influence the national side, would it make sense to put it on its own subdomain, or is this older SEO thinking? Example:
    Currently: reallylongdomain.com/global/ca/pages/page-title.aspx
    Recommended: canada.reallylongdomain.com/page-title Thanks, 
    Dan

    | dn_nicholson
    0

  • Hi, Let me first say that I really like the tool DeepCrawl. So, not busting on them. It's more that I'm interested in the relative importance of two items they flag as "Issues." Those items are "Incomplete Open Graph Tags" and "No Valid Twitter Cards." They call this out on every page. To define it a bit further, I'm interested in the importance as it relates to organic search. I'm also interested in whether there's some basic functionality we may have missed in our Share42 implementation. To me, it looks like the Share42 social sharing buttons we use are quite functional. If it would help, I could private message you an example URL. Thanks! Best... Mike

    | 94501
    1
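
    For reference, the tags DeepCrawl is checking for look roughly like this (all values are placeholders); they are generally understood to affect how shares render on social networks rather than organic rankings directly:

```html
<!-- Placeholder values - a minimal Open Graph + Twitter Card set -->
<meta property="og:title" content="Example page title" />
<meta property="og:description" content="Short summary of the page." />
<meta property="og:url" content="https://www.example.com/page" />
<meta property="og:image" content="https://www.example.com/images/share.jpg" />

<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Example page title" />
<meta name="twitter:description" content="Short summary of the page." />
<meta name="twitter:image" content="https://www.example.com/images/share.jpg" />
```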

  • Hi guys, I'm putting together a proposal for a new site and trying to figure out whether it'd be better to (A) have the keyword split across multiple directories or (B) duplicate part of the keyword so the full phrase appears hyphenated in the URL. For example, for the topic of "Christmas decor" would you use: (A) - www.domain.com/Christmas/Decor (B) - www.domain.com/Christmas/Christmas-Decor In example B the word 'Christmas' is duplicated, which looks a little spammy, but the key term "Christmas decor" is in the URL without being broken up by directories. Which is stronger? Any advice welcome! Thanks guys!

    | JAR897
    1

  • So I'm having a conversation with the development team at my work and I'm a little tired today, so I thought I would ask for other opinions. Currently the site duplicates its full content by returning a 200 both with and without a trailing slash. I have asked for a 301 redirect to the version with the trailing slash. They countered with having all the rel=canonicals point to the trailing-slash version, which I know is acceptable. My issue is that while a rel=canonical is acceptable, since my site has a very high level of competition and a very aggressive link building strategy, I believe it may be beneficial to have the 301 redirect. BUT, I may be wrong. When we're talking hundreds of thousands of links, I would love to have them directly linked instead of possibly split between a duplicate page that has a correct canonical. I'm curious what everyone thinks, though...

    | mattdinbrooklyn
    1
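
    For reference, the 301 being requested is a small rewrite rule if the site runs on Apache - a sketch under that assumption (Nginx or application-level routing would differ):

```apache
# Sketch: 301 any URL that is not a real file and lacks a trailing slash
# to the slashed version
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```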

  • Hey there, I have a website that shows as a .com.au in the SERPs but redirects to .com when you click on it. Is that OK in terms of SEO, and if not, why not? Kind regards

    | AngelosS
    0

  • I need help creating a robots.txt file. Please let me know what to add to the file. Any real or working example?

    | Michael.Leonard
    0
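
    Since a working example was asked for, here is a minimal, hedged robots.txt sketch - every path below is a placeholder to swap for your own site's private or thin sections, and the Sitemap line should point at your real sitemap URL:

```text
# robots.txt - placeholder example
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /search/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```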

  • Can we use domain masking/URL masking? How does Google see this? Original domain - http://mstylecrazy.com | Masked domain - Bestupforyou.com. Does this also create duplicate content? And does it invite a Google penalty?

    | Michael.Leonard
    1

  • We've ranked #1 for our top-priority keyword for years; suddenly we're #2. And the page listed above us doesn't even seem to compete. Moz On-Page grades us an A and them a C. When I review the HTML, I don't even see an exact keyword match or matching text on the page. I checked our ranking last week and didn't notice any change - so I've narrowed it down to something changing in the last 4-5 days. Also of note, when I test, we're #1 on mobile, #2 on desktop. Sorry not to list the URLs - omitted intentionally. Any thoughts would be much appreciated.

    | FX4nWOO
    0

  • I address a two-sided market: consumer research and school fundraising. Essentially, parents answer research surveys to generate proceeds for their school. My site will have a landing page at www.centiment.co that directs users to two different sub-landing pages, one related to research and one related to school fundraising. I am going to create two blogs and I am wondering if I should run them off one installation of wordpress.org or two. The goal here is to optimize SEO. Separate URL paths by topic are clean, but they require two installations of wordpress.org: www.centiment.co/research/blog www.centiment.co/fundraising/blog If I were to use one installation of wordpress it would be www.centiment.co/blog and then I would have a category for fundraising and a category for research. This is a little simpler. My concern is that it will confuse Google and damage my SEO, given general blog posts about fundraising are far different than those about research. Any suggestions? Again, I don't want to compromise my SEO as I'm creating a blog to improve my SEO. Any insights are much appreciated. Thank you!
    Kurt

    | kurtw14
    0

  • So here is the website I'm looking at; it ranks #1 for keywords like used cars for sale billings mt, etc. I was trying to figure out how, because there is no content on the page! I am working on one of our sites to get it to rank better when I found this in my research. #1 http://prntscr.com/aoy0ho So I did a "view page source" to see how many times they're using keywords and what their title and description tags are. #2 http://prntscr.com/aoy0w1 WAIT WHAT… WHERE IS THIS CONTENT?! #3 http://prntscr.com/aoy13o Then I found it… #4 http://prntscr.com/aoy1e8 #5 http://prntscr.com/aoy1o8 #6 http://prntscr.com/aoy1u1 It doesn't even read like real content. This has to be considered poor form. I'm not sure why it makes me so angry. What do you guys think?

    | rachaelpracht
    1

  • Hi, We had a content manager request to delete a page from our site. Looking at the traffic to the page, I noticed there were a lot of inbound links from credible sites. Rather than deleting the page, we simply removed it from the navigation, so that a user could still access the page by clicking on a link to it from an external site. Questions: Is it bad for SEO to have a page that is not directly accessible from your site? If no: do we keep this page in our Sitemap, or remove it? If yes: what is a better strategy to ensure the inbound links aren't considered "broken links" and also to minimize any negative impact to our SEO? Should we delete the page and 301 redirect users to the parent page for the page we had previously hidden?

    | jnew929
    0

  • Hi there, I have a client who changed domain names back in November 2015 but is still coming up in search engines with their old domain name, not their new one. For example, I search for my client's name; let's call them Example B. So I search for "Example B" and within the search results they come up top and the title tag is correct, as it says something along the lines of "Welcome to Example B". However, the URL underneath is actually their old name, which is Example A. When you click on the link, it redirects over to the new name so that's fine, but it's just annoying that Example A is still appearing when it should be Example B now. I don't think they have a new Webmaster Tools account set up for their new domain (I still need to check), but they do still have their old one set up. Is there something I can do within Webmaster Tools to tell it that Example A is now gone and to start indexing and referring to them as Example B? What else should I do to make sure their new name is coming up, not their old one anymore?

    | Virginia-Girtz
    1

  • Hi, Wondering if we need to worry about IP canonicalization via .htaccess and whether this is really required? And would it have a big impact?

    | Cocoonfxmedia
    0
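
    If it is implemented, the .htaccess rule is short - a sketch assuming Apache, with a documentation IP (203.0.113.10) and placeholder domain standing in for the real values:

```apache
# Sketch: 301 requests that arrive addressed to the bare IP onto the canonical hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```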

  • Hi Moz community! I'm posting a new question here as I couldn't find a specific answer to the case I'm facing. Along with canonical tags, we are implementing meta robots on our pages (e-commerce website with thousands of pages). Most of the cases have been covered, but I still have one unanswered case: our products are linked from list pages (mostly categories), but those links almost always include a tracking parameter (i.e. /my-product.html?ref=xxx). Product URLs are secured with a canonical tag (referring only to the clean URL /my-product.html), but what would be the best solution regarding the meta robots? For now we opted for a meta robots 'noindex, follow' on non-canonical URLs (so the ones unfortunately linked from our category/list pages), but I'm afraid that it could hurt our SEO (apparently no juice is given from URLs with a noindex robots), and maybe even prevent bots from crawling our website properly... Would it be best to have no meta robots at all on these product URLs with parameters? (We obviously can't have 'index, follow' when the canonical ref points to another URL!) Thanks for your help!

    | JessicaZylberberg
    0
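
    A sketch of the setup being described, using the /my-product.html?ref=xxx example from the question (the domain is a placeholder); note that pairing noindex with a canonical that points elsewhere is often described as sending Google mixed signals, which is the crux of the dilemma:

```html
<!-- Served on /my-product.html?ref=xxx (the tracking-parameter version) -->
<link rel="canonical" href="https://www.example.com/my-product.html" />

<!-- The option currently in place, which may conflict with the canonical above -->
<meta name="robots" content="noindex, follow" />
```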

  • If I have an eCommerce website containing 10,000 product pages and then I add 10,000 new product pages using a bulk upload (with limited/basic but unique content), does this pose any SEO risk? I am obviously aware of the risks of adding a large amount of low-quality content to the website, which is not the case here. However, what I am trying to ascertain is whether simply doubling the number of pages in itself causes any risk to our SEO efforts. Does it flag to the search engines that something "spammy" is happening (even if it's not)?

    | DHS_SH
    0

  • I have previously searched the forum and could not find a definitive answer on this subject, so I would appreciate any guidance. I have just joined a new company; we have a .co.uk site which gets lots of traffic. We have a .com site which is targeting the USA and .com/de/ targeting Germany. 'hreflang' is configured on the .com (between the USA and German sites) but not on the .co.uk. This means that in the eyes of search engines (and Moz Pro) the 2 domains are competitors (and the .co.uk has much more presence than the .com in the USA). I know how to fix this and I am in the process of doing so. My question is whether it would make sense to migrate the .co.uk site to .com. As previously mentioned, the .co.uk site already does very well both in the UK and around the world (as our product is well known in our niche). As .co.uk can only primarily be targeted to the UK, would our global reach increase enough to justify migrating it to .com? We have dealers/distributors in maybe 30 countries and are continuing to expand; we will at some point add additional languages, so my suggestion is that we migrate now, as the authority of the .co.uk will help the emerging markets as well as increase our visibility in markets that are not currently primary targets. We are also in the process of hiring new staff specifically to focus on content marketing. So again this suggests having the one domain will make sense in the long run (as any value gained from content marketing success will be seen by all country/language-focused sites). I am also planning to rebuild the sites in the next few months as the current ones are not fit for purpose, so the migration would coincide with this (I know this is not ideal). Apologies for the lengthy question; I hope the additional background information will help in providing some feedback to help me make the decision. David

    | JamesCrossland
    0
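
    For the hreflang fix mentioned, the annotations would need to reference all three variants reciprocally on each page - a hedged sketch with placeholder domains and paths:

```html
<!-- Placed on every variant of the page (placeholder URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```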

  • My client and I have a problem: an ecommerce store with around 20,000 products has nearly 1,000,000 pages indexed (according to Search Console). I frequently get notified by messages saying "High number of URLs found" in Search Console. It lists a lot of sample URLs with filters and parameters that are indexed by Google, for example: https://www.gsport.no/barn-junior/tilbehor/hansker-votter/junior?stoerrelse-324=10-11-aar+10-aar+6-aar+12-aar+4-5-aar+8-9-aar&egenskaper-368=vindtett+vanntett&type-365=hansker&bruksomraade-367=fritid+alpint&dir=asc&order=name If you check the source code, there's a canonical tag telling the crawler to ignore everything after the "?" (or, technically, commanding it to regard this exact page as another version of the page without all the parameters). Does this URL showing up in the Search Console message mean that this canonical isn't working properly? If so: what's wrong with it? Regards,
    Sigurd

    | Inevo
    0

  • Hello all, We want to block something that has the following at the end: http://www.domain.com/category/product/some+demo+-text-+example--writing+here So I was wondering if doing: /*example--writing+here would work?

    | ThomasHarvey
    0
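
    For what it's worth, Google's robots.txt parsing treats * as a wildcard and $ as an end-of-URL anchor, so a pattern close to the one proposed should work. A hedged sketch - the trailing $ is the detail worth noting, since without it the rule matches the string anywhere in the URL:

```text
User-agent: *
# Blocks any URL that ends with the quoted string from the question
Disallow: /*example--writing+here$
```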

  • I am currently refreshing my WordPress business website. I used a theme that had a built-in portfolio option. I wanted to strip down the bloat and move to something more simple to better articulate my message. Upon switching themes I will lose my URLs for my portfolio projects. I should never have used this built-in function, but it did exactly what I needed and wanted. Here is an example: http://silvernailwebdesign.com/portfolio-view/central-jersey-claims-association-wordpress-consulting/ Now on my staging site these portfolio pieces have vanished, and the URLs are indexed with Google. I could create posts and recreate the portfolio pieces; however, the problem with the URL is the /portfolio-view/ portion - I cannot recreate that part of the URL. Any advice would be greatly appreciated. I receive some traffic through the portfolio pages, but not much; however, I do not want to lose any traffic. I am looking for a strategy that will solve this URL issue with WordPress. I have about 10 separate portfolio pages with this URL issue.

    | donsilvernail
    0
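
    One commonly suggested mechanic (a sketch, not a definitive fix): 301 the whole /portfolio-view/ prefix to wherever the pieces are recreated. The /portfolio/ target below is an assumption and would need to match the new URLs:

```apache
# .htaccess sketch (Apache): map the old built-in portfolio URLs onto a new prefix
# e.g. /portfolio-view/central-jersey-claims-association-wordpress-consulting/
#   -> /portfolio/central-jersey-claims-association-wordpress-consulting/
RedirectMatch 301 ^/portfolio-view/(.*)$ /portfolio/$1
```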

  • Hi guys, Recently Neil Patel did an analysis on the link type % for his blog. Link: https://www.quicksprout.com/2016/03/28/the-blueprints-the-exact-links-your-blog-needs-to-rank-like-quick-sprout/?utm_source=email&utm_medium=email&utm_campaign=email I was wondering if anyone knows of any other good studies on this topic? Cheers.

    | jayoliverwright
    0

  • Capitalization of the first letter of each word in the meta description catches more attention, but might this lead to Google ignoring the meta description more frequently? Same for an occasional capitalized FREE in the meta description. Anybody have experience with this?

    | lcourse
    1

  • We have pretty strong rankings and have done since we launched our business in 2004. Over the past two months, we have seen strong growth in rankings after a bit of a dip last year; however, over the past week we have seen a significant drop-off in organic traffic. Further analysis has shown that this is coming from mobile/tablet traffic, as the desktop traffic has been fairly consistent. There does appear to be a drop (across all main keywords) on mobile/tablet, but again there is no noticeable drop in desktop rankings. I am obviously aware Google treats these differently, but we can find no obvious reasons why this might have occurred. It is almost as if some sort of penalty has been applied on mobile/tablet only. There are no warnings or obvious problems in Webmaster Tools, and our mobile/tablet site is mobile optimised. At a bit of a loss and would appreciate a bit of guidance on where you think we should start to understand what is causing this and how we should go about correcting it. Thanks in advance for the help.

    | simonukss
    0

  • I'm currently in debate with our 508 compliance team over the use of alt tags on images. For SEO, it is best practice to use alt tags so that readers can tell what the image represents. However, they are arguing that these images should NOT have alt text, as it doesn't add anything for the disability screen reader; the image text would be repetitive with the text on the page. I feel they are taking the "decorative" image concept in 508 compliance too far. Its intention is for images like bullets, etc. that truly are decorative in nature and add no benefit to the reader. What are the community's thoughts on this? Have you ever run into a scenario where 508 is attempting to ruin SEO? Usually the 2 play nicely.

    | jpfleiderer
    0
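
    For context, both camps can usually be satisfied in the markup: informative images get descriptive alt text, genuinely decorative ones get an empty alt attribute so screen readers skip them. A small sketch with placeholder file names:

```html
<!-- Informative image: alt text adds information not already in the surrounding copy -->
<img src="/images/product-angle-rear.jpg" alt="Rear view of the product showing the mounting bracket" />

<!-- Purely decorative image: empty alt, so screen readers ignore it -->
<img src="/images/section-divider.png" alt="" />
```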

  • I only have two questions.... Approximately when did you do it (year is close enough)? Did the rankings of Domain B go up? Any other information that you care to share will be appreciated. Thank you!

    | EGOL
    0

  • Hi everyone, I set up a silo for my page http://werkzeug-kasten.com/ . Unfortunately, only the silo's inner pages rank very well. These are, for example, http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/keyword-analyse/ for "Keywordanalyse SEO Freiburg" and http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/onpage-seo/ for "Onpage SEO Freiburg" ... but the silo's main page http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/ does not rank for "SEO Freiburg". Do you have any idea why that might be? Cheers, Marc

    | RWW
    0

  • My website has more than 3,500 posts. Please let me know what sitemap plugin I should use for the website and what the best practice is for it.

    | Michael.Leonard
    0
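
    Whichever plugin is chosen, the output for a site this size is usually a sitemap index that splits posts into smaller child files, each well under the 50,000-URL per-file limit. A hedged sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder child sitemaps; a plugin typically generates and dates these -->
  <sitemap>
    <loc>https://www.example.com/post-sitemap1.xml</loc>
    <lastmod>2016-04-01T00:00:00+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/post-sitemap2.xml</loc>
    <lastmod>2016-04-01T00:00:00+00:00</lastmod>
  </sitemap>
</sitemapindex>
```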

  • I have more than 3,500 pages on my website. Please let me know the best sitemap plugin for it.

    | Michael.Leonard
    1

  • Hey Mozzers! I recently started working with a new Magento programmer for our ecommerce site. He sent me this scan/report outlining some security issues that need to be addressed. This is a new partnership so I'm not sure which issues should be a major concern, or if I should not focus on them. Would you be able to give me your opinion on the importance of the security issues? https://www.magereport.com/scan/?s=http://metallumcreations.com/

    | localwork
    0

  • I changed my blog URL structure. Can someone please let me know how to solve this?

    | Michael.Leonard
    0

  • Hi guys, I was wondering if there is a tool or way to pull link data for a list of URLs/pages at once into one single file with Ahrefs or Majestic. I know Scrapebox can do this with OSE, but I'm looking for a way to do this with the other backlink databases. Any ideas? Cheers.

    | jayoliverwright
    0

  • The attached chart shows what we believe is one of our domains caught in the Google sandbox for the last 6 months; the site has built significant, relevant, high-quality backlinks and content and is outperforming the majority of sites ranked in the top 10. The site was a reset project of a site caught in Panda; however, we tried not to use any 301 redirects or linkages between the sites. After 6 months without organic traffic, I am leaning on the community for advice: could this still be legitimate Google sandboxing, or should we try to repoint our external links to a new domain and see if the site can gain rankings more quickly? In past projects, the Google sandbox issue has resolved within 2 months. Any advice welcome.

    | spanish_socapro
    0


