Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Is it OK to serve different HTML and different content on different mobile browsers even though the URL is the same, or can the site get penalized?

    | vivekrathore
    0

  • Hi Mozzers, I'm looking for some feedback regarding best practices for setting up a robots.txt file in Magento. I'm concerned we are blocking bots from crawling information that's essential for ranking. My main concern is with blocking JavaScript and CSS: are you supposed to block JavaScript and CSS or not? You can view our robots.txt file here. Thanks, Blake
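    For context, Google's guidelines recommend not blocking the JavaScript and CSS needed to render pages. A minimal sketch of what the relevant rules might look like - the Magento paths here are typical defaults, not taken from the actual file in the question:

    ```text
    User-agent: *
    # Keep crawlers out of Magento's admin and checkout areas
    Disallow: /admin/
    Disallow: /checkout/
    Disallow: /customer/
    # Do NOT block the assets Google needs to render the page
    Allow: /js/
    Allow: /skin/
    Allow: /media/
    ```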

    | LeapOfBelief
    0

  • We have a large consumer website with several sections that have navigation of several pages. How would I prevent the pages from getting duplicate content errors, and how best would I handle SEO for these? For example, we have about 500 events with 20 events showing on each page. What is the best way to prevent all the subsequent navigation pages from getting duplicate content and duplicate title errors?
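    One standard treatment for paginated series like this is rel="next"/"prev" markup, which tells Google the pages form a sequence rather than duplicates. A sketch with hypothetical URLs:

    ```html
    <!-- In the <head> of /events/page/2 -->
    <link rel="prev" href="http://www.example.com/events/page/1">
    <link rel="next" href="http://www.example.com/events/page/3">
    ```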

    | roundbrix
    0

  • Instead of a URL such as domain.com/double-dash/, programming wants to use domain.com/double--dash/ for some reason that makes things easier for them. Would a double dash in the URL have a negative effect on the page's ranking?

    | CFSSEO
    0

  • Hello Moz Community! It seems like there are two opinions coming directly from Google on tabbed content: 1) John Mueller says here that tabbed content is indexed but discounted; 2) Matt Cutts says here that if you're not using tabs deceptively, you're in good shape. I see this has been discussed in the Moz Q&A before, but I have an interesting situation: the pages I am building have ~50% static content and ~50% tabbed content (only two tabs). Showing all tabbed content at once is not an option. Since the tabbed content will make up 50% of the total content, it's important that it is 100% weighted by Google. I can think of two ways to show it: 1) Standard tabs using jQuery. Advantage: both tab 1's and tab 2's content is indexed. Disadvantage: tabbed content may be discounted? 2) Make the content of the tabs conditional on the server side: website.com/page/ only shows tab 1's content in the HTML; website.com/page/?tab=2 only shows tab 2's content in the HTML and includes rel="canonical" pointing to website.com/page/. Advantage: content of tab 1 is indexed and 100% counted by Google. Disadvantage: content of tab 2 is not indexed. Which option is best? Is there a better solution?
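    For reference, the canonical in option 2 would be a single head tag on the parameterised URL (URLs taken from the question):

    ```html
    <!-- Served on website.com/page/?tab=2 -->
    <link rel="canonical" href="http://website.com/page/">
    ```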

    | jamiestu13
    0

  • We host an online fashion magazine on a subdomain of our e-commerce site. Currently we host the blog, which is WordPress-based, on a subdomain, e.g. stylemag.xxxxxxx.com. First question: are all the links from our blog considered internal links? They do not show in the backlinks profile. Also, would it be better to host this on its own domain? Second question: is my main URL getting credit for the unique content published to the blog on the subdomain, and if so, is it helping the overall SEO of my website more than if it and the links were hosted on its own wordpress.com?

    | kushvision
    0

  • From an SEO perspective, is the title tag more important than the description tag? We use a set format for these tags on our real estate web site. The site contains 300 listings. Sample title tag:
    Greenwich Village | Office Space Rental | 2300SF $9583/month Sample description tag:
    Classic Greenwich Village office rental. Hardwood floors, 11' ceilings. 5 oversized windows. 24/7 attended lobby. Renovated common areas. Below market rent. Are we shooting ourselves in the foot by repeating the square footage and monthly rent amounts in the title tag? Should this tag instead use shorter, more descriptive terms so as to maximize the SEO benefit? Should these numbers be listed in the description tag? The listings are not heavily SEO-optimized, so I don't know whether this is really a non-issue.
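    In markup, the sample tags above would look something like this (values copied from the question):

    ```html
    <head>
      <title>Greenwich Village | Office Space Rental | 2300SF $9583/month</title>
      <meta name="description"
            content="Classic Greenwich Village office rental. Hardwood floors, 11' ceilings. 5 oversized windows. 24/7 attended lobby. Renovated common areas. Below market rent.">
    </head>
    ```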

    | Kingalan1
    0

  • My site is set up at http://www.site.com. I have the site redirected from non-www to www in the .htaccess file. My question is: what should my robots.txt file look like for the non-www site? Do you block robots from crawling the site like this, or do you leave it blank? User-agent: * Disallow: / Sitemap: http://www.morganlindsayphotography.com/sitemap.xml Sitemap: http://www.morganlindsayphotography.com/video-sitemap.xml
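    Worth noting: since the non-www host 301s every request (including /robots.txt) to www, it never serves a robots file of its own; only the www version is fetched. A sketch of an ordinary, non-blocking robots.txt for the www site, using the sitemap URLs from the question:

    ```text
    User-agent: *
    Disallow:

    Sitemap: http://www.morganlindsayphotography.com/sitemap.xml
    Sitemap: http://www.morganlindsayphotography.com/video-sitemap.xml
    ```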

    | morg45454
    0

  • A client site had the following URLs for all blog posts: www.example.com/health-news/sample-post, where www.example.com/health-news is the top-level page for the blog section. While making some theme changes during Google's mobilegeddon, the permalink structure got changed to www.example.com/sample-post ("health-news" got dropped from all blog post URLs). Google has indexed the updated post structure, and older URLs are getting redirected (if entered directly in the browser) to the new ones; it appears that WordPress takes care of that automatically, as no 301 redirects were entered manually. It seems that there hasn't been any loss of rankings (however, not 100% sure, as the site ranks for well over 100 terms). Do you suggest changing the structure back to the old one? Two reasons that I see are preserving any link juice from domains linking to old URLs and ensuring no future/current loss of rankings.
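    If the structure were changed back, explicit 301s could cover the interim bare-slug URLs rather than relying on WordPress's automatic redirects. A hypothetical .htaccess sketch (path names from the question, one rule per post):

    ```apache
    # 301 an interim URL back to its original location under /health-news/
    Redirect 301 /sample-post /health-news/sample-post
    ```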

    | VishalRayMalik
    0

  • Hi All, I use pagination (rel="prev", rel="next") and a canonical tag on my paginated pages, and also on my View All page, to point to my root page (page 1). My thought is: am I missing a trick by having the same H1 and H2 tags on each of my paginated pages? Should I have different ones on, say, pages 2, 3, 4 and View All, to give the collective page better SEO? Just wondered what people's thoughts here were. Thanks, Pete

    | PeteC12
    0

  • Hi All, I hope someone can answer this question, because I haven't found a clear solution on the internet so far. I have 1 desktop website (say www.example.com) and different mobile websites for each main device (say iphone.example.mobi, android.example.mobi, winphone.example.mobi). In order to optimize my mobile websites according to Google's guideline for the separate-URLs configuration above, I should add a link rel="alternate" media tag on the desktop page and a canonical tag on the corresponding mobile page, in order to create a connection between them. But I need to keep a 1-to-1 connection between desktop page and mobile page (Google recommends having 1 desktop page linked to 1 mobile page and vice versa, and discourages 1-to-many connections). What I would like: in my case, I would have to add, to a single page of the desktop site (e.g. www.example.com/category1/), 3 link rel="alternate" media tags (one for iphone.example.mobi, one for android.example.mobi and one for winphone.example.mobi). Furthermore, I would have to add, on every corresponding mobile page of the 3 mobile site versions, a canonical tag pointing to my desktop page www.example.com/category1/. Now my worries are: is having a single desktop page with 3 different link rel="alternate" tags pointing to 3 different mobile websites (one each) aligned with Google's mobile SEO guideline or not? If not, how should I configure my desktop website and my 3 mobile web applications (iPhone, Android, Windows Phone) in order to follow Google's requirements for the separate-URLs configuration? Thanks, Massimliano
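    For illustration, the standard bidirectional annotation for a single separate mobile URL looks like this (hostnames from the question; the media query value is Google's documented example and an assumption here):

    ```html
    <!-- On the desktop page: www.example.com/category1/ -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://iphone.example.mobi/category1/">

    <!-- On the mobile page: iphone.example.mobi/category1/ -->
    <link rel="canonical" href="http://www.example.com/category1/">
    ```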

    | AdiRste
    0

  • Hi, I was thinking: if I had 4 pages, each of them optimized for a specific keyword, but set a canonical URL to another page, would that other page rank for the 4 specific keywords? Ex: Page 1 - Shoes
    Page 2 - Sneakers
    Page 3 - Socks
    Page 4 - Feet
    All set the canonical URL to Page 5. Will Page 5 rank for all four of these keywords?
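    The setup being described, as markup (URLs hypothetical):

    ```html
    <!-- Placed in the <head> of pages 1 through 4 -->
    <link rel="canonical" href="http://www.example.com/page-5/">
    ```

    Worth noting as a general caution: rel="canonical" is a hint, not a directive, and Google typically honours it only when the pages are near-duplicates of the target.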

    | PedroVillalobos
    0

  • So, if I want to block any URL from being indexed that contains a particular parameter, what is the best way to put this in the robots.txt file? Currently I have:
    Disallow: /attachment_id where "attachment_id" is the parameter. The problem is, I still see these URLs indexed, and this has been in the robots.txt now for over a month. I am wondering if I should just do Disallow: attachment_id or Disallow: attachment_id= but figured I would ask you guys first. Thanks!
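    For what it's worth, Disallow: /attachment_id only matches paths that literally begin with /attachment_id. To match the parameter anywhere in the URL, crawlers that support Google's wildcard syntax need a leading *:

    ```text
    User-agent: *
    # Block any URL whose query string contains the attachment_id parameter
    Disallow: /*attachment_id=
    ```

    Also worth remembering: robots.txt blocks crawling, not indexing, so URLs that were already indexed can linger in results after the rule is added.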

    | DRSearchEngOpt
    0

  • Hey Mozzers, this is a question which has been bugging me for a while now. I have an authority site in my niche which has a stronger DA than pretty well every competitor, but certain sections of the site underperform. For instance, when you search for 'Jerusalem Dead Sea tour', my item, http://www.touristisrael.com/tours/jerusalem-dead-sea-day-tour/, does not appear in the first few pages. I have a page that appears on the first page, but it is less relevant than this product page. This is one example; there are tens of cases like this. So the question is: am I signalling to Google not to rank these pages, and is there something I'm missing with regards to strengthening product pages in this tour section? Thanks

    | ben10001
    0

  • Hi, this is to spark a debate rather than a question with a specific answer. Google may claim that being on the certified programme doesn't increase your ranking, but part of their algorithm looks at whether a website is trustworthy. To get accepted onto Certified Shops you have to prove you're a trustworthy, reliable business that constantly gets audited. So surely this must directly or indirectly be a ranking factor? Just thought I would throw it out there for a debate.

    | Andy-Halliday
    1

  • I am wanting to get some sitelinks in the SERPs to increase the size of my "space". Has anyone found a way of getting them? I know Google says it's automatic and only generated if they feel it would benefit searchers, but there must be a rule of thumb to follow. I was thinking along the lines of a tight categorical system that is implemented throughout the site and is clearly related to the content (how it should be, I guess)... Any comments or suggestions welcome.

    | CraigAddyman
    0

  • We want to expand to a few new regions internationally. My question is: if we register sites in different geographies and upload our exact site to these web addresses (exact duplicates), our web addresses will then be www.mysite.co.uk (current site), www.mysite.com (new intended site) and www.mysite.com.au (new intended site), and we would add rel="canonical" linking elements to prevent duplicate content issues. Will our content production on our current site www.mysite.co.uk retain its value within all the other sites? Is this the best way to do it? Thanks in advance!
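    One relevant piece of markup here (not mentioned in the question, but the standard tool for regional duplicates) is hreflang, which tells Google which regional version to serve to which market; unlike a cross-site canonical, it lets every regional site rank in its own region. A sketch using the question's hostnames, with the language/region codes as assumptions:

    ```html
    <!-- Placed on every version of the page, each listing the full set -->
    <link rel="alternate" hreflang="en-gb" href="http://www.mysite.co.uk/">
    <link rel="alternate" hreflang="en-au" href="http://www.mysite.com.au/">
    <link rel="alternate" hreflang="en"    href="http://www.mysite.com/">
    ```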

    | aquaspressovending
    0

  • We recently set up cross-domain canonicals for 2 of our websites. What's interesting is that when I do a search for site:domain1.com "product name", the title in the SERPs uses the domain name from the site the page has been canonicaled to. So the title for domain1 (for the search above) looks like this: Product Name | Keywords | Domain 2. Interesting quirk. Has anyone else seen this?

    | AMHC
    0

  • My domain authority dropped by 9 points and I haven't done anything differently since the last scan. What is going on?

    | infotrust2
    0

  • Hi, when you Google "Los Angeles divorce attorney", you will see this site on the 5th page of the SERPs: www.berenjifamilylaw.com/blog/. For some reason, Google is serving the blog page as opposed to the homepage. This has been going on now for several weeks. Any tips on how to fix this? Obviously, the homepage is more relevant and has more links going to it, so I'm not sure why it's happening. Would you just leave it alone? Would you use robots.txt to block Google from crawling the blog page? Thanks.

    | mrodriguez1440
    0

  • Is there any way to disallow URLs ending in a certain value? For example, if I have the following product page URL: http://website.com/category/product1, and I want to disallow /category/product1/review, /category/product2/review, etc. without disallowing the product pages themselves, is there any shortcut to do this, or must I disallow each such page individually?
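    For crawlers that honour Google's wildcard extensions, * matches any sequence of characters and $ anchors the end of the URL, so one rule can cover all of these (paths from the question):

    ```text
    User-agent: *
    # Block any URL ending in /review, without touching the product pages above it
    Disallow: /category/*/review$
    ```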

    | jmorehouse
    0

  • Due to URL changes and parameters on our ecommerce sites, we have a massive amount of duplicate pages indexed by google, sometimes up to 5 duplicate pages with different URLs. 1. We've instituted canonical tags site wide. 2. We are using the parameters function in Webmaster Tools. 3. We are using 301 redirects on all of the obsolete URLs 4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals. 5. I created HTML sitemaps with the duplicate URLs, and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed. None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of them having 301 redirects. Google also appears to be ignoring many of our canonical tags as well, despite the pages being identical. Any ideas on how to clean up the mess?

    | AMHC
    0

  • Hey everyone, so we are currently working on a new website and are in the final stages right now. We have some plans for a brand name change too, and there is some debate internally over whether we should: a) roll out the new site now and hold off on the rebrand - let the redirects kick in and the site bed in, so to speak - then, when the dust settles, look at a domain name change; or b) roll out the new site with the domain name change too - an all-in change. A bit of background on the changes being made: the new website will have some structural changes, but the main blog content will remain the same - this is where we get the majority of our traffic. The blog will have a slight page-layout change, but the core content, structure, URLs, etc. will be exactly the same. The core website surrounding the blog will change, with 301 redirects from old, out-of-date content pages consolidated to fewer, more relevant pages. I hope I've explained enough here; if not, please let me know and I'll add more detail.

    | hotchilidamo
    0

  • My website name has changed in the title, but only shows up sometimes in the SERPs. What can I do to ensure the new name is the name that always shows up? It's been a month since the change and we have submitted a new sitemap. Here's one example: http://www.building.govt.nz/blc-building-act. In Google (for New Zealand building code) it shows up as Building Act - Department of Building and Housing. Any ideas?

    | DanielleNZ
    0

  • I've been getting mixed reviews on this and I'm trying to figure out whether this is something I should be concerned with. We have a higher-than-recommended amount of code relative to "content", but not by a crazy amount. Thanks!

    | absoauto
    0

  • Our site had a huge drop in bounce rate in one day (it went from 10% to 3%) and it has stayed that way. What could cause this?

    | navidash
    0

  • Hello, we run an online store. The main content keyword for our niche is very competitive, but if I were one of our customers looking up information, that is exactly what I would type in - this main general keyword. We have an expert in the field to write it, and plenty of time. Although the main keyword is competitive, there are many, many sub-keywords that are a lot less competitive that would be answered in the article. It's tough to find good topics in this niche. We're thinking about doing a "Complete Guide to X". For about half of the 30 keywords it will cover, we would have far fewer backlinks and less authority than our main competitors. Should we do this and spend the next couple of years working on it, or should we perhaps target a smaller topic? Any advice is appreciated.

    | BobGW
    0

  • Hello, I have a question about page titles. How important is branding here? I'm not referring to the company name, but rather the terminology that's used as "branding language" for a company. For example, let's say that it would be a good idea to target the keyword "Restaurant Coupons" based on search volume and competition. However, our branding adheres to the language "Dining Offers". Is it considered a bad idea to use "Restaurant Coupons" in the page title? Or is that considered inconsistent branding? Basically, I'm just trying to figure out the correct balance between the SEO value of words and adhering to a company's branding. Any help is appreciated! Thanks,
    Nick

    | atmosol
    1

  • So I work for a company that has a very successful affiliate that operates under a third-level domain name such as "region.company.com". Their SEO practices are very good and they rank highly in keyword searches. However, "company.com" does not, even though it is not a subdomain. Even after optimizing company.com's pages etc., the regional subdomain ranks much higher for keywords and the main company fails to rank at all. Is Google discounting the main company's page? Is it a matter of trust or time, or is it something else? How can I get Google to prioritize the main company website rather than a lower-level domain affiliate?

    | Resolute
    0

  • Not too long ago, Dublin Core was all the rage. Then Open Graph data exploded, and Schema seems to be highly regarded. In a best-case scenario, on a site that's already got the basics like good content, clean URLs, rich and useful page titles and meta descriptions, well-named and alt-tagged images and document outlines, what are today's best practices for microdata? Should Open Graph information be added? Should the old Dublin Core be resurrected? I'm trying to find a way to keep markup light and minimal, but include enough microdata for crawlers to get a better sense of the content and its relationships to other subdomains and sites.
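    As a concrete point of comparison for "light and minimal", the basic Open Graph block is only four meta tags (the values here are hypothetical):

    ```html
    <meta property="og:title" content="Example Article Title">
    <meta property="og:type"  content="article">
    <meta property="og:url"   content="https://www.example.com/article">
    <meta property="og:image" content="https://www.example.com/article.jpg">
    ```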

    | WebElaine
    0

  • I work for an online theater news publisher. Our article page titles include various pieces of data: the title, publication date, article category, and our domain name (theatermania.com). Are all of these valuable from an SEO standpoint? My sense is that it'd be cleaner to just show the title (and nothing more) on a SERP. But we'll certainly keep whatever helps us with rankings.

    | TheaterMania
    0

  • When building very large ecommerce sites, the catalog data can have millions of product SKUs and a massive quantity of hierarchical navigation layers (say 7-10) to get to those SKUs.  On such sites, it can be difficult to get them to index substantially.  The issue doesn’t appear to be product page content issues.  The concern is around the ‘intermediate’ pages -- the many navigation layers between the home page and the product pages that are necessary for a user to funnel down and find the desired product.  There are a lot of these intermediate pages and they commonly contain just a few menu links and thin/no content.  (It's tough to put fresh-unique-quality content on all the intermediate pages that serve the purpose of helping the user navigate a big catalog.)  We've played with NO INDEX, FOLLOW on these pages.  But structurally it seems like a site with a lot of intermediate pages containing thin content can result in issues such as shallow site indexing, weak page rank, crawl budget issues, etc.  Any creative suggestions on how to tackle this?
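    The NOINDEX, FOLLOW experiment mentioned above amounts to a single head tag on each intermediate navigation page:

    ```html
    <!-- Keep the page out of the index, but let crawlers follow its menu links -->
    <meta name="robots" content="noindex, follow">
    ```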

    | AltosDigital-1
    0

  • We have two products: - loan for a new car
     - loan for a second-hand car Except for the title tag, meta description and H1, the content is of course very similar. Are these pages considered duplicate content? https://new.kbc.be/product/lenen/voertuig/autolening-tweedehands-auto.html
    https://new.kbc.be/product/lenen/voertuig/autolening-nieuwe-auto.html Thanks for the advice,

    | KBC
    0

  • Hey there, awesome Mozzers! I have a site that automatically redirects people by geolocation (I know that's probably not good) to the various languages of the site. I just wanted to know: is 301 or 302 the best option? (I've heard that for language redirections a 302 is the best-case scenario.) My main page, for example, is www.example.com, and it automatically redirects with a 301 to www.example.com/en for any language that is not there. What is the best-case scenario? Leave it redirecting to /en, or just let it go to the root page www.example.com?

    | Angelos_Savvaidis
    0

  • Hey Moz fans, so this week I noticed a significant drop in rankings... But what caught my attention is that one specific keyword dropped 18 positions, and all the others just 1-3. Print screen: http://prntscr.com/7fb4g4 Do you think it's possible that the drop of that page, which went 18 positions down, brought the whole domain down? Or is it another cause?

    | Kokolo
    0

  • Hi All! We have nice new functionality on our website, but now I doubt whether we will have SEO issues - duplicate content, and whether Google is able to spider our website. See: http://www.allesvoorbbq.nl/boretti-da-vinci-nero.html#608=1370
    With the new functionality we can switch between colors of the models (black / white / red / yellow).
    When you switch, the content of the other model is fetched with Ajax without refreshing the page (so the initial part of the URL stays the same (for the initial model); only the part behind the # changes). The other models are also accessible by their own URLs, like the red one: http://www.allesvoorbbq.nl/boretti-da-vinci-rosso.html#608=1372 So far so good. But now the questions: 1. We used to have URLs like /boretti-da-vinci-nero.html - our canonical is that way too. But now, if we access that URL, our system automatically adds the #123-123 to the URL to indicate which model (color) is shown. Is this hurting SEO or confusing Google, given that it seems the clean URL is not accessible anymore (it now adds #123-123)? 2. Should we add some tags around the different types (colors) to prevent Google from indexing that part of the website? Any info would be very helpful! We do not want to lose our nice rankings, thanks to Moz! Thanks all!
    Jeroen

    | RetailClicks
    0

  • Howdy... I'm facing something weird with my domain: proudlylived.com
    There's another domain that points to my domain in search results: www.animaisfotos.com. I went to check if it's a 301/302 redirect, and it shows that there isn't any redirection at all; it just shows a 200 status. Now I don't know what is happening, and whatever it is, I want to cancel it, because it shows the second domain instead of mine in search results. Any suggestions?

    | MasdrSE
    0

  • Hi everyone, I run a travel-related website and work with various affiliate partners. We have thousands of pages of well-written and helpful content, and many of these pages link off to one of our affiliates for booking purposes. Years ago I followed the prevailing wisdom and cloaked those links (bouncing them into a folder that was blocked in the robots.txt file, then redirecting them off to the affiliate). Basically, doing as Yoast has written: https://yoast.com/cloak-affiliate-links/ However, that seems kind of spammy and manipulative these days. Doesn't Google talk about not trying to manipulate links and redirect users? Could I just "nofollow" these links instead and drop the whole redirect charade? Could cloaking actually work against you? Thoughts? Thanks.

    | TomNYC
    0

  • I have had the setting of "let Googlebot decide" for managing my URL parameters on an e-commerce site in Magento. The products I sell come in different sizes, colors, finishes etc. These parameters are showing up in Google Webmaster Tools, set to "let Googlebot decide". Some of them have as many as 8 million URLs monitored. I changed the editing option to mark these parameters as "narrow searches", but still left the option at "let Googlebot decide" (versus blocking URLs). Will blocking these erroneous URLs serve any benefit? Does blocking them help with crawling/SEO?

    | nat88han
    0

  • The client specializes in home, commercial and restoration cleaning services and offers carpet, upholstery, area rug, wood floor, drapery, tile and grout, stone and restoration services (water damage, fire damage, mold remediation). This company has over 40 franchises. Carpet cleaning is their core service that gets them to the customer's door; then technicians get to upsell the secondary services (tile, upholstery, stone, wood...). One of the main strategies we have implemented successfully to be more visible at the local level was a local SEO strategy, with every location having its own unique landing pages for each of the services it offers (for instance, the San Diego location would have a customized page for carpet, upholstery and all the services it offers). We have done a great job optimizing each of these locations. Optimization includes on-page optimization, unique NAP information, and local citations (manual insertions + Yext). We also added local markup, and for some of the franchises we added review snippets. Link building around carpet cleaning has been conducted as well, through guest posting and in-content links. Most of our locations have a Google business listing, updated and optimized as well. We are working to get as many reviews as possible, but it is still very challenging. In summary, basic SEO tactics have been implemented following Google's guidelines. Traffic and rankings got a positive, progressive boost in mid-2013 (April to August), but in April 2014 the site got hit by a manual penalty affecting carpet cleaning queries only. I was able to clean up the mess within 2 months, luckily, but unfortunately we still saw a drop of 40% in traffic (vs 2013) on average across all carpet cleaning pages YoY (April to August). 2015 Q1 traffic has improved by 6% compared to Q1 2014, which is good, but still not at the level we were.
With the Pigeon update, all the highly authoritative directories (Yelp, Angie's List) taking over more and more of the organic real estate in the SERPs, and an increase in competition, we have had a hard time getting back to where we were (2013), and we may never get back unless another algorithmic change happens. Another frustrating thing is local competition, which has the worst sites as far as UX and content and still outranks us (such as http://www.carpetcleaninglosangeles.com/). My main goal is to figure out a plan to increase traffic within the carpet cleaning pages and therefore increase conversions. Like it or not, rankings for carpet cleaning queries affect our CC traffic, so working towards improving them is one way to go, even though I shouldn't focus all my efforts on just rankings. 2015 SEO main activities have been:
    - local link building = somewhat successful (seeing some rankings improvements, but not consistent across all franchises)
    - content marketing projects = quite successful as far as traffic, branding and link acquisition, but not seeing enough ROI
    - new web design (launched late 2014)
    - Google business reviews
    - local citation duplicate removal
    - weekly blogging (successful as far as traffic and branding) Things I would like to work on:
    - improve bounce rate within the site
    - improve CTR by adding review snippets across all franchises
    - add industry certification logos to build trust with users and improve conversion
    - add before and after pictures of services performed
    - site speed (it has slowed down compared to the old site) I would love to get feedback on what other crucial components (that I am missing) could improve most of these franchises' rankings. I am a bit out of ideas as far as what else can be done. Thanks!

    | Ideas-Money-Art
    0

  • I know that Google is a mystery, so I am not sure if there are answers to these questions, but I'm going to ask anyway! I recently realized that Google is not happy with duplicate photo content. I'm a photographer and have sold many photos in the past (but retained the rights to them) that I am now using on my site. My recent revelation means I'm now taking down all of these photos. So I've been reverse-image-searching all of my photos to see if I let anyone else use them first, and in the course of this I found out that many of my photos are being used by other sites on the web. So my questions are: 1. With photos that I used first and others have stolen, if I edit these photos (to add copyright info) and then re-upload them, will the sites that are using these images then get credit for using the original image first? 2. If I have a photo on another one of my own sites and I take it down, can I safely use that photo on my main site, or will Google retain the knowledge that it's been used somewhere else first? 3. If I sold a photo and it's being used on another site, can I safely use a different photo from the same series that is almost exactly the same? I am unclear what data from the photo Google is matching, and whether they can tell the difference between photos that were taken a few seconds apart.

    | Lina500
    0

  • We own Discount Banner Printing and we are trying to rank #1 for "pvc banners" or "vinyl banners", and cannot understand, for example, how the below is correct. We did suffer a link penalty years ago, but we fixed this, and the domain has some good links (more, and of better quality, than the sites above us), so we cannot understand how we rank below most of the sites above us. If we type in, for example, "pvc banners" we get: http://www.bannershop.co.uk/cats/pvc_banners.htm https://www.hfe-signs.co.uk/banners.php http://bannerprintingandroid.co.uk/pvc-banners/ http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html (our website) And if we type in "vinyl banners" we get: http://www.vistaprint.co.uk/banners.aspx http://www.bigvaluebanners.co.uk/ http://vinylbannersprinting.co.uk/ http://www.discountdisplays.co.uk/html/vinyl_banners.html https://www.buildasign.co.uk/banners http://www.monkey-print.com/outdoor banners/budget-outdoor-banners http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html (our website)

    | BobAnderson
    0

  • I have an e-commerce site that is on HTTPS. We have a WordPress blog for blogging, but we also have our help section located on it. I used a plugin to switch the blog to HTTPS, but now have a few problems: 1. My sitemap generator still shows the blog as HTTP, and Google gives me a warning for the redirect. 2. When trying to use the Moz page grader, I was told that I was in a redirect loop. 3. The pages do not seem to be getting indexed. It is a blog, so there is never any private information exchanged. Would I be OK with just switching it back to HTTP? Or would Google see that as two different sites even though they have the same domain?

    | EcommerceSite
    0

  • As far as I know, it's not normally possible for a website to rank for a keyword that is not mentioned on the website. I have seen a website that ranks very well for key terms, and yet they are not mentioned anywhere on the website. I have run advanced searches and checked my findings using tools, including a cloaking checker. How can this be?

    | lee-murphy
    0

  • Hi All, I have at least two websites with an opt-in form that isn't visible all the time (cookie related). Also, the thank-you message (obviously) appears only after signing up. Here is how the form copy looks:
    "Sign up to receive exclusive updates and special offers"
    "Congratulations on joining our email list! You will be the first to know about exciting news, red carpet updates, special offers, and much more."
    "There seems to be a problem with signup. Please try again later."
    What do you think? Could this penalize my website, or do you find it legit?

    | seoperad
    0

  • Hello everybody, I have a problem with some pages of my website. I had to remove 5-10 pages because they linked to 404 pages. Do I need to tell Google, or is removing them enough? Thanks so much

    | pompero99
    0

  • Hi Mozzers, I've been looking at the View Source of my landing pages, and it looks to me like my H1 tag etc. is not in the head but in the body. My developer says it's in the correct place, but can someone please confirm, as it looks wrong to me? Short URL link - http://goo.gl/vfXeut Many thanks, Pete

    | PeteC12
    0
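    For reference on the question above: an H1 actually belongs in the `<body>`, not the `<head>` — only metadata elements (`<title>`, `<meta>`, `<link>`, scripts/styles) go in the head. A minimal sketch of a hypothetical page:

    ```html
    <!DOCTYPE html>
    <html>
    <head>
      <!-- Metadata only: title, meta, link -->
      <title>Example Landing Page</title>
    </head>
    <body>
      <!-- Headings are page content, so they live in the body -->
      <h1>Example Landing Page Heading</h1>
    </body>
    </html>
    ```

    So a developer placing the H1 in the body is doing it correctly.
    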

  • OK – bear with me on this… I am working on some pretty large ecommerce websites (50,000+ products) where it is appropriate for some individual products to be placed within multiple categories / sub-categories. For example, a Red Polo T-shirt could be placed within: Men’s > T-shirts >
    Men’s > T-shirts > Red T-shirts
    Men’s > T-shirts > Polo T-shirts
    Men’s > Sale > T-shirts
    Etc. We’re getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure – "Top 10 tips on wearing a t-shirt" (obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the locations mentioned above, make sure all polo shirts (no matter what colour) have a canonical set to Men’s > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page – which obviously presumes "polo shirts" gets more search volume than "red t-shirts". I presume this is the best option even though it is very difficult to manage, particularly with a large inventory – and, from experience, taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at the root level and develop a dynamic breadcrumb trail – so all roads can lead to that one instance of the product. There's no need for canonicals; no need for ecommerce managers to remember which primary category to assign product types to; and keeping everything at root level also means there's no reason to worry about redirects if products move from sub-category to sub-category, etc. What do you think is the best approach? Do thousands of canonicals and redirects look 'messy' to a search engine over time? Any thoughts and insights greatly received.

    | AbsoluteDesign
    0
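    For readers unfamiliar with the mechanics discussed above: a canonical is a `<link>` element placed in the `<head>` of each duplicate URL, pointing at the preferred version. A minimal sketch, using hypothetical URLs:

    ```html
    <!-- Placed in the <head> of /mens/sale/t-shirts/red-polo-shirt
         and every other duplicate URL for the same product -->
    <link rel="canonical" href="https://www.example.com/mens/t-shirts/polo/red-polo-shirt" />
    ```

    With root-level product URLs and breadcrumb-only categorization, each product has a single URL, so no such tag is needed — which is the management trade-off the question weighs.
    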

  • I have a website with a strong root domain (ranking for many terms), but the subpages (articles) don't rank well. My feeling is that link juice is not flowing to them (not enough, anyway). When I run site:http://mydomain.com, my root is the first result and the next many results are tag pages on my site. I have around 180 indexed pages, and I need to go down to result #50, give or take, before I see any subpage using the site: command. My website theme puts the tags on every page possible. The tags are useful for my viewers, but not SEO-useful, and I fear they are diluting my link juice. Should I nofollow and noindex them? Noindex makes sense (the tag pages are just duplicate content featuring snippets of text from the articles), but nofollow would make sense too, since I wouldn't send any link juice through the tags. What would you guys do? Best regards

    | claus10
    1
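    For anyone weighing the same choice: `noindex, follow` keeps tag pages out of the index while still letting link equity flow through the links on them, whereas `noindex, nofollow` cuts both. A minimal sketch of the robots meta tag (placed in the `<head>` of each tag page):

    ```html
    <!-- Keeps the tag page out of the index but still passes
         link equity through the links it contains -->
    <meta name="robots" content="noindex, follow" />
    ```
    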

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.

