
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello, In the last week we have noticed an extremely large number of backlinks appearing in Google Webmaster Tools. One of the sites which links to us now has over 101,000 backlinks pointing to us, when in reality it should only be 300-600. We have checked that the websites have not been hacked (hidden links, etc.), but we cannot find anything. Has anyone else experienced problems with Google Webmaster Tools lately displaying far too many links? Or could this be a negative SEO attack which is yet to emerge? Thanks, Rob

    | tomfifteen
    0

  • Hi Mozzers Just noticed this pattern on a retail website... This URL product.php?cat=5 is also churning out products.php?cat=5&sub_cat= (same content as product.php?cat=5 but from this different URL - this is a blank subcat - there are also unique subcat pages with unique content - but this one is blank) How should I deal with that? and then I'm seeing: product-detail.php?a_id=NT001RKS0000000 and product-detail.php?a_id=NT001RKS0000000&cont_ref=giftselector (same content as product-detail.php?a_id=NT001RKS0000000 but from this different URL) How should I deal with that? This is a bespoke ecommerce CMS (unfortunately). Any pointers would be great 🙂 Best wishes, Luke  

    | McTaggart
    0
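One common way to handle parameter variants like these (offered as a sketch, not CMS-specific advice) is to point every variant at a single canonical URL via rel=canonical. The normalization logic might look like the following; the ignored-parameter list is an assumption for illustration, using the parameter names from the question:

```javascript
// Strip query parameters that are empty (e.g. "&sub_cat=") or on a
// known ignore list (e.g. "cont_ref"), so every duplicate variant
// maps back to one canonical URL.
const IGNORED_PARAMS = new Set(['cont_ref']); // hypothetical list

function canonicalUrl(url) {
  const u = new URL(url, 'http://example.com'); // base needed for relative URLs
  for (const [key, value] of [...u.searchParams.entries()]) {
    if (value === '' || IGNORED_PARAMS.has(key)) {
      u.searchParams.delete(key);
    }
  }
  return u.pathname + u.search;
}
```

The returned path would then be emitted in a `<link rel="canonical" href="...">` element on every variant page, so the blank-subcat and `cont_ref` versions consolidate to the clean URL.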

  • Hi all, Imagine if you will that I was the owner of many domains, say 100 demographically rich keyword domains, and my plan was to redirect these into one website - each into a different relevant subfolder. e.g. www.dewsburytilers.com > www.brandname.com/dewsbury/tilers.html www.hammersmith-tilers.com > www.brandname.com/hammersmith/tilers.html www.tilers-horsforth.com > www.brandname.com/horsforth/tilers.html another hundred or so 301 redirects... The backlinks to these domains are slim but relevant (the majority of the domains do not have any backlinks at all). Can anyone see a problem with this practice? If so, what would your recommendations be?

    | Fergclaw
    0

  • I've been working on removing duplicate content on our website. There are tons of pages created based on size, but the content is the same. The solution was to create a page with 90% static content and 10% dynamic content that changes depending on the "size". Users can select the size from a dropdown box, so instead of 10 URLs, I now have one URL. Users can access a specific size by adding a parameter to the end of the URL (?f=size1, ?f=size2). For e.g.: Old URLs: www.example.com/product-alpha-size1 www.example.com/product-alpha-size2 www.example.com/product-alpha-size3 www.example.com/product-alpha-size4 www.example.com/product-alpha-size5 New URLs: www.example.com/product-alpha-size1 www.example.com/product-alpha-size1?f=size2 www.example.com/product-alpha-size1?f=size3 www.example.com/product-alpha-size1?f=size4 www.example.com/product-alpha-size1?f=size5 Do search engines read these parameters or drop them? Will the rank juice be transferred to just www.example.com/product-alpha-size1?

    | Bio-RadAbs
    0
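Since every size variant shares content with the base URL, each variant page can declare the base as canonical, which also addresses the link-equity question: canonicalized variants consolidate their signals to the canonical URL. A minimal sketch, assuming "f" is the only size-selector parameter:

```javascript
// All "?f=sizeN" variants share content with the base product URL,
// so each variant page should declare the base URL as canonical.
function canonicalTag(variantUrl) {
  const u = new URL(variantUrl);
  u.searchParams.delete('f'); // "f" as the size selector is an assumption
  return `<link rel="canonical" href="${u.toString()}">`;
}
```

Each variant page would emit this tag in its `<head>`, so search engines index only the base URL regardless of which size parameter they crawled.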

  • Greetings MOZ Community: Our site is hosted on a virtual private server. Apparently there are dozens of other websites hosted on the same server. The performance is usually pretty fast, with the site loading in 1-3 seconds. However, a few times per month, the performance slows down to, say, 5-6 seconds. Please see the attached image. I suspect this may have something to do with the other websites on the server. Currently we pay about $60/month. A dedicated server would cost about $120/month. Would site performance be more consistent on a dedicated server? Could we enjoy potential SEO benefits by having our own server, i.e. could we rank slightly higher if the speed was more consistent and the performance slightly faster? Thanks, Alan

    | Kingalan1
    0

  • Here is an example of the link that is no longer on the website (broken link): http://www.weddingrings.com/item.cfm?str_shortdesc=UNIQUE The broken link was fixed to: http://www.weddingrings.com/item.cfm?str_shortdesc=UNIQUE CARRE CUT DIAMOND ETERNITY BAND&str_category=Diamond-Bands-and-Gold-Rings&grouping_id=9&category_id=21&int_item_id=6884 Would I still need to redirect the old broken link to the new fixed one using a 301 redirect?

    | alexkatalkin
    0

  • I inherited a site that used to be in Flash and used hashbang URLs (i.e.  www.example.com/#!page-name-here).  We're now off of Flash and have a "normal" URL structure that looks something like this:  www.example.com/page-name-here Here's the problem:  Google still has thousands of the old hashbang (#!) URLs in its index.  These URLs still work because the web server doesn't actually read anything that comes after the hash.  So, when the web server sees this URL  www.example.com/#!page-name-here, it basically renders this page www.example.com/# while keeping the full URL structure intact  (www.example.com/#!page-name-here).  Hopefully, that makes sense.  So, in Google you'll see this URL indexed (www.example.com/#!page-name-here), but if you click it you essentially are taken to our homepage content (even though the URL isn't exactly the canonical homepage URL...which s/b www.example.com/). My big fear here is a duplicate content penalty for our homepage.  Essentially, I'm afraid that Google is seeing thousands of versions of our homepage.  Even though the hashbang URLs are different, the content (ie. title, meta descrip, page content) is exactly the same for all of them. Obviously, this is a typical SEO no-no.  And, I've recently seen the homepage drop like a rock for a search of our brand name which has ranked #1 for months.  Now, admittedly we've made a bunch of changes during this whole site migration, but this #! URL problem just bothers me. I think it could be a major cause of our homepage tanking for brand queries. So, why not just 301 redirect all of the #! URLs?  Well, the server won't accept traditional 301s for the #! URLs because the # seems to screw everything up (server doesn't acknowledge what comes after the #). I "think" our only option here is to try and add some 301 redirects via Javascript. 
Yeah, I know that spiders have a love/hate (well, mostly hate) relationship w/ Javascript, but I think that's our only resort.....unless, someone here has a better way? If you've dealt with hashbang URLs before, I'd LOVE to hear your advice on how to deal w/ this issue. Best, -G

    | Celts18
    0
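Since the server never sees anything after the #, the redirect for these hashbang URLs does have to happen client-side, as the poster suspects. A minimal sketch of the mapping (URL shape taken from the question; Google has said it generally treats JavaScript redirects like permanent redirects, though that is not guaranteed):

```javascript
// Runs on the homepage: if the URL still carries an old "#!" fragment,
// compute the clean path so the page can redirect to it. Using
// location.replace() avoids leaving the hashbang URL in history.
function cleanPathFromHash(hash) {
  return hash.startsWith('#!') ? '/' + hash.slice(2) : null;
}

// In the browser it would be wired up roughly like this:
// const target = cleanPathFromHash(window.location.hash);
// if (target) window.location.replace(target);
```

Each clean page should also carry a rel=canonical to itself, so that however Google reaches the content, only one URL is indexed.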

  • Hey everyone, We're redesigning parts of our site and I have a tricky question that I was hoping to get some sound advice about. We have a blog (magazine) with subcategory pages that are quite thin. We are going to restructure the blog (magazine), feature different content, and have new subcategories. So we are trying to decide where to redirect the existing subcategory pages, e.g. Entertainment, Music, Sports, etc. www.charged.fm/magazine Our new ticket category pages (Concert Tickets, NY Yankees Tickets, OKC Thunder Tickets, etc.) are going to feature a tab called 'Latest News' where we are thinking of 301 redirecting the old magazine subcategory pages. So Sports News from the blog would 301 to Sports Tickets (# Latest News tab). See screenshot below for example. So my question is: Will this look bad in the eyes of the GOOG? Are these closely related enough to redirect? Are there any blatant pitfalls that I'm not seeing? It seems like a win/win because we are making a rich Performer page with News, Bio, Tickets and Schedule and getting to reallocate the link juice that was being wasted in a pretty much useless page that was allowed to become too powerful. Gotta keep those pages in check! Thoughts appreciated. Luke

    | keL.A.xT.o
    0

  • Is it harmful to have two of these which are identical in the section?

    | Sika22
    0

  • Right now, the only way I know to do schema markup is through hard coding it in the designated area where the information is located on the site. I'm thinking about developing an option in a WYSIWYG that allows for basic schema implementation to make it easier for multiple users of all experience levels. Does anyone have any examples, tools, or tips for making schema scalable? I want to do a good job with this, but the site I am working on has thousands of pages and multiple "owners."

    | Becky_Converge
    0
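One scalable approach (a sketch, not a recommendation for any particular CMS) is to generate JSON-LD from structured fields the editors already fill in, so non-technical users never touch the markup itself. The field names and the Product type here are hypothetical:

```javascript
// Build a JSON-LD <script> block from fields a WYSIWYG user supplies,
// so editors fill in a form and the markup is generated for them.
function productJsonLd({ name, description, price, currency }) {
  const data = {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name,
    description,
    offers: { '@type': 'Offer', price, priceCurrency: currency },
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Because JSON-LD lives in one block rather than being interleaved with the page HTML, it tends to scale across thousands of pages and many owners better than inline microdata.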

  • Hi, Anyone have any good suggestions about using commas, hyphens, vertical bar in the title tag and how it affects rankings? Thanks.

    | bjs2010
    0

  • Hello I work as an in-house SEO (previously worked for an agency) for the website naturalworldsafaris.com. After a strong start to the year we were seeing really good growth but since Panda 4.0 was released we have been steadily declining. Our site has, I believe, good, unique content and is largely free of technical issues. I'm struggling to identify what exactly is the issue for the drop. We've had several key terms drop from the top half of page 1 to page 2 of the SERPS, such as the term "Borneo Holiday" (on a Google UK search). I don't believe we have any duplicate content issues. We've had a few external SEO specialists take a look and none have come up with anything new. Site speed has been flagged as a concern but when compared to our competitors in the SERPS we are consistently one of the faster sites so while we are looking to improve this, I don't feel it can be the only issue. Any suggestions as to what else we should be investigating next would be much appreciated.

    | KateWaite
    0

  • Hi,
    I am trying to SEO optimize my webpage dreamesatehuahin.com. When I saw the SEO Moz webpage crawl diagnostics I got a big surprise due to the high number of errors. I don't know if these are the kind of errors that need to be taken very seriously in my particular case. When I look at the details I can see the errors are caused by the way my WordPress theme is put together. I don't know how to resolve this, but if it's important I might hire a programmer. DUPLICATE ERRORS (40 ISSUES, HIGH PRIORITY ACCORDING TO MOZ)
    They are all like this one:
    http://www.dreamestatehuahin.com/property-feature/restaurent/page/2/
    is equal to this one:
    http://www.dreamestatehuahin.com/property-feature/restaurent/page/2/?view=list This one exists
    http://www.dreamestatehuahin.com/property-feature/car-park/
    while the level above doesn't exist
    http://www.dreamestatehuahin.com/property-feature/ DUPLICATE PAGE TITLES (806 ISSUES, MEDIUM PRIORITY ACCORDING TO MOZ)
    This is related to search results and pagination.
    The title for each of these pages is the same:
    http://www.dreamestatehuahin.com/property-search/page/1 http://www.dreamestatehuahin.com/property-search/page/2 http://www.dreamestatehuahin.com/property-search/page/3 http://www.dreamestatehuahin.com/property-search/page/4 Title element is too long (405 issues)
    http://www.dreamestatehuahin.com/property-feature/fitness/?view=list
    These are not what I consider real pages, but maybe each actually is a page for Google. The title from the source code is auto-generated and in this case it doesn't make sense:
    <title>Fitness Archives - Dream Estate Hua Hin | Property For Sale And RentDream Estate Hua Hin | Property For Sale And Rent</title> I know at the moment there are probably more important things for our website, like content, titles, meta descriptions, and internal and external links, and we are looking into this and taking the whole optimization seriously. We have, for instance, just hired a content writer to rewrite and create new content based on keyword research. I WOULD REALLY APPRECIATE SOME EXPERIENCED PEOPLE'S FEEDBACK ON HOW IMPORTANT IT IS THAT I FIX THESE ISSUES, IF AT ALL POSSIBLE. best regards, Nicolaj

    | nicolaj1977
    1

  • My very large e-commerce client is about to undergo a site migration in which every product page URL will be changing. I am already planning my 301 redirect process for the top ~1,000 pages on the site (categories, products, and more) but this will not account for the more than 1,000 products on the site. The client specified that they don't want to implement much more than 1,000 redirects so as to avoid impacting site performance. What is the best way to handle these pages without causing hundreds of 404 errors on site migration day? Thanks!

    | FPD_NYC
    0
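A common way to stay under a redirect budget like this is a single pattern-based rule that covers every product URL, instead of one entry per product. In practice the rule would live in the server config (e.g. one rewrite directive), but the mapping logic can be sketched like this; the URL shapes and parameter name are hypothetical:

```javascript
// One pattern-based rule can cover every product page if the new URL
// can be derived from the old one, e.g. /product.php?id=SLUG -> /products/SLUG.
// Paths and the "id" parameter are hypothetical for illustration.
function redirectTarget(oldUrl) {
  const u = new URL(oldUrl);
  if (u.pathname === '/product.php' && u.searchParams.has('id')) {
    return `/products/${u.searchParams.get('id')}`;
  }
  return null; // fall through to the explicit top-1,000 redirect list (or 404)
}
```

This keeps the explicit redirect table small (the top ~1,000 pages) while every product still 301s correctly on migration day, assuming the new URLs are derivable from the old ones.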

  • Hi, Google Webmaster Tools tells me, that every blog category and blog post is missing: 'updated' 'author' I find this data under 'Structured Data' => The datatype is 'hentry'. Markup is microformats.org. Is this a problem for SEO? How can I fix this? Best, Robin

    | soralsokal
    0

  • Hi, I am migrating from my old website to a new one on a different server, with a very different domain and URL structure. I know it is best to change as little as possible, but I just wasn't able to do that. Many of my pages can be redirected to new URLs with similar or the same content. My old site has around 400 pages. Many of these pages/URLs are no longer required on the new site - should I 404 these pages or 301 them to the homepage? I have looked through a lot of info online to work this out but can't seem to find a definitive answer. Thanks for this!! James

    | Curran
    0

  • I am having some trouble with getting the landing pages for a client's website to show up in the SERPs.
    As far as I can see, the pages are optimized well, and they also get indexed by Google. The website is a Danish webshop that sells wine, www.vindanmark.com Take for instance this landing page, http://www.vindanmark.com/vinhandel/
    It is optimized for the keywords "Vinhandel Århus". Vinhandel means "wine store" and "Århus" is a Danish city. As you can see, I manage to get them onto page 1 (#10), but it's the frontpage that ranks for the keyword. And this goes for all the other landing pages as well. But I can't figure out why the frontpage keeps outranking the landing pages on every keyword.
    What am I doing wrong here?

    | InmediaDK
    1

  • It used to work for me on some sites - but maybe it's considered spammy these days? Any feedback appreciated.

    | bjs2010
    0

  • I have been tracking the ranks of some keywords important to my business for the last 2 months. Recently I have observed that, for one of my keywords, Google Webmaster Tools gives the average position as 8, but when I search in Google it comes up on the 6th page. I know that Webmaster Tools gives the average position, but I do not think there should be such a big difference in the ranks. Please help. Thanks.

    | seomoz1232
    0

  • So I've been killing myself with learning seo. My site has every type of markup and I feel that I have run out of ways to optimize my site. If anyone has any recommendations, it would be enormously appreciated. http://www.j-26.com/

    | jp26jp
    0

  • I understand the importance of keywords, but I also worry about the usability factor. Curious - has anyone ever studied the impact of calling your WP folder "blog" vs "long-primarykeyword"? I'm thinking of something generic: /blog
    /community
    /articles
    /info Vs a long keyword: /long-keyword/ Anyone have any input? Every time I search, I see things about Folders vs Subdomains, etc. Thanks everyone for your feedback!

    | inmn
    1

  • Hi everyone,
    We're a web agency and our site www.axialdev.com is not performing well. We have very little traffic from relevant keywords. Local competitors with worse On-page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" in Google.ca in French. Background info: In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links on poor sites and to use a single nofollow link on our best sites. Since this is on-going and we know we won't be able to remove everything, our link profile sucks (OSE). We have a lot of sites on our C-block, some of poor quality. We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox. We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business. Only a third of our organic visits come from Canada. What are our options? Change domain and delete the current one? Disallow the blog except for a few good articles, hoping it helps Google understand what we really do. Keep donating to AdWords? Any help greatly appreciated!
    Thanks!

    | AxialDev
    2

  • I work with a site which specialises in life insurance for people with pre-existing medical conditions - http://goo.gl/Drwre6. The site has ranked really well historically, but was hit hard on 16th June when we saw an almost 100% drop in rankings overnight. We picked up on quite a few issues straight away and rectified these. A list of the steps we've taken so far is below: removed CSS & JS files from robots.txt; changed hosting provider back, as it had recently been moved somewhere new; updated copy on main landing pages to remove small amounts that were duplicated; requested removal of some suspicious-looking backlinks and submitted a disavow; found and removed a test site that was live and indexable; found an external site that had scraped copy from our site and requested removal (this site is no longer live); cleaned up any 404s and ensured all redirects are working correctly; updated the diabetes page to include more valuable info, including linking out to authority sites. After taking all these steps, we have still seen no improvement. It could be that Google just hasn't yet re-crawled the site to take the changes into account...? We're aware of one other site in our industry that has noticed a drop in rankings in the last couple of months, but a number of our competitors are still ranking well for our target terms. We wonder if the site was caught up in the Payday Loans update, as the timings almost line up. Other sites with spammy medical content seem to have been hit, so we wonder if the "medical" type content on our site could have been penalised? Incredibly frustrating if so, as it's a valid, genuine service being offered! Really at a bit of a loss as to what to do next, so any help would be hugely appreciated! Katie

    | Digirank
    0

  • Hi guys I've just seen a website get a link from Google's Webmaster Snippet testing tool. Basically, they've linked to a results page for their own website test. Here's an example of what this would look like for a result on my website. http://www.google.com/webmasters/tools/richsnippets?q=https%3A%2F%2Fwww.impression.co.uk There's a meta nofollow, but I just wondered what everyone's take is on Trust, etc, passing down? (Don't worry, I'm not encouraging people to go out spamming links to results pages!) Looking forward to some interesting responses!

    | tomcraig86
    0

  • Good Morning! We currently have two websites which are driving all of our traffic. Our end goal is to combine the two and fold them into each other. Can I redirect the duplicate content from one domain to our main domain even though the URLs are different? I'll give an example below. (The domains are not the real domains.) The CEO does not want to remove the other website entirely yet, but is willing to begin some sort of consolidation process. ABCaddiction.com is the main domain which covers everything from drug addiction to dual diagnosis treatment. ABCdualdiagnosis.com is our secondary website which covers everything as well. Can I redirect the entire drug addiction half of the website to ABCaddiction.com? With the eventual goal of moving everything together.

    | HashtagHustler
    0

  • Greetings MOZ Community: My firm operates www.nyc-officespace-leader.com, a commercial real estate brokerage in New York City. Prior to the first Penguin update in April 2012, our home page used to receive about 10%, or 600, of total organic visits. After the first Penguin was launched by Google, organic traffic to the home page dropped to maybe 5% or 200 visits per month. Since May of this year, it appears we have been penalized by Penguin 4.0 and are attempting to recover. Now our home page only generates about 140 organic visits per month, or less than 4% of organic traffic. Our home page enjoyed a good conversion rate, so this drop in traffic is a real loss. Does this very low level of traffic to the home page indicate something abnormal? Dropping from 10% to less than 4% is a major decline. Should we take specific steps regarding the home page, like enhancing the content? Thanks, Alan

    | Kingalan1
    0

  • Hi there Mozzers, Running into a small issue. After a homepage redesign (from a list of blog posts to a product page), it seems that blog posts are buried on the http://OrangeOctop.us/ site. The latest write-up on "how to beat real madrid in FIFA 15", http://orangeoctop.us/against-real-madrid-fifa-15/ , has yet to be indexed. It would normally take about a day naturally for pages to be indexed or instantly with a manual submission. I have gone into webmaster tools and manually submitted the page for crawls multiple times on multiple devices. Still not showing up in the search results. Can anybody advise?

    | orangeoctop.us
    0

  • Does anyone have a good example of a photo gallery with an optional view all page implementation?  The only view all examples I can find are ecommerce pagination.

    | Aggie
    0

  • Our search traffic has been growing at a steady clip for the last year but is down about 30% this month. As part of a redesign, we've repurposed our home page (blog.getvero.com). Rather than serve as a feed of recent posts, it's now an email signup page. We created a new page (blog.getvero.com/posts/) to display new posts. I think this is likely the reason for the drop in search traffic but I'm frustrated that it's losing us thousands of visitors per month. A few questions: 1. How long will it take to recover from this? 2. Is there anything we can do to speed up the recovery process? 3. Why are some of our best performing posts seeing less search traffic even though the URL hasn't changed? Any help is greatly appreciated.

    | Nobody1611698302042
    0

  • Hi, Some months ago we created unique content for each of our product descriptions - basically, we removed the manufacturer's description and made our own unique content. But our content now feels stale, and I'm trying to work out how we can produce fresh content for the product pages and how much is needed for Google to notice the changes. My question is: what does Google class as freshness? Would a new photo count? I've always thought that it has to be mainly text content, but we do not get that many reviews as our products are not mainstream and it's a small market. So should we create extra bits of text content in the descriptions, and how much?

    | bjs2010
    0

  • What is a good text font for a health website - font size, line spacing, character spacing, etc.? Is there any research on this? What fonts are easiest on the eyes (i.e., on which fonts do users stay reading the longest)? I personally like the Apple website's text font.

    | MasonBaker
    0

  • Hi guys and girls, I have a client that has 4 very outdated websites with about 50 pages on each. They are made up like: 1 brand group and 3 for each individual key service they offer, so let's call them: brand.com (A) brand-service-1.com (B) brand-service-2.com (C) brand-service-3.com (D) We've rebuilt the main site and aggregated all the content from the others (99% re-written). Am I correct in thinking the process for the new launch would be: 1. Launch the new site on brand.com (A) and 301 all the old brand.com (A) pages to the related pages on the new site. 2. Redirect the other websites (B, C, D) on a domain level to the new site on the brand.com (A) domain. 3. Clean up the old URLs, sitemaps, and errors in Google WMT. Is this right? Anything I missed/better practices? I was also wondering if I should redirect B, C, D in stages, or use page-level redirects.

    | shloy23-294584
    0

  • Can I get your expert opinion? A few years ago, we customized our pages to repeat the page title at the bottom of the page. So the page title is in the breadcrumbs at the top, and then it's also at the bottom of the page under all the content. Here is a sample page: bit.ly/1pYyrUl I attached a screen shot and highlighted the second occurrence of the page title. I am worried that this might be keyword stuffing, or over-optimizing. Thoughts or advice on this? Thank you so much! ron

    | yatesandcojewelers
    0

  • I am creating a list of all things I need to do for switching a site over to HTTPs. I can find great instructions for Google, but nothing for Bing. If I do everything that Google requires, is the only thing I need to do for Bing is the site move?

    | EcommerceSite
    0

  • This is real estate website related. For every neighborhood I have a "condos" and "houses" page. In the breadcrumb structure I may have: "home > island condos > city condos > region condos > neighborhood condos". Questions: Some breadcrumb structures have 5-6 different breadcrumb links, and repeating the word "condos" in each link seems redundant. Would it be better just to list "island", "city", "region", "neighborhood" and never use the word "condos" or "houses" in the breadcrumbs? For users this would be better. If I implement what I suggest in 1) - deleting the "condos" or "houses" wording from breadcrumb links - then on a condos page the word "region" (as an example) will lead to the "region condos" page, whereas the exact same word "region" on a houses page will lead to the "region houses" page. This means I will have a situation where the anchor text in breadcrumbs becomes 100% identical for my "condos" and "houses" pages; however, they lead to different pages. Is this OK? I have in the past been told that when I use internal anchor text, the link should always lead to the same page. Having the same anchor leading to different pages would not be good... is that so? thank you

    | khi5
    0
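If the visible breadcrumb labels are shortened for users, BreadcrumbList structured data can still carry the full disambiguating names ("Region Condos" vs "Region Houses") for search engines. A sketch of generating that data, with hypothetical URLs:

```javascript
// Visible breadcrumb labels can stay short ("Region") while the
// structured data keeps the full, disambiguating name ("Region Condos").
function breadcrumbJsonLd(crumbs) {
  return {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: crumbs.map((c, i) => ({
      '@type': 'ListItem',
      position: i + 1,
      name: c.name,
      item: c.url,
    })),
  };
}
```

Because each ListItem's `item` URL is explicit, identical visible anchors pointing at different pages stay unambiguous to a crawler, which addresses part of the concern in question 2.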

  • I've got a fan website for a cult US TV show (that also sells merchandise as an affiliate). The site's design is about five years old, and I should probably revamp it at some point: http://www.btvsonline.com/ Anyway, the site's Cutline theme (http://cutline.tubetorial.com/) does not support native headers, and the header menu shows only a text list of my top-level pages. There are many subpages below these pages, but it's hard for users to find them without going to different pages to try to find things. What I'm looking for: if someone hovers above "Store" in the menu, a dropdown menu showing all the product pages would appear. I've tried to find Wordpress plugins or PHP hacks to add dropdown menus and I have gone through Cutline's support site, but I have had no luck. Anyone in the Moz community have any advice? Perhaps I just need to revamp the site with a new and modern Wordpress theme? Thanks in advance for any thoughts!

    | SamuelScott
    0

  • Alo everyone! The site I'm working on has had a homepage that essentially used the footer as the main form of navigation on the site and the PA of each of those pages reflects that. I'm helping them re-organize the site (I'm still a noob though), and was curious for some input on this particular situation. Some of the most authoritative pages are: 1. www.charged.fm/privacy - PA 29 2. www.charged.fm/terms - PA 29 My question: Is this just a consequence of previous mistakes that we live with, or is there something involving 301's and creation of new pages that could help us utilize the link juice on these pages. Or should we come up with ways to internally link to 'money' pages from these pages instead? Thanks for any input, Luke

    | keL.A.xT.o
    0

  • Good Morning! I have been trying to clean up this website, and half the time I can't even edit our content without breaking the WYSIWYG editor. Which leads me to my next question: how much, if at all, is this impacting our SEO? To my knowledge this isn't directly causing any broken pages for the viewer, but it still certainly concerns me. I found this post on Moz from last year: http://moz.com/community/q/how-much-impact-does-bad-html-coding-really-have-on-seo We have a slightly different set of code problems, but I still wanted to revisit this question and see if anything has changed. I also can't imagine that all this broken/extra code is helping our pages load properly. Thanks everybody!

    | HashtagHustler
    0

  • Hi, does anyone know any ways or tools to find when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date. Maybe this has changed since; I don't remember the plugin. Any recommendations on finding the age of a page (not domain) for a website? This is for competitor research, not my own website. Cheers, Paul

    | MBASydney
    0

  • Hi, sometimes I just delete a page and don't necessarily want to redirect it to another page. So Google Webmaster Tools shows me 108 'not found' pages under 'Crawl Errors'. Is that a problem for my site?
    Can I ignore this with a good conscience?
    Or shall I redirect these URLs to my homepage? I am confused and would like to hear your opinion on this. Best, Robin

    | soralsokal
    0

  • Hi, Just doing some analysis on a domain, and the (external) linking root domains show as:
    21 to Root Domain
    4 to Subdomain The site is hosted under the www. subdomain version and there is no 301 from domain to www.domain. Should the site be: hosted on the root domain instead of the subdomain, or should we 301 all incoming requests on the domain to point to www.domain (the subdomain)? Any comments and experience on this type of situation appreciated!

    | bjs2010
    0

  • Good Morning! Now, I'll admit, I may be obsessing a little too much over this, and it may not make that big of an impact in the long run, but in a world with Google, if I were to start a business today I would try to include my keyword in the name of the business. For example Dollar Shave Club - at least they got the word shave in there. My business doesn't have a keyword in our name, so is it beneficial to structure our URLs to include a keyword so that all of our URLs include that word? So if I sell organic bananas, but my company is called Evananas, is it worth it to have all URLs become a child of Evananas.com/organic_bananas? That way at least we have the keyword "Organic Bananas" in the path. So I could then have things like: evananas.com/organic_bananas/recipes evananas.com/organic_bananas/benefits evananas.com/organic_bananas/taste_really_freeking_good Vs. evananas.com/recipes evananas.com/benefits evananas.com/taste_really_freeking_good I'm not sure it makes a difference. The other problem is I want to keep our URLs as short as possible. I feel like less is always more, but I was always under the impression that domain/URL-based keywords were rather powerful. What is the best practice in this case? Thanks Guys! Evan(ana)

    | HashtagHustler
    0

  • Hi Guys, We have a good website with strong onsite and offsite SEO. A year ago, we had a 15-page website for all the main keywords we needed, and we were in the top 3 for most of these keywords in Google. We were happy, but we wanted more. So we created lots of unique content targeting long-tail keywords and created 100 more pages for the website. In the next 4-5 months we lost positions for almost all our main keywords but got lots of long-tail SERPs. Traffic grew, but the quality and the conversion rate shrank. Everybody keeps saying that it doesn't matter how many pages you have on the website as long as the content is unique, and I don't think it is true. I see lots of 3-5 page websites without any SEO in the top 3 results in Google. Does it mean that if I delete all these 100 pages that I created I will have more chances to get my main keyword SERPs back? Basically, does the SEO juice that you have on a domain spread across all pages, so that the more pages you have, the less juice every page will get?

    | vadimmarusin10
    0

  • I have been noticing lately that quite a few of my clients' sites are showing sitemap errors/warnings in Google Webmaster Tools, despite the fact that the issue with the sitemap (e.g. a URL that we have blocked in robots.txt) was fixed several months earlier. Google talks about resubmitting sitemaps here, where it says you can resubmit your sitemap when you have made changes to it; I just find it somewhat strange that the sitemap is not automatically re-scanned when Google crawls a website. Does anyone know if the sitemap is automatically re-scanned and only Webmaster Tools is not updated, or am I going to have to manually resubmit or ping Google with the sitemap each time a change is made? It would be interesting to know other people's experiences with this 🙂

    | Jamie.Stevens
    0
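On the mechanics of pinging, for what it's worth: Google has (at least historically) documented a ping endpoint that asks it to re-fetch a sitemap. A minimal sketch in Python, with a placeholder sitemap URL (`sitemap_ping_url` is just an illustrative helper name):

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the URL for Google's sitemap ping endpoint.

    Requesting this URL (e.g. with urllib.request or curl) asks Google
    to schedule a re-fetch of the sitemap at the given location.
    """
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

# Placeholder sitemap location:
print(sitemap_ping_url("https://www.example.com/sitemap.xml"))
```

Resubmitting via the Webmaster Tools UI does the same job; the ping endpoint just makes the request scriptable, e.g. from a deploy hook that fires whenever the sitemap changes.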

  • We run Magento and we're in the process of redesigning our site. We want the site to have separate storefronts for different countries, but we won't have the site's language translated initially. We're thinking we'll use the Magento multi-store feature and have stores like /fr, /de, /en-us, /en-au, etc. Is the best practice to use hreflang for the non-English stores which haven't yet been translated? For example, for French users, set them to essentially say: the page is aimed at French people, but is in English. The separate storefronts will have things like currency and tax localised to each country and will gradually be translated, especially the more generic stuff like "Add to Cart", "Checkout", etc. Or should hreflang be targeted at the French language and country, despite not everything being translated into French? Or is there a better way to do this?

    | seanmccauley
    0
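The annotation the question describes (an English-language page aimed at users in France) can be expressed with hreflang's language-region syntax, since the language code and the region code are set independently. A sketch with placeholder URLs:

```html
<!-- English-language page targeted at users in France -->
<link rel="alternate" hreflang="en-fr" href="https://www.example.com/fr/" />
<!-- Once the store is translated, this would become hreflang="fr-fr" -->

<!-- Fallback for users not matched by any other annotation -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

hreflang values are an ISO 639-1 language code, optionally followed by an ISO 3166-1 Alpha-2 region code, which is exactly what allows "English content, French audience" as an interim state.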

  • Hey everyone, I know there is literature about this, but I'm always frustrated by technical questions and prefer a direct answer or opinion. Right now, we've got rel=canonicals set up to deal with parameters caused by filters on our ticketing site. An example is that this: http://www.charged.fm/billy-joel-tickets?location=il&time=day rel=canonicals to... http://www.charged.fm/billy-joel-tickets My question is whether this is good enough to deal with the duplicate content, or if these pages should be de-indexed. If so, is the best way to do this via robots.txt, or do you have to individually 'noindex' these pages? This site has 650k indexed pages and I'm thinking that the majority of these are caused by URL parameters; while they're all canonicalized to the proper place, I'm thinking it would be best to have them de-indexed to clean things up a bit. Thanks for any input.

    | keL.A.xT.o
    0
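One caveat worth flagging before reaching for robots.txt: a URL blocked in robots.txt can't be crawled, so Google never sees the rel=canonical (or a noindex) placed on it. If de-indexing is the goal while keeping the pages crawlable, a meta robots noindex is the usual alternative. A sketch of the two tags in question, using the filtered URL from the example above:

```html
<!-- On a filtered URL such as /billy-joel-tickets?location=il&time=day -->

<!-- Option A: keep consolidating signals to the clean URL (current setup) -->
<link rel="canonical" href="http://www.charged.fm/billy-joel-tickets" />

<!-- Option B: drop the page from the index instead; "follow" keeps
     its links crawlable so equity can still flow -->
<meta name="robots" content="noindex, follow" />
```

The two are alternatives per URL rather than a combination: a canonical asks Google to consolidate, while noindex asks it to forget the page entirely.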

  • I was wondering what the prime factors are for getting a video to rank on Google. Does anyone have any suggestions? I think length may be important, but I don't know what the ideal run time is. Hypothetically, for local SEO, would I be better off using a tag like "Mercedes Buffalo NY" or individual tags of "Mercedes" and "Buffalo"? Thanks!

    | oomdomarketing
    0

  • Why are some of my keywords ranking with specific product URLs instead of the more general/targeted landing page? For example, on my ecommerce website, the keyword 'tempurpedic' is directing to the URL of a specific Tempurpedic product page instead of the general landing page. The product page has a page authority of 15, while the Tempurpedic landing page with all the products has an authority of 31. I have also noticed that my 'furniture stores in houston' keyword directs to my "occasional tables" URL instead of the much more targeted homepage. Is there something I am missing here?

    | nat88han
    0

  • Hi All -- I'm working with a publishing client who is launching a new site. They have a large product catalogue offered in a number of format types (print, ebook, online learning, packages), with each one possessing a unique ISBN code. From past experience, I know that ISBN codes can be a really important ranking factor. We are currently trying to sort out product page guidelines. The proposed methods are: A single product page for all formats. The user then has the option to select which format they wish to purchase. The page would contain all key descriptors for each format, including: individual ISBN, format, title, price, author, etc. We would then use schema mark-up to assist search engines with understanding and crawling. BUT we worry that the single page won't rank as well as, say, an individual product page with a unique ISBN in the URL (for example: http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470573325.html). Which leads to the next option... Individual URLs for each format. We understand that most e-commerce guidelines state you shouldn't dilute link equity amongst multiple pages with very similar products and descriptions. BUT we want searchers to be able to search by individual ISBN and still find that specific format within the SERPs. This seems to rule out canonicalizing, because we don't prefer one format over the other and still want, say, the ebook to show up as much as the print version. If anyone has any other options or considerations we haven't thought about, it would be greatly appreciated. Thanks, U

    | HarborOneBank
    0
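If the single-page route is chosen, the schema mark-up mentioned above could describe each format as its own edition with its own ISBN, e.g. via schema.org's Book/workExample pattern. A sketch in JSON-LD; all titles, names, and ISBNs below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Book",
  "name": "Example Title",
  "author": {"@type": "Person", "name": "Example Author"},
  "workExample": [
    {"@type": "Book", "isbn": "9780000000001", "bookFormat": "https://schema.org/Hardcover"},
    {"@type": "Book", "isbn": "9780000000002", "bookFormat": "https://schema.org/EBook"}
  ]
}
</script>
```

This lets a single URL carry every format's ISBN in machine-readable form, which partly addresses the worry about ISBN searches, though it doesn't guarantee the single page will rank for each ISBN the way dedicated URLs might.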

  • Hey Moz, I am working with a client on more advanced SEO tactics. This client has a reputable domain authority of 67 and 50,000+ backlinks. We're wanting to continue SEO efforts and stay on top of any bad backlinks that may arise. Would it be worth asking websites (below 20 domain authority) to remove our links? Then, use the disavow tool if they do not respond. Is this a common SEO practice for continued advanced efforts? Also, what would your domain authority benchmark be? I used 20 just as an example. Thanks so much for your help. Cole

    | ColeLusby
    1
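For reference, if it does come to disavowing, the file Google's disavow tool expects is plain text with one entry per line: a `domain:` prefix disavows an entire site, a bare URL disavows just that page, and lines starting with `#` are comments. A sketch with placeholder domains:

```text
# Low-authority sites that ignored removal requests (placeholder domains)
domain:spammy-links-example.com
domain:thin-directory-example.net
# Individual URLs can also be listed:
http://another-example.org/old-links-page.html
```

Worth noting that domain authority alone is a rough filter; a low-DA site isn't necessarily toxic, so most people reserve disavowal for links that look manipulative rather than merely weak.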


