
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I have been doing general on-page optimization, but I still have many F's because SEOMoz considers the pages weak for keywords that aren't relevant anyway. Is there a way to tease out keywords for specific pages so I can get a more accurate report card?

    | Ocularis
    1

  • Hi, This is a slightly odd one I was hoping someone could shed some light on. One of our staff just did a Google search and located these listings on Google UK Product Search: http://www.google.co.uk/search?q=ink+cartridges&hl=en&sa=X&biw=1074&bih=499&tbm=shop&prmd=imvns#q=ink+cartridges&hl=en&sa=X&tbs=store:3287803270081455254&tbm=shop&prmd=imvns&ei=xp5pUP6uN8i_0QXUuoHADQ&ved=0CI0BEMcMMAE&bav=on.2,or.r_gc.r_pw.r_qf.&fp=333b49ec245f6031&biw=1074&bih=499 Do you happen to have any idea where Google is getting this regionalised data from and, in particular, the incorrect pricing? We have a Google (UK) Product Feed, however the prices given are different from those being displayed in this localised search. Additionally, the product feed that we supply relates to our main website and not a specific store. If you click through to compare prices from multiple merchants you'll see our prices being listed correctly under our company name and website rather than the incorrect pricing attributed to a specific store. I have checked our Google Places account and our Google Product Feed account but I just can't figure out where this data and incorrect pricing is coming from, and indeed why it only affects our physical stores and not the more generalised website pricing. If someone could point me in the right direction so I can get this corrected I'd appreciate it! Many thanks Chris

    | ChrisHolgate
    0

  • Hi, In our e-commerce site, on category pages we have pagination (e.g. Toshiba laptops page 1, page 2, etc.). We implement it with rel='next' and rel='prev'. On the first page of each category we display a header with lots of text information. This header is removed on the following pages using display:none. I wondered whether, since this is only a CSS display trick, Google might still read it and consider it duplicate content (see the markup sketch below). Thanks

    | BeytzNet
    0
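
    For reference, a minimal sketch of the rel="next"/rel="prev" pattern described in the question above, using hypothetical URLs; one commonly suggested variant is to omit the intro header from the HTML of later pages server-side rather than hiding it with CSS:

        <!-- <head> of a hypothetical page 2, e.g. /laptops/toshiba?page=2 -->
        <link rel="prev" href="http://www.example.com/laptops/toshiba?page=1">
        <link rel="next" href="http://www.example.com/laptops/toshiba?page=3">
        <!-- The intro text block is only rendered into page 1; pages 2+ simply
             omit it instead of wrapping it in display:none. -->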

  • We are running a marketplace site, so we have thousands of vendors selling their products on our site. Each vendor has a Profile page and we are soon to launch a premium store-front that is white label. Many of these vendors will want to point a custom URL to their premium store-front (which is a subdomain of the marketplace) and we are trying to get an understanding of how we should instruct them to point their URL in a way that will give the main marketplace site the SEO juice. We also want to understand what will show up in the address bar. Will it be their URL or our subdomain? Will any of the marketplace SEO juice boost their URL's local listing status?

    | bloomnation
    0

  • We have about 18,000 pages submitted in our Google sitemap and only about 9,000 of them are indexed. Is this a problem? We have a script that creates the sitemap and submits it on a daily basis. Am I better off only doing it once a week? Is this why I never get to the full 18,000 indexed?

    | EcommerceSite
    0

  • I have read that you can now have multiple h1 tags on a page without it negatively impacting SEO. Previously it was advised to only have one h1 tag on a page. Example: with the new semantic markup you could have separate h1 tags for the header, article, aside and footer (sketch below). Is this really the case?

    | bronxpad
    0
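
    For reference, a minimal sketch of the HTML5 sectioning pattern the question refers to, with one h1 per sectioning element (all content is placeholder):

        <body>
          <header>
            <h1>Site Name</h1>
          </header>
          <article>
            <h1>Article Title</h1>
            <p>Article body...</p>
          </article>
          <aside>
            <h1>Related Links</h1>
          </aside>
          <footer>
            <h1>About This Site</h1>
          </footer>
        </body>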

  • I have a piece of content (that is similar) that legitimately shows up on two different sites. I would like both to link, but it seems as if they are "flip flopping" in ranking. Sometimes one shows up, sometimes the other. What's the best way to differentiate a piece of content like this? Does it mean rewriting one entirely? http://www.simplifiedbuilding.com/solutions/ada-handrail/ http://simplifiedsafety.com/solutions/ada-handrail/ If I had a preference, I would want the Simplified Building one to be found first.

    | CPollock
    0

  • About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects. We were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301 (see the redirect sketch below). For a few weeks the duplicates slowly went down, and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has been permanently moved? Why is it not picking this up? I knew it would take some time, but I am right where I started after a month.

    | EcommerceSite
    0
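
    For reference, a sketch of how the two redirect types differ in an Apache .htaccess file (this assumes an Apache server; the paths and domain are hypothetical):

        # 302: temporary - the old URL is expected to come back
        Redirect 302 /old-page http://www.example.com/new-page

        # 301: permanent - the content has moved for good
        Redirect 301 /old-page http://www.example.com/new-page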

  • Does anyone know why Google is displaying my page's path instead of the URL in the search results? I discovered this while searching for one of my keywords; I then copied the link http://www.smarttouch.me/services-saudi/web-services/web-design and found all related results are the same. Could anyone tell me why that is, and does it really make a difference? Or is the URL display more important than the path display for SEO?

    | ali881
    0

  • I've read somewhere that if you list too many links/articles on one page, Google doesn't crawl all of them. In fact, Google will only crawl up to 100 links/articles or so. Is that true? If so, how do I go about creating a page or blog that will be SEO friendly and capable of being completely crawled by Google?

    | greenfoxone
    0

  • Hi, I have a client called 'Ivana Daniell'. She is a Pilates/Movement Therapy/Postural Assessment practitioner based in London. She has been based in the UK since 2011. Prior to that her studio was in Singapore. Her website URL is: http://www.ivanadaniell.com If you Google her on the UK engine (www.google.co.uk) using the term 'London Pilates' she comes up in the top four, but her organic listing appears with a small tag which reads "- singapore". I have attached an image of how it appears. This makes many searchers overlook her, believing that she is based overseas. We host her website here in the UK and have removed any reference to Singapore from the website. We have even put an hCard on her site to indicate that she is London based. The client believes that this might be the result of an old Google Places account of hers, from her time in Singapore. However I have not been able to find any such listing by searching for it in Google and, because it was allegedly set up by her former marketing manager, she does not have the username or password for the account. She has lost touch with the marketing manager and has no way to get the login details. To reiterate, however, I have seen no proof to suggest that this listing even exists. So, the questions! 1. What could be causing the word 'singapore' to appear next to the organic listing? 2. If it is the result of an inaccurate Google Places listing, how do I delete this listing without either the username or password? 3. If this is not what is causing 'singapore' to appear by the organic listing, how do I get rid of the 'singapore' word? Thanks very much and Happy New Year, mozzers! Edward

    | GoUp
    0

  • I have a map listing showing for the keyword "junk cars for cash nj". I recently created a new Google+ page and requested a merge between the Places listing and the + page. Now when you do a search you see the following: Junk Cars For Cash NJ LLC
    junkcarforcashnj.com/
    Google+ page - Google+ page
    The first hyperlink takes me to the About page of the G+ profile and the second link takes me to the Posts section within G+. Is this normal? Should I delete the Places account where the listing was originally created, or do I leave it as is? Thanks

    | junkcars
    0

  • Brief question - SEOMoz is telling me that I have duplicate content on the following two pages: http://www.passportsandvisas.com/visas/ and http://www.passportsandvisas.com/visas/index.asp The default page for the /visas/ directory is index.asp - so it is effectively the same page - but apparently SEOMoz, and more importantly Google etc., treat these as two different pages. I read about 301 redirects etc., but in this case there aren't two physical HTML pages - so how do I fix this? (See the sketch below.)

    | santiago23
    0
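
    For reference, one way this is commonly handled without two physical pages is a canonical tag in the <head> of the template that serves both URLs (a sketch, assuming the directory URL is the preferred version):

        <link rel="canonical" href="http://www.passportsandvisas.com/visas/">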

  • If there aren't any crawling / indexing issues with your site, how important do you think sitemap errors are? Do you work to always fix all errors? I know here: http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds Duane Forrester mentions that sites with many 302s/301s will be punished - does anyone know Google's take on this?

    | nicole.healthline
    0

  • Hi, First off I'm new here, so hello to everyone. Now to the reason why I have joined. I am currently trying to rank for 2 terms: UK Bank Holidays 2013 (Term 1) and Bank Holidays 2013 (Term 2). The page I'm trying to rank these terms on is: http://www.followuk.co.uk/bank-holidays Now some background history: On the 29th Dec 2013, term 1 was 5th and term 2 was 7th - rankings achieved through guest blogging. Last night I changed the h1 tag from 'Bank Holidays 2013' to 'UK Bank Holidays 2013', re-worded the meta description to try and increase the CTR, and removed the term 'Bank Holiday' from the end of each sub-heading - e.g. 'New Year's Day Bank Holiday' to 'New Year's Day' - I did this because I felt it was too much, so in total the 'Bank Holiday' term was removed from 5 sub-headings. OK, so I went into WMT and resubmitted for indexing, and overnight the page got reindexed - the term 'UK Bank Holidays 2013' stayed at the same position (5) BUT the 'Bank Holidays 2013' term dropped into hell at roughly position 250. I'm thinking of changing everything back and crossing my fingers that the term which dropped comes back, BUT maybe I'm being too rash and it might jump back as the page stands. I did a grade test using SEOMoz and both terms generate a grade of 'A'. Has anyone got any ideas? Sorry if the thread is a bit messy; I'm currently crying all over the keyboard as I'm typing. Thanks

    | followuk
    0

  • Let's say I have a website in a not-too-competitive niche. I was considering buying a few aged domains from GoDaddy auctions and 301 redirecting them to my new domain. Can this alone be enough to rank pretty high in an uncompetitive niche? Can this also be a link building technique in itself, since the link juice from the purchased domains carries over? Thanks

    | junkcars
    0

  • I heard and read from different sources that 301 redirecting aged domains with healthy link profiles is great for boosting a site's rank, as opposed to building a site around the page and linking it to the domain you want to rank. What is the best practice for this strategy? Thanks

    | junkcars
    0

  • In its webmaster guidelines, Google says not to index search results "that don't add much value for users coming from search engines." I've noticed several big brands index search results, and am wondering if it is generally OK to index search results with high engagement metrics (high PVPV, time on site, etc.). We have a database of content, and it seems one of the best ways to get this content into search engines would be to allow indexing of search results (to capture the long tail) rather than build thousands of static URLs. Have any smaller brands had success with allowing indexing of search results? Any best practices or recommendations?

    | nicole.healthline
    0

  • We are considering implementing a site-wide contextual linking structure. Does anyone have some good guidelines / blog posts on this topic? Our site is quite large (over 1 million pages), so the contextual linking would be automated, but we need to define a set of rules. Basically, if we have a great page on 'healthy recipes,' should we make every instance of the phrase 'healthy recipes' link back to that page, or should we limit it to a certain number of pages?

    | nicole.healthline
    0

  • Hello All, I'm looking to perform a 'standard' guest blog post link building tactic, but I'm a little unsure where to start. Does anybody have a list/guide of websites that accept guest posts? Preferably ones that are useful for SEO purposes. I have been link building for about 3 months now, but to be honest, most of these links are NoFollow, which isn't too great! Paul

    | Paul_Tovey
    0

  • Hi, I will try to summarize my query through an example. Let's say site A (www.siteA.com) has two subdomains (subdomain1.siteA.com & subdomain2.siteA.com) and another site B (www.siteB.com) has no subdomains. For some obvious reasons we need to redirect site A (www.siteA.com) to site B (www.siteB.com) and one of the subdomains (subdomain1.siteA.com) to site B (subdomain1.siteB.com). Now the question is: in the case of subdomain2.siteA.com, can we keep that subdomain on site A even though site A has been redirected to site B? Reasons for keeping this can be traffic, earnings etc. Is it possible to keep it like that, or is there provision for further optimization? Please help.

    | ITRIX
    0

  • A client site has thousands of pages with unoptimized URLs. I want to change the URL structure to make them a little more search friendly. Many of the pages I want to update have backlinks to them and good PR, so I don't want to delete them entirely. If I change the URLs on thousands of pages, that means a lot of 301 redirects. Will thousands of redirected pages have a negative impact on the site? Thanks, Dino

    | Dino64
    1

  • Hi All, To clarify my question I will give an example. Let's assume that I have a laptop e-commerce site and that one of my main categories is Samsung Laptops. The category page shows lots of laptops and a small section of text. On the other hand, in my article section I have a HUGE article about Samsung laptops. If we consider the phrase each of the two pages is targeting, the answer is the same - Samsung Laptops. In the article I point to the category page using anchors such as "buy samsung laptops" or "samsung laptops", and on the category page (my wishful landing page) I point to the article with "learn about samsung laptops" or "samsung laptops pros and cons". Thanks

    | BeytzNet
    0

  • Click this Google query: https://www.google.com/search?q=les+paul+studio Notice how Google has a rich snippet for eBay saying that it has 229 results for eBay's internal search result page: http://screencast.com/t/SLpopIvhl69z Notice how Sam Ash's internal search result page also ranks on page 1 of Google. I've always followed the best practice of setting internal search result pages to "noindex" (sketch below). Previously, our company's many Magento eCommerce stores had the internal search result pages set to "index," and Google indexed over 20,000 internal search result URLs for every single site. I advised that we change these to "noindex," and impressions from Search Queries (reported in Google Webmaster Tools) shot up on 7/24 with the Panda update on that date. Traffic didn't necessarily shoot up... but it appeared that Google liked that we got rid of all this thin/duplicate content and ranked us more (deeper than page 1, however). Even Dr. Pete advises no-indexing internal search results here: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world So, why is Google rewarding eBay and Sam Ash with page 1 rankings for their internal search result pages? Is it their domain authority that lets them get away with it? Could it be that noindexing internal search result pages is NOT best practice? Is the game different for eCommerce sites? Very curious what my fellow professionals think. Thanks,
    Dan

    | M_D_Golden_Peak
    0
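
    For reference, the "noindex" setting discussed in the question is typically a meta robots tag in the <head> of each internal search result page (a minimal sketch):

        <meta name="robots" content="noindex, follow">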

  • Hey all... I'm designing the structure for a website that has 53 pages. Can you take a look at the attached diagram and see if the website structure is OK? On the attached diagram I have numbered the pages from 1 to 53, with 1 being the most important home page - 2,3,4,5 being the next 4 important pages - 6,7,8... 15,16,17 being the 3rd set of important pages, and 18,19,20... 51,52,53 being the last set of pages, which are the easiest to rank. I have two questions: Is the website structure for this correct? I have made sure that all pages on the website are reachable. Considering the home page and pages 2,3,4,5 are the most important pages, I am linking out to these pages from the last set of pages (18,19,20... 51,52,53). There are 36 pages in the last set - and out of these 36, from 24 of them I am linking back to the home page and pages 2,3,4,5. The remaining 8 pages of the 36 will link back to pages 6,7,8... 15,16,17. In total the most important pages will have the following number of internal incoming links: Home Page: 25; Pages 2,3,4,5: 25; Pages 6,7,8... 15,16,17: 4; Pages 18,19,20... 51,52,53: 1. Is this OK considering the home page and pages 2,3,4,5 are the most important? Or do you think I should divide and give more internal links to the other pages also? If you can share any inputs or suggestions on how I can improve this it will greatly help me. Also, if you know any good guides to internal linking for websites greater than 50 pages please share them in the answers. Thank you all! Regards, P.S - The URL for the image is at http://imgur.com/XqaK4

    | arjun.rajkumar81
    0

  • I have a problem with duplicate content and titles. I have tried many ways to resolve them, but because of the site's code I am still stuck, so I have decided to use robots.txt to block the content that is duplicated. The first question: how do I use a command in robots.txt to block all URLs like these:
    http://vietnamfoodtour.com/foodcourses/Cooking-School/
    http://vietnamfoodtour.com/foodcourses/Cooking-Class/
    ....... User-agent: * Disallow: /foodcourses (Is that right?) And the parameter URLs:
    http://vietnamfoodtour.com/?mod=vietnamfood&page=2
    http://vietnamfoodtour.com/?mod=vietnamfood&page=3
    http://vietnamfoodtour.com/?mod=vietnamfood&page=4
    User-agent: * Disallow: /?mod=vietnamfood (Is that right? I have a folder containing the module; could I use Disallow: /module/*?) (See the sketch below.) The 2nd question is: which takes priority, robots.txt or the meta robots tag? If I use robots.txt to block a URL, but in that URL my meta robots tag is "index, follow"?

    | magician
    0
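
    For reference, a sketch of robots.txt rules matching the patterns in the question; Disallow matches URL-path prefixes, and note that a URL blocked by robots.txt is generally not crawled at all, so any meta robots tag on that page will not be seen:

        User-agent: *
        # Blocks every URL whose path starts with /foodcourses
        Disallow: /foodcourses
        # Blocks parameter URLs such as /?mod=vietnamfood&page=2
        Disallow: /?mod=vietnamfood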

  • Hi, This is my website: http://goo.gl/fl5a5 I am competing for a highly competitive keyword: hemorrhoids treatment. But I started my link building slowly. I started with social engagement first, getting about 500 Facebook likes and about 700 retweets in a month. Later I went on to building links: I personally wrote 2 Squidoo lenses, and wrote about 4 articles for popular article sites like Ezine, Amazines, Apsense etc. and got approved. Then I wrote a press release and submitted it to the PRWeb, SBWire and PressDoc websites, so I got 3 press releases. I also submitted my site to 2 web directories, botw.org and the V7N directory, and got approved in them. Later I submitted my site to 10 social bookmarking sites manually. That's it in one month. In the meantime, I used to rank for the term "hemorrhoids treatment" on the 7th page. Now I am not even in the top 50 pages - completely out of the index for that term in search. But I am still ranking for other keywords. My site has 100% unique content, I built my links with extreme care and got quality links only, and I have varied my anchor text as much as possible. Still I don't understand why I am not even in the top 50 pages. Have I been hit by Google's sandbox? If so, could someone help me get out of the sandbox? Will be waiting for your answers.

    | Vegit
    0

  • Hey. I have a client who sells electric cigarettes here in Denmark, and I have chosen the right keywords for the client, most with a Keyword Difficulty of 35-48%. Do you have any advice for how I get his keyword "blah" into the top 10? Give me some tips. He has just created a Google+ page and is very active on Facebook, but nothing has really happened in 2 months.

    | Agger
    0

  • There was recently a huge increase in 404 errors in Yandex Webmaster corresponding with a drop in rankings. Most of the pages seem to be from my blog (which was updated around the same time). When I click on the links from Yandex the page looks like it is loading normally, except that it has the following messages from the Facebook plugin I am using for commenting (Open Graph sketch below):
    Critical Errors That Must Be Fixed | Bad Response Code: | URL returned a bad HTTP response code. |
    Open Graph Warnings That Should Be Fixed | Inferred Property: | The 'og:url' property should be explicitly provided, even if a value can be inferred from other tags. |
    | Inferred Property: | The 'og:title' property should be explicitly provided, even if a value can be inferred from other tags. |
    | Small og:image: | All the images referenced by og:image should be at least 200px in both dimensions. Please check all the images with tag og:image in the given url and ensure that it meets the recommended specification. |
    Any ideas about what the problem is or how to fix it?

    | theLotter
    0
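
    For reference, a sketch of explicitly declared Open Graph tags of the kind the warnings above refer to (all values are placeholders):

        <meta property="og:url" content="http://www.example.com/blog/sample-post/">
        <meta property="og:title" content="Sample Post Title">
        <!-- og:image should be at least 200px in both dimensions -->
        <meta property="og:image" content="http://www.example.com/images/sample-post.jpg">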

  • Does a change of web hosting have any effect on a website's SEO? And what factors need to be taken care of when changing web hosting company, from an SEO perspective?

    | HiteshBharucha
    0

  • Hey. I would like to hear whether it is possible to do SEO for a website built on a Flash-based CMS.

    | Agger
    0

  • I run a site that is in Google News and built on WordPress. I have seen this type of post on a few sites: http://searchengineland.com/searchcap-the-day-in-search-december-21-2012-143315 It's basically a summary of posts for that day or week etc. I am wondering, does anyone know of an easy way to create these types of pages? At the moment we simply copy and paste and create all links manually. Is there a plugin that allows you to create a post like this but quicker? Any ideas at all would be great. Thanks in advance

    | JohnPeters
    0

  • Hi, While looking to buy a Christmas gift for my wife I was searching for yellow diamonds. Being a bit familiar with SEO, I need to understand how the following page got ranked 4th for "yellow diamonds": http://www.bluenile.com/diamonds/fancy-color-diamonds The phrase "yellow diamonds" is not mentioned even once! Thanks

    | BeytzNet
    0

  • Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions. A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed the URLs in that sitemap. 1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERP once the page gets indexed? 2. Does Google give more preference to websites with more pages than to those with fewer pages when displaying results in the SERP (I have just 50 pages)? Does the NUMBER of pages really matter? 3. Does removal / change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.) An answer from SEO experts will be highly appreciated. Thanks!

    | PepMozBot
    0

  • Hi Guys! I have two websites which link to each other but are on the same server. Both sites have great PR and link juice. I want to know what steps I should take in order to make Google feel that both sites are not owned by me. For example, should I get different IPs and different servers for both, or something more? Looking forward to your thoughts and help!

    | HiteshBharucha
    0

  • Law firm has a main brand site (lawfirmname.com) with lots of content focusing on personal injury related areas of law.  They also do other unrelated areas of law such as bankruptcy and divorce.  They have a separate website for bankruptcy and a separate one for divorce.  These websites have good quality content, a backlinking campaign, and are fairly large websites, with landing pages for different cities. They also have created local microsites in the areas of bankruptcy and divorce that target specific smaller cities that the main bankruptcy site and divorce site do not target well.  These microsites have a good deal of original content and the content is mostly specific to the city the website is about, and virtually no backlinks. There are about 15 microsites for cities in bankruptcy and 10 in divorce and they rank pretty well for these city specific local searches. None of these sites are linked at all, and all 28 of the sites are under the same hosting account (all are subdomains of root domain of hosting account).  Question, should I link these sites together at all and if so how?  I considered making a simple and general page on the lawfirmname.com personal injury site for bankruptcy and divorce (lawfirmname.com/bankruptcy and lawfirmname.com/divorce) and then saying on the page something to the effect of "for more information on bankruptcy go to our main bankruptcy site at ....." and putting the link to the main bankruptcy site.  Same for divorce.  This way users can go to lawfirmname.com site and find Other Practice Areas, go to bankruptcy page, and link to main bankruptcy site.  Is this the best way to link to these two main sites for bankruptcy and divorce or should I be linking upward? Secondly, should I link the city specific microsites to any of the other sites or leave them completely separate?  Thirdly, should all of these sites be hosted on the same account or is this something that should be changed?  I was considering not linking the city specific sites at all, but if I did this I didn't know if I should create different hosting accounts for them (which could be expensive).  The sites work well in themselves without being linked, but wanted to try to network them in some way if possible without getting penalized or causing any issues with the search engines.  Any help would be appreciated on how to network and host all of these websites.

    | broca77711
    0

  • I recently moved my site (www.leatherhidestore.com/servlet/StoreFront) off of the ProStores platform because I could never get Google to show my homepage in SERP results - it instead always selected random product pages to rank. However, I never had this problem with Yahoo and Bing, as they always defaulted to the homepage except when a category was a better match. Fast forward and I have just launched the site (www.leatherhidestore) on Magento Community and I STILL CANNOT GET GOOGLE TO USE MY HOMEPAGE FOR SERP RESULTS, although I'm getting okay SERPs for random pages..... ERRRRRR! Of course, as if to rub salt in the wound, Yahoo and Bing are behaving just perfectly. Still, I must think that if Google would recognize my homepage (where the PR is and where the backlinks point) I would be doing 10x better. I am showing duplicate page content and title problems which the developer is trying to solve, but I do not know if this will fix the homepage Google issue. I feel like I must be in some sort of canonicalization death spiral. Has anybody dealt with this issue before and will mercifully share what I should do to fix it... please! Hunter

    | leatherhidestore
    0

  • This might be a bit of a complex question as it has many implications, and I'd really appreciate some expert advice. I have a client who has one specific .gov link which is absolutely amazing. It doesn't have any anchor text and goes straight to the homepage. However it drives quite a lot of revenue! It's lovely as SEOs to think of links actually driving revenue rather than assisting rankings, of course. Over the last 30 days this single govt. link has driven: 418 clicks, 61 conversions, $7,500 revenue. Not bad, I hear you say! I really want to extract maximum value from this, and I'm sure it would perform far better if it went to a really tightly focused landing page rather than the homepage. It really is hyper-targeted buyer traffic (as the stats show) but could perform even better in my opinion. I can't change the link; that's not possible unfortunately. However I can do some server-side stuff to redirect traffic from just this link to a desired new page. What are the SEO implications of this in the opinion of some of the experts here? Obviously the link itself is valuable from an SEO perspective and I don't want to lose that. I'm not sure how Google would treat the link if this were to be done. Also I really want to A/B test the homepage versus a landing page to ensure that it really does give an improvement. I'm not even sure how to achieve that given the difficulty in this situation. Any advice would really be useful! Thanks

    | HarveyP
    0

  • Hi, I'm trying to promote an e-commerce site that sells vitamins and health goods. The site owner doesn't want to add text to the product pages because it is medical material; therefore he currently has non-unique (duplicated) content on each product page. It is the exact same content all the others have (taken from the manufacturer). Any ideas? Thanks

    | BeytzNet
    0

  • We've had a lot of success using Raven Tools, as well as some other tools for SERP Rankings for our clients; however, most only go down to the country level. We're researching into some good hyper local trackers (down to the city/zip level). Does anyone have any suggestions?

    | BlastAM
    0

  • Hi, This is my website: http://goo.gl/fl5a5 Earlier I used to be on the 1st and 3rd pages of results for most of my keywords, but after 21st Dec my results went to the 10th and 13th pages of results. Is it due to the latest Panda update? http://www.seroundtable.com/google-update-maybe-16121.html If so, can you guys examine my website and provide me with your suggestions please? PS: I have followed only genuine kinds of link building, and my content is 100% unique. Will be waiting for your replies.

    | Vegitt
    0

  • Hello all Moz fans. I want to focus on getting clients locally among small to medium businesses, and my ethos and vision is to help them compete with the big guys in their niche. Can this really be done with their small budgets, and if so, how would you go about approaching it?

    | ReSEOlve
    0

  • I am using WordPress, am targeting a specific keyword, and am using Yoast SEO if that question comes up. I am at 100% as far as what they recommend for on-page optimization. The target HTML page is a "Post" and not a "Page", using WordPress definitions. Also, I am using this Pinterest-style theme here http://pinclone.net/demo/ - which makes the post a sort of "pop-up" - but I started with a different theme and the results below were always the case, so I don't know if that is a factor or not. (I promise... this is not a clever spammy attempt to promote their theme - in fact parts of it don't even work for me yet so I would not recommend it just yet...) I DO show up on the first page for my keyword... however, instead of Google showing the page www.mywebsite.com/this-is-my-targeted-keyword-page.htm, Google shows www.mywebsite.com in the results instead. The problem being - if the traffic goes only to my home page, visitors will be less likely to stay if they don't find what they want immediately and have to search for it. Any suggestions would be appreciated!

    | chunkyvittles
    0

  • Hi Folks, Over the last 10 months we have focused on quality pages but have been frustrated with competitor websites outranking us because they have bigger sites. Should we focus on the long tail again? One option for us is to take every town across the UK and create pages using our activities, e.g. Stirling:
    Stirling paintball
    Stirling Go Karting
    Stirling Clay shooting
    We are not going to link to these pages directly from our main menus but from the site map. These pages would then show activities that were within a 50-mile radius of the towns. At the moment we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, focusing all the internal link juice on these regional pages, but we don't rank highly for towns that the activity sites are close to. With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? This is my main worry - or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail like we used to? Is there a limit to how big a site should be?

    | PottyScotty
    0

  • Hi there, I am SEO-managing a travel website where we are going to put a new site structure in place next year. We have about 4,000 pages on the site at the moment. The structure is only 2 levels at the moment: Level 1: Homepage. Level 2: All other pages (4,000 individual pages, all with different URLs). We are adding another 2-3 levels, but we have a challenge: we have potentially 2 roads to the same product (e.g. a "Phuket diving product"): domain.com/thailand/activities/diving/phuket-diving-product.asp domain.com/activities/diving/thailand/phuket-diving-product.asp I would very much appreciate your view on the problem: how do I solve this dilemma/challenge from an SEO standpoint (sketch below)? I want to avoid duplicate content if possible, and I also only want one landing page - for many reasons. And usability is of course also very important. Best regards, Chris

    | sembseo
    0
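
    For reference, one commonly discussed option for a two-paths-to-one-product situation is a canonical tag on the secondary URL pointing at the preferred landing page (a sketch using the URLs from the question, with the /thailand/-first path assumed to be the preferred one):

        <!-- In the <head> of domain.com/activities/diving/thailand/phuket-diving-product.asp -->
        <link rel="canonical" href="http://domain.com/thailand/activities/diving/phuket-diving-product.asp">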

  • Hi, I've seen a new article by Dr. Pete on diversifying links for 2013 (http://www.seomoz.org/blog/top-1-seo-tips-for-2013), and my question is this: Dr. Pete talks about mixing up the anchor text for links - is that so we don't get caught out by Google, or does actually mixing it have a better impact? For example: 1. 20 anchor text links targeting just the target term. 2. 20 anchor text links targeting 4 variations of the target term. Is number 2 recommended just so things look natural, or does it actually have a better impact on SEO? Thanks

    | activitysuper
    0

  • A client of mine, "Ross X Bute", has the current meta title "Luxury designer clothing | Womens designer clothing" for the homepage. If I search for "luxury designer clothing" it will show the full meta title for the homepage; however, if I search for the brand name, "Ross & Bute" will show instead of the meta title. What's the problem? Well, a few months ago my client decided to rebrand the business to show an "X" instead of the "And", and the rest of the site is branded with an "X" rather than "And". The URL is www.rossandbute.com, so you can understand where Google is getting this assumption from. Is there any way to change this so it reads the meta title in the SERPs? Thanks

    | Martin_Harris
    0

  • We are planning a mega menu which will have around 300 links and a mega slider which will have around 175 links, if our developer has their way. In all I could be looking at over 500 links from the home page. The mega menu will flatten the site's link structure, but I am worried about this slider on the home page, which is our 4th most visited page behind our 3 core category pages. What are your thoughts?

    | robertrRSwalters
    0

  • One of our sites utilizes a single .com domain but offers a differentiated article page for users depending on their location. For example: example.com/articles/how-to-learn-seo-gb-en for the UK, example.com/articles/how-to-learn-seo-us-en for the US, example.com/articles/how-to-learn-seo-au-en for Australia. Currently we use example.com/articles/how-to-learn-seo as the relative link on the site and then the user is redirected by 302 to the correct article for them based on their location. I've read countless pages about 302 redirects (and largely why you shouldn't use them because of link juice, indexing etc.) but what alternative can we use, since we don't want to permanently redirect to one URL but rather redirect to the relevant URL based on the user's location? All the stuff I've read talks about redirecting using 301s, but this surely only works when you are redirecting from one URL to one permanent new URL, as opposed to redirecting to one of many country-specific URLs. It's not really a solution for us to set up separate TLDs for each country, so what is the best mechanism for redirecting users to the correct article for them while making sure that link juice is shared, pages are indexed etc. (see the sketch below)? I hope I've explained this well enough for any of you to offer advice. Many thanks in advance.

    | simon_realbuzz
    0
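
    For reference, a sketch of hreflang annotations, often discussed as a companion (or alternative) to geo-redirects when the same article exists in several country versions; the URLs are taken from the question and would go in the <head> of each version:

        <link rel="alternate" hreflang="en-gb" href="http://example.com/articles/how-to-learn-seo-gb-en">
        <link rel="alternate" hreflang="en-us" href="http://example.com/articles/how-to-learn-seo-us-en">
        <link rel="alternate" hreflang="en-au" href="http://example.com/articles/how-to-learn-seo-au-en">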

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



