
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey guys, just wanted to get some friendly feedback on ways you like to promote linkbait. Personally, I like to: a) develop the company's main social media channels (Twitter and Facebook), b) develop accounts on Digg, Reddit, etc., c) email journalists and blog owners who appear to be part of my client's 'linkerati'. Has anyone got any interesting approaches or strategies for getting the viral ball rolling?

    | SebastianDyer
    0

  • Hi all, I have a listings-based website that just doesn't seem to want to pass rank to the inner pages. See here for an example: http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK I know that there are far too many links on this page and I am working on reducing the number by altering my grid classes to output fewer links. The page also displays a number of links to other page numbers for these results. My script adds the string " - Page2" to the end of the title, description and URL when the user clicks on page two of these results. My question is: would an excessive amount (200+) of links on a page result in less PR being passed to this page (looking spammy)? And would using rel canonical on page numbers greater than 1 result in better trust/ranking? Thanks in advance.

    | Mulith
    0

  • Going to set up a blog for a four-year-old ecommerce website and was wondering if it would be a good idea to put the blog on a subdomain or just in a folder like www.domain.com.au/blog. I'll be using the blog for link bait articles, social bookmark traffic, and linking keywords to products on the ecommerce site. I wanted to know: would the link juice be greater if we cross-link from the subdomain to the main domain? Any major disadvantages in having it on a subdomain vs. a folder? Any other major differences? Cheers!

    | upick-162391
    0

  • I'm getting duplicate content errors, but it's for pages with high-res images on them.  Each page has a different, high-res image on it.  But SEOmoz keeps telling me it's duplicate content, even though the images are different (and named differently). Is this something I can ignore, or will Google see it the same way too?

    | JHT
    0

  • Hello, I have assembled a list of websites that have a "Links" section listing people's favorite tools. Those pages link to my competitor. I know my tool is just as good, if not better, and want to request a link. I'm thinking of sending an email asking for a link and offering a small amount of money for it. Questions: A) How much should I offer? Should I offer anything at all? B) Is there an email style someone can suggest that has been tested and proven to work for this type of situation?

    | hellopotap
    0

  • Howdy guys, I imagine you've seen this question a bunch of times before. I've searched the old questions and I thought our situation was slightly unique, so I thought I'd ask. Have we been blocked? Our website – danz.co.uk – no longer appears in the search results for hot tubs. In fact, it no longer even ranks for its own name! Up until a few days ago we had one of those site maps showing up on the main page – it's all gone! What is really weird – and the part I thought was slightly unique – is that everything else on our website, apart from our home page, is still showing up in Google. If you search site:danz.co.uk, Google lists hundreds of our pages, and if we search for less important keyword terms, our site still shows up. So clearly our whole domain hasn't been blocked, but I fear the home page may well have been. I can't think where we've gone wrong. I've started some serious link building recently. I've checked all of the links over; none of them have been blacklisted. They're all decent links, mainly directories, some 'follow' blogs, social media, etc. Most of the serious links have been pointing to the main page – danz.co.uk. We had a guy doing some SEO for us in the past; he had some really crappy nofollow links, but nothing I thought would have set us back. It may be useful for you to know that we have a 301 redirect from our main page (danz.co.uk) to another page on the same domain – www.danz.co.uk/shop. However, this has been like this for years and never seems to have caused a problem. Your thoughts? Have we been blocked?

    | danzspas
    0

  • Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice that's being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use a rel=canonical tag to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected? (See the short audit sketch after this post.)

    | cadenzajon
    1
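A minimal audit sketch for the trade-off described in the post above, written in Python with the requests and beautifulsoup4 packages: for each URL it reports whether the page is blocked by robots.txt, carries a meta robots noindex, or canonicals elsewhere (the three treatments being weighed). The domain and paths below are placeholders, not the poster's actual site.

    import requests
    from bs4 import BeautifulSoup
    from urllib import robotparser
    from urllib.parse import urljoin

    SITE = "https://www.example.com"                        # placeholder domain
    PATHS = ["/cart", "/checkout", "/catalog/old-product"]  # pages in question

    robots = robotparser.RobotFileParser(urljoin(SITE, "/robots.txt"))
    robots.read()

    for path in PATHS:
        url = urljoin(SITE, path)
        blocked = not robots.can_fetch("*", url)
        noindex, canonical = None, None
        if not blocked:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            meta = soup.find("meta", attrs={"name": "robots"})
            noindex = bool(meta and "noindex" in meta.get("content", "").lower())
            link = soup.find("link", rel="canonical")
            canonical = link.get("href") if link else None
        print(f"{path}: blocked_by_robots={blocked}  noindex={noindex}  canonical={canonical}")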

  • This is a canonicalization-type question, so I believe it should be a pretty straightforward answer. I just haven't had much experience with using the canonical tag, so I felt I should ask so I don't blow up my site 🙂 OK, let's say I have a product page at: www.exampledomain.com/products/nameofproduct. Now on that page I have an option to see all of the specs of the product in a collapsible tab, which I want to link to from other pages, so the URL to this tab from other pages ends up being: www.exampledomain.com/products/nameofproduct?=productspecs. This will link to the tab and default it to open when someone clicks that link on another page. Correct me if I'm wrong, but if I understand canonicalization correctly, I believe creating this link is going to create a duplicate page that has the opportunity to be indexed and detract from the SEO of the main product page. My question is: where do I put the rel=canonical tag to point the SEO value back to the main page, since the page is dynamically generated and doesn't have its own file on the server? Or do I even need to be concerned with this? Feel free to correct me if I'm wrong on any of the above. Like I said, this is something I'm fairly familiar with in terms of how it works, but I haven't had much experience using it. Thanks! (A minimal sketch follows this post.)

    | CodyWheeler
    0
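One common answer to "where does the tag go on a dynamically generated page" is: in the template or view that renders the product page, so every request, with or without ?=productspecs, emits the same canonical URL. A minimal sketch under those assumptions; Flask and the route below are purely illustrative, not the poster's actual platform.

    from flask import Flask, render_template_string

    app = Flask(__name__)

    TEMPLATE = """<!doctype html>
    <html>
    <head>
      <title>{{ name }}</title>
      <!-- Emitted on every request for this product, so the ?=productspecs
           variant declares the clean URL as its canonical. -->
      <link rel="canonical"
            href="https://www.exampledomain.com/products/{{ name }}">
    </head>
    <body>product content and the collapsible specs tab go here</body>
    </html>"""

    @app.route("/products/<name>")
    def product(name):
        # Query parameters only decide which tab opens client-side;
        # they never change the canonical line above.
        return render_template_string(TEMPLATE, name=name)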

  • Right now I have an ecommerce website that has layered menu properties. I have one page trying to rank for "NextGen Digital Ballast", which is the main category page. However, on that category page I link out to three product pages: "NextGen 400W Digital Ballast", "NextGen 600W Digital Ballast" and "NextGen 1000W Digital Ballast". The on-page ranking factors tool is saying I may need to consider making adjustments because of the potential self-cannibalization, but I wanted to get some feedback to see what others think about that. Thanks.

    | JerDoggMckoy
    2

  • So I started making a sitemap for our new golf site, which has quite a few "low level" pages (about 100 for the golf courses that exist in the area, and then about 50 for course architects), etc. My question/open discussion is simple: in a sitemap that already has about 50 links, should we include these other 150 low-level links?  Of course, the link to the "Golf Courses" main page is there, along with a link to the "Course Architects" main page (which subdivide on THOSE pages). I have read that the limit is around 150 links on a sitemap.html page, and it would be nice to rank long tail for the golf courses. All in all, our site architecture itself is easily crawlable as well. So the main question is just: include ALL the links or just the main ones? Thoughts?

    | JamesO
    0

  • I don't know what ranking factors they are using for this feed. The results vary greatly between a search done at google.com, google.com/news and google.com/finance. I'm working with a website that regularly publishes finance-related news and currently gets traffic from Google Finance. I'm wondering what we can do to optimize our news articles to possibly show more prominently or more often. Thanks

    | joemascaro
    0

  • appliance-repair-ny.com - I recently got some good new PageRank (4-5) backlinks to my site. After a nice increase in rankings I left it for a couple of days (holiday); when I came back I found that my rankings had dropped dramatically for many different keywords, and in some cases the backlinking pages are listed higher than me for the keyword. Any suggestions? Should I keep on building links at the same pace, or slow down a little? UPDATE: I've now noticed that the SEOmoz app has a warning that my homepage is missing its title tag and meta tag. I'm using WordPress and the HeadSpace plugin. I recently updated WordPress.

    | atohad
    0

  • Hey everyone. We are about to create a sitemap.html page and have always just kept the site theme in place and put the sitemap in the "content" section of the page, with the header navigation, sidebars and footer in place. Well, now with the new "only first link counts" Google rule, wouldn't it be better to just have a "plain" html sitemap page without any other links on it?

    | JamesO
    0

  • We are working on a new golf-related website and are wondering whether or not we should set up a subnavigation dropdown menu from the main menu. For example:
      GOLF PACKAGES
      >> 2 Round Packages
      >> 3 Round Packages
      >> 4 Round Packages
      >> 5 Round Packages
      GOLF COURSES
      >> North End Courses
      >> Central Courses
      >> South End Courses
    This would actually be very beneficial to our users from a usability standpoint, BUT what about from an SEO standpoint? Is diverting all the link juice to these inner pages from the main site navigation harmful?  Should we just create a page for GOLF PACKAGES and break it down on that page?

    | JamesO
    0

  • Hi all, I'm launching a new website with a number of country-specific subdomains and I wanted to know if Google will count the number of new links at the root domain level or if it will treat each subdomain separately. For instance, if I built 50 links per month to each of my five proposed subdomains, would Google see it as 250 links built to one root domain (and penalise me as a result), or will they view these subdomains independently and accept 50 links as an acceptable amount per subdomain? Thanks in advance. Ross

    | Mulith
    0

  • All things considered, directories, blogs, articles, press releases, forums, social profiles, student discount pages, etc, what do you consider to be a strong, phased, link building strategy? I'm talking beyond natural/organic link bait, since many larger accounts will not allow you to add content to their website or take 6 months to approve a content strategy. I've got my own list, but would love to hear what the community considers to be a strong, structured, timeline-based strategy for link building.

    | stevewiideman
    1

  • My company website recently got its sitelinks in Google search... WooHoo! However, when you type TECHeGO into Google Search one of the links is spelled incorrectly. Instead of 'CONversion Optimization' it's 'COversion Optimization'. At first I thought there was a misspelling on that page somewhere, but there is not, and I have come to the conclusion that Google has made a mistake. I know that I can block the page in Webmaster Tools (no thanks), but how in the crap can I get them to correct the spelling when no one really knows how to get sitelinks to appear in the first place? Riddle me that, folks! sitelink.jpg

    | TECHeGO
    0

  • Hey all, first off I want to thank everyone who has responded to all my previous questions! Love to see a community that is so willing to help those who are learning the ropes! Anyway, back to my point. We have a main site that is a PR 3 and our main focal point for lead generation. We recently acquired 50 additional sites (all with a PR of 1-3) that we would like to use for our own little backlinking campaign. All the domains are completely relevant to our main site as well as to specific pages within our main site. I know that reciprocal links will get me nowhere and that Google is quickly on to the attempted 3-way link exchange. My question is: how do I best link these 50 sites to not only maintain their own integrity and PR but also assist our main site? Thanks all!

    | deuce1s
    0

  • Hi, I'm running a WordPress blog (modhop.com) and am getting the "too many links" warning on almost all of my pages. It appears that in addition to basic site navigation I have plug-ins that create invisible links that are counted in the crawl... at least that's my guess. Is there a good way to control this in WordPress? A nofollow in the .htaccess? A plug-in that does this? (I'm sort of at novice-plus level here, so the simplest solution is ideal.) Thanks! Jake modhop.com (A quick link-counting sketch follows this post.)

    | modhop
    0
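A quick way to test the "invisible plugin links" guess in the post above: fetch one page and count every anchor in the raw HTML, since links hidden by CSS still appear in the source a crawler reads. A minimal sketch in Python with requests and beautifulsoup4; the URL is just an example page from the post.

    import requests
    from bs4 import BeautifulSoup
    from collections import Counter

    url = "https://modhop.com/"   # any page that triggers the "too many links" warning

    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    hrefs = [a.get("href", "") for a in soup.find_all("a")]

    print(f"Total anchors found in the source: {len(hrefs)}")
    for href, count in Counter(hrefs).most_common(20):
        print(f"{count:4d}  {href}")   # repeated targets often point at the offending plugin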

  • I came across this today when doing a Google search - a site that has a small promo code listed on their page shows a preview with that promo highlighted...I dug around their code a little bit, but couldn't find if it was something they were doing to manipulate their preview in the search results. Is Google automatically highlighting promotions in previews, or did getmarried.com/magazine/ somehow manipulate the page to make their promo highlighted?  And if so, how? You can see the site at http://www.getmarried.com/magazine/ and you can see the preview with the promo highlighted attached. ZDY52.png

    | klars524
    0

  • We've recently (in the past 6 months) implemented rich snippets on one of our websites, which lists events around the USA. We've run our code through Google's testing tools and all looks good. When we search for terms we compete for, many of our competitors display the event sub-links in the SERPs, but we still do not. Does anyone have any experience with rich snippets regarding whether site reputation or other factors (we're a new website) will impact if/when Google will display our rich snippets in search results? Thanks!

    | BTeubner
    0

  • I've got a small, 3 month old site that was ranking for a few low-competition keywords. Then, yesterday, it dropped out of the rankings almost completely. The only way to find it is to google the URL/site name, and then it does come up. There are pages in the index, they're just not ranking like they did two days ago. I'm not doing anything black hat or even slightly shady - just writing articles and clean link building. Is this a normal part of the Google process? I've never seen it happen on any other sites I've been involved with.

    | damoncali
    0

  • I'm in the process of buying and running an existing forum that is running on DotNetNuke 5.2.0 and Active Forums 4.1. As part of the transfer, I'm asking that the site be upgraded to the latest version of DNN and AF 4.3. AF 4.3 has SEO-friendly URLs instead of the current long, ugly default URLs, and I'm looking forward to implementing that feature. My specific question is: what would you do to prepare for this upgrade in terms of the content, especially related to the URL changes? I've gone into Google Analytics and downloaded content by page title, exported the first 1000 results, put those titles into Word, and corrected spelling errors in the titles so URLs will be based on correct spellings. General background: the site is not currently monetized, and there will not be an initial focus on monetization and likely only smaller efforts (affiliate Amazon links in a resource section) in the future. The site is free for users. I'm fine with taking a hit in organic traffic in the short term. About 1/3 of the traffic is from search engines right now, and less than 30% of the visitors are new visits. The site is going to continue much the same as it has until now: same moderators, same purpose, same skin, etc. I have access to GA, the site is verified in GWT, I need to verify in Bing, and I do have root access to the server. I've already started working on image file sizes, both of user-submitted images and site-related images like the header. Until now, I have had no experience with DNN or AF or any of the extensions (and am appalled at the price and lack of features of some of those extensions, compared to what I'm used to for WordPress). More general questions: in terms of SEO, I'm intending to treat the upgrade of the forum with the friendly URLs as a re-launch. I want good URLs, a sitemap in place, non-www to www redirects fixed, etc. When I start making the changes, submitting the sitemap and generally drawing Google's attention, I want Google to like what it sees, and have as much optimized as possible when Googlebot comes around. My goal is to draw more targeted visitors from search who are interested in the content on the site. What other suggestions do you have for the site prep, both for forums in general and specifically on DNN/AF? I'm not putting the URL out just yet, as we haven't announced to the users that the change of ownership is taking place. Thanks everyone! (A small redirect-map sketch follows this post.)

    | KeriMorgret
    1
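For the URL change itself, one piece of preparation that usually pays off is a one-to-one 301 map from every old thread URL to its new friendly URL, built before the switch is flipped. A minimal sketch of that step in Python, assuming an export with the old URL and the corrected title per row; the file names, column names and the slug rule below are assumptions, not part of DNN or Active Forums.

    import csv
    import re

    def slugify(title: str) -> str:
        """Turn a corrected thread title into a URL-friendly slug (assumed rule)."""
        return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

    # Input/output file names and columns are assumptions for illustration.
    with open("threads_old_urls_and_titles.csv", newline="", encoding="utf-8") as src, \
         open("redirect_map.csv", "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)      # expected columns: old_url, corrected_title
        writer = csv.writer(dst)
        writer.writerow(["old_url", "new_url", "status"])
        for row in reader:
            new_url = f"/forum/{slugify(row['corrected_title'])}"
            writer.writerow([row["old_url"], new_url, 301])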

  • A site we are working on is a large gift retailer in Canada. They have a language option for French, but the page URLs are the same. If you click 'French' in the header, a cookie is set and then all pages are dynamically served the French content (and all nav/site elements of course change to French). The URLs are then exactly the same, as it's the cookie that determines the language to serve, e.g. www.site.ca/index.php?category=7&product=99....    would be the same regardless of whether I'm set to English or French. Question: does this setup have a negative impact on any SEO factors? The site has several thousand pages. (A short sketch of the usual URL-based alternative follows this post.)

    | BMGSEO
    0
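For comparison, the usual crawlable alternative to cookie-only switching is to give each language its own URL (a /fr/ prefix, a parameter, or a separate domain) and cross-reference the two versions with hreflang. A minimal sketch of the path-prefix idea, assuming a Flask-style app purely for illustration; the retailer's real platform appears to be PHP and is not shown here, and absolute URLs are preferred in the hreflang tags.

    from flask import Flask, request

    app = Flask(__name__)

    HEADINGS = {"en": "Gift ideas", "fr": "Idées cadeaux"}
    BASE = "https://www.site.ca"   # placeholder for the real domain

    @app.route("/category/<int:category_id>")
    @app.route("/fr/category/<int:category_id>")
    def category(category_id):
        # Language is read from the URL, not a cookie, so crawlers can fetch
        # and index both versions as distinct pages.
        lang = "fr" if request.path.startswith("/fr/") else "en"
        en_url = f"{BASE}/category/{category_id}"
        fr_url = f"{BASE}/fr/category/{category_id}"
        return (
            f'<link rel="alternate" hreflang="en" href="{en_url}">'
            f'<link rel="alternate" hreflang="fr" href="{fr_url}">'
            f"<h1>{HEADINGS[lang]}</h1>"
        )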

  • I am doing SEO for a website that has constantly rotating and only temporarily pertinent subjects on it.  Let's say these information and subject cycles run for about 6 months.  Given this, would it be more effective to optimize an exact-match domain for each 6-month cycle, or to build one main domain with a few of the keywords and just target a page for each rotating subject? The advantage of the single main domain is that I get domain authority to feed off of; the advantage of the exact match is, of course, that exact-match domains are a powerful tool for ranking highly, and it is only a medium-competitive market, usually about 40 domain and page authority. What do you guys think?  Do you have any techniques to dominate temporary and rotating markets?

    | MarloSchneider
    0

  • I have a client that is creating separate websites to be used for different purposes.  What is the best practice here with regard to not looking spammy?  i.e. do the domains need to be registered with different companies?  Hosted on different servers, etc.? Thanks in advance for your response.

    | Dan-171803
    0

  • Hi all, my site has been live for a year now. I'm getting tons of traffic (Alexa 54k) and business is good. The only problem is that I have 0 PageRank... I have checked the site's structure again and again to see if there is anything wrong with the site, but everything seems to be OK. Google just added sitelinks for the site (megamoneygames), which looks very nice. For example, none of my competitors have sitelinks, but they all have a PageRank of 4 while I have 0. In addition, for some reason the site's age (days) shows 0 although it has been live for a year now... Do you have any idea what is going on? Do I have errors on the site? Thanks

    | Pariplay
    0

  • We are a web hosting company and some of our best links are from our own customers, on the same IP, but different Class C blocks. How do search engines treat the unique scenario of web hosting companies and linking? (A small IP-grouping sketch follows this post.)

    | FirePowered
    0
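One way to see how concentrated that customer link profile really is: resolve each linking domain and group them by /24 ("Class C") block. A minimal sketch, assuming you already have the list of linking domains exported from a backlink tool; the domains below are placeholders.

    import socket
    from collections import defaultdict

    # Placeholder list; in practice export the linking domains from a backlink tool.
    linking_domains = ["customer-one.example", "customer-two.example", "customer-three.example"]

    by_block = defaultdict(list)
    for domain in linking_domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue                                   # skip domains that do not resolve
        block = ".".join(ip.split(".")[:3]) + ".0/24"  # the old "Class C" view
        by_block[block].append((domain, ip))

    for block, hosts in sorted(by_block.items()):
        print(block)
        for domain, ip in hosts:
            print(f"  {domain} -> {ip}")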

  • Looking over the discussion of underrated SEO tactics at http://sphinn.com/story/178993/ , I'm curious if folks here have any favorite SEO tactics that they feel are ignored, underrated, or somehow not appreciated by the community at large.  Any thoughts? Among the tactics listed in the Sphinn post: Blog commenting Analytics to identify low-hanging keyword fruit Getting your site set up properly at the server level Unique and relevant imagery Internal links Google Place page optimization Several more... Any others that should be included?  I'd personally add segmenting your keyword traffic into trademark (those that mention your brand name) versus non-trademark segments for more thorough analysis.

    | jcolman
    2

  • Hello, I know that if it's good for the user, it's not a bad move. But for this question I am specifically asking how it affects my ranking. Does it help my ranking to link to appropriate authority sites?
    Have you done any tests to see if linking out to authoritative sites like .gov info pages, industry leaders, etc. helps with a site's ranking? I am thinking about taking off all of these outgoing links and just linking to my important pages. Thank you, Tyler

    | tylerfraser
    0

  • The site didn't go down. There was no drop in rankings or traffic. But we went from averaging 150,000 pages crawled per day to ~1,000 pages crawled per day. We're now back up to ~100,000 crawled per day, but we went more than a week with only 1,000 pages being crawled daily. The question is: what could cause this drastic (but temporary) reduction in pages crawled? (A small log-counting sketch follows this post.)

    | Fatwallet
    0
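When crawl activity swings like that, the raw server logs are the most direct record to check against Webmaster Tools. A minimal sketch that counts Googlebot requests per day from a combined-format access log; the log path and format are assumptions about the setup, and verifying user agents against Google's published ranges is a separate step.

    import re
    from collections import Counter

    date_in_log = re.compile(r"\[(\d{2}/\w{3}/\d{4})")   # e.g. [12/Oct/2011
    per_day = Counter()

    # "access.log" and the combined log format are assumptions about the setup.
    with open("access.log", encoding="utf-8", errors="ignore") as log:
        for line in log:
            if "Googlebot" in line:
                match = date_in_log.search(line)
                if match:
                    per_day[match.group(1)] += 1

    for day, hits in sorted(per_day.items()):
        print(day, hits)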

  • Hey, I'm working on a review blog one day every 1 or 2 weeks, and I post up to 6 articles at a time; is that unnatural for SEO? How many articles should I post on the blog per day? Another question: how many backlinks should I get to just one post? I'm using Magic Submitter software to help, but I don't get more than 50 backlinks at a time. What's the real number of backlinks I should get, and over how much time, to be 100% natural for Google? Any helpful info about backlink techniques is worth hearing. Thanks

    | akitmane
    0

  • I own a movie trailer website (where you can watch movie trailers). Will having links on each page to the "official website" of each movie increase my SEO?

    | rhysmaster
    0

  • So I was wondering if you think a bit of software such as BookmarkingDemon (or similar) would be worth the investment, or whether a couple of "proper" accounts on Digg, StumbleUpon, etc. is a better route? As an agency, would you create an account for each client, or just have one for your own agency and tag your clients within it? I'm not even sure how much impact it will have on SERPs. Thanks!

    | SEOwins
    0

  • We have just run our first campaign for our site and have found over 5,335 errors! It would appear that the majority of these are where the crawl has flagged the product pages as duplicates of their "write a review / tell a friend" pages... hence the large number of errors. In addition, we also have over 5,000 302 warnings for the following URL: http://www.collarandcuff.co.uk/index.php?_a=login&redir=/index.php?_a=viewCat&catId=105 Please bear in mind we are fairly new to this type of data... so go easy on us. In short, will these errors have a significant bearing on our rankings etc., and if so, how do we rectify them? Many thanks. Tony

    | collar64
    0

  • Does anyone know of a page template or code I might want to base a blog on as part of an ecommerce website? I am interested in keeping the look (includes) of the website and paying attention to Source Ordered Content to help crawlers index the great new blog posts we have to share. I could just knock up a page with a template from the site, but I would like to investigate SOC at this stage as it may benefit us in the long run. Any ideas?

    | robertrRSwalters
    0

  • Hi there, a link building company that has been building links for us has not gained any sustained results. They have advised that our domain may be toxic, and that we should consider a permanent redirect from .co.uk to another domain extension in order to remedy this. Is this a recommendation worth considering?

    | Maximise
    0

  • I have developed a successful portal-based website but would like to grow my portfolio of sites by expanding into new niches and sectors. I would like to use the same source code to fast-track new sites, but I'm not sure of the dangers involved. Content, meta details, etc. will all be unique and the only similarity will be the HTML code. Another example of how I want to use this is that my current site targets the UK, but I want to target a global market with a .com domain, and this would involve using the same source. Is this possible without a penalty, or am I overlooking something?

    | Mulith
    0

  • Hello, we have a new client that has several sites with the exact same content. They do this for tracking purposes. We are facing political objections to combining them and tracking differently; basically, we have no choice but to deal with the situation as given. We want to avoid duplicate content issues, and want to SEO only one of the sites. The other sites don't really matter for SEO (they have offline campaigns pointing to them); we just want one of the sites to get all the credit for the content. My questions: 1. Can we use the rel=canonical element on the irrelevant pages/URLs to point to the site we care about? I think I remember Matt Cutts saying this can't be done across domains. Am I right or wrong? 2. If we can't, what options do I have (without making the client change their entire tracking strategy) to make the site we are SEOing the relevant source of the content? Thanks a million! Todd

    | GravitateOnline
    0

  • I have a website in a niche that's highly graphical in nature. Most of the pages that I rank well for are mainly textual at the moment, but I'm gradually adding image galleries to these pages. The galleries consist of a number of thumbnails that are HTML-linked to the large version of the image (via the Lightbox script). My question: will the page lose PageRank because of the many links from the thumbnails to the images (up to 30 per page besides the normal links)?

    | dirkla
    0

  • Hey all, so to make a long story short, we own a site that has been passed through many hands and many strategies. We are in the financial field and rank high for many relevant search terms. My job is now to audit/optimize and purge our site of the garbage that has collected over the years (since 2002). During the audit I have found many issues and fixed them, but I am not sure how to proceed with the following issues. Any advice would be greatly appreciated! 9,932 orphan files - does just removing them affect my SEO? I like a clean house; can I somehow use them to my benefit? Hundreds of 404s with many external "follow" links that we are no longer getting juice from. 8 sitelinks in Webmaster Tools, but only 4 show in our search results. I am a straight n00b, so sorry if this is 101 for anyone, but your input would be greatly appreciated!! Thanks!

    | deuce1s
    0

  • I have not found a clear answer to this particular aspect of the "first link priority" discussion, so I wanted to ask here. Noble Samurai (makers of the Market Samurai SEO software) just posted a video discussing this topic, specifically referencing a use case where, when you disable all the CSS and view the page the way Google sees it, many companies use an image/logo in their header which links to their homepage. In my case, if you visit our site you can see the logo linking back to the homepage, which is present on every page within the site. When you disable the styling and view the site in a linear path, the logo is the first link.  I'd love for our first link to our homepage to include primary keyword phrase anchor text. Noble Samurai (presumably SEO experts) posted a video explaining this specifically http://www.noblesamurai.com/blog/market-samurai/website-optimization-first-link-priority-2306 and their suggested code implementations to "fix" it http://www.noblesamurai.com/first-link-priority-templates which use CSS and/or JavaScript to alter the way it is presented to the spiders. My web developer referred me to Google's webmaster central: http://www.google.com/support/webmasters/bin/answer.py?answer=66353 where they seem to indicate that this would be attempting to hide text/links. Is this a good or bad thing to do? (A small source-order checking sketch follows this post.)

    | dcutt
    0
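The check the video describes, fetching the page with no CSS applied and reading anchors in source order, can be reproduced with a few lines of script. A minimal sketch in Python with requests and beautifulsoup4; the URLs are placeholders, not the poster's site. It prints the first link pointing at the homepage, so you can see whether it is the logo or a keyword anchor.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    page = "https://www.example.com/some-inner-page/"   # placeholder inner page
    home = urlparse("https://www.example.com/")

    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for a in soup.find_all("a", href=True):
        target = urlparse(urljoin(page, a["href"]))
        if target.netloc == home.netloc and target.path in ("", "/"):
            print("First homepage link in source order:")
            print("  anchor text:", a.get_text(strip=True) or "(image, no text)")
            print("  href:", a["href"])
            break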

  • Hi. I am responsible for choosing an ecommerce platform and overseeing the implementation of a large ecommerce site. The site will have tens of thousands of products and will be fairly complex. Eventually the site will integrate with the suppliers back end inventory/order management system, which is some sort of custom Windows/.NET system. (I'm not very technical if you haven't noticed....) Primarily I want a platform that is SEO friendly, and I have to be sure that the site is developed properly from an SEO and usability perspective. I thought I would go with an asp.net solution (aspdotnetstorefront, specifically) to facilitate the future integration, but I am questioning this choice after reading some of the comments I have found here at SEOMoz. So is asp.net really a bad choice SEO-wise? I almost considered Magento, but was having trouble finding a good solution provider to work with. I also worried about integration issues down the road. I would appreciate any advice or input anyone may have. Thank you!

    | sbg781
    0

  • I know people suggest having an H1 with your keyword somewhere on your page to help with SEO. My current WordPress theme has the post title as an H2. Is there a big difference between using an H1 vs. an H2?

    | DojoGuy
    0

  • I have a website that is nearly all about videos and is based on WordPress. Does anyone know of a way to create a video sitemap that updates automatically as I write a new post? The video files and other data are all stored in separate meta-post locations... so it needs to be able to grab them. Any help is appreciated. (A small generation sketch follows this post.)

    | DojoGuy
    0
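Whether this ends up as a WordPress plugin or a small scheduled job, the mechanical part is just rewriting an XML file from the post metadata whenever something is published. A minimal sketch of that output step in Python; the field names, the hard-coded list of posts, and the output path stand in for however the meta-post data is actually stored.

    from xml.sax.saxutils import escape

    # Placeholder data; in practice this would be read from the blog's post meta.
    posts = [
        {
            "page_url": "https://example.com/swing-tips/",
            "video_url": "https://example.com/media/swing-tips.mp4",
            "thumb_url": "https://example.com/media/swing-tips.jpg",
            "title": "Swing tips",
            "description": "A short video post.",
        },
    ]

    entries = []
    for p in posts:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(p['page_url'])}</loc>\n"
            "    <video:video>\n"
            f"      <video:content_loc>{escape(p['video_url'])}</video:content_loc>\n"
            f"      <video:thumbnail_loc>{escape(p['thumb_url'])}</video:thumbnail_loc>\n"
            f"      <video:title>{escape(p['title'])}</video:title>\n"
            f"      <video:description>{escape(p['description'])}</video:description>\n"
            "    </video:video>\n"
            "  </url>"
        )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

    with open("video-sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)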

  • Hi mozzers, does anyone know how to do a Yahoo search that returns only results with a specific word in the URL? For example, say I wanted to search for the keyword "finance" and I only wanted it to return results with the word "business" anywhere in the URL. I have done this in the past, but I just can't remember how to do it! Thanks

    | PeterM22
    0

  • I need to redirect a ton of duplicate content, so I want to try:
      redirect 301 /store/index.php /store
      redirect 301 /store/product-old /store/product-new
      redirect 301 /store/product-old1 /store/product-new1
      redirect 301 /store/product-old2 /store/product-new2
      redirect 301 /store/product-old3 /store/product-new3
      redirect 301 /store/product-old4/file.html /store/product-old4/new4/file.html
    and then a whole bunch of old dead links redirected to the homepage. We've had /index.php redirected to / on other parts of the site for a while, and for the most part /store is a friendly URL, but we have tons of duplicate content and workarounds that preceded my job here. I'm wondering if the redirects above would be considered a redirect chain, since all the redirects below the /index.php -> /store rule depend on that one redirect. Thanks for any insight you may be able to give! (A quick chain-checking sketch follows this post.)

    | Hondaspeder
    1
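Whether those rules form a chain is easy to verify empirically: request each old URL without following redirects and see whether the Location it returns is itself redirected again. A minimal sketch using the requests package; the base URL is a placeholder and the paths are taken from the post above.

    import requests
    from urllib.parse import urljoin

    BASE = "https://www.example.com"   # placeholder for the real domain
    OLD_PATHS = ["/store/index.php", "/store/product-old", "/store/product-old1"]

    for path in OLD_PATHS:
        url, hops = urljoin(BASE, path), 0
        while hops < 5:                                     # stop runaway loops
            resp = requests.get(url, allow_redirects=False, timeout=10)
            if resp.status_code not in (301, 302, 307, 308):
                break
            url = urljoin(url, resp.headers["Location"])    # follow one hop manually
            hops += 1
        status = "chain" if hops > 1 else "single hop" if hops == 1 else "no redirect"
        print(f"{path} -> {url}  ({hops} redirect(s), {status})")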

  • I've recently deployed rich snippets for a site that includes reviews on individual company profile pages.  We passed the Rich Snippet Testing Tool and notified Google of our pages.  It's been 2 months since we deployed this and still nothing.  Does anyone have any feedback on the average time it takes Google to recognize this markup and include it in the SERPs? Or any articles that reference which industries they will include reviews in the SERPs for and which ones they won't? Thanks.

    | DustinSEO
    0

  • A couple of years ago I used to receive a lot of traffic via my category pages, but now I don't receive as much; in the past year I've modified the category pages to use canonical tags. I have 15 genres for the category pages. Other than most-recent sorting there is no sorting available to users on the category pages, and a recently added image link can over time drop off to page 2 of a category. For example: mysite.com/cat-page1.html = 100 image links per page with numbered page navigation, category pages numbered 1-23; a new image link can drop off to page 2. mysite.com/dog-page1.html = 100 image links per page with numbered page navigation, pages numbered 1-53; a new image link can drop off to page 2. mysite.com/turtle-page1.html = 100 image links per page with numbered page navigation, pages numbered 1-2; a new image link can drop off to page 2. Now on the first page (e.g. mysite.com/cat-page1.html) I've set rel=canonical to mysite.com/cat-page1.html. One thing I have noticed is that the unique popup short-description tooltips I have on the image links only appear in Google for the first page of each category; it seems to ignore the other pages. In view of this, am I right in applying the canonical tag, or should I just treat these as normal pages? Thanks

    | Flapjack
    0

  • Hi all, one of my sites, http://www.nieuwsspion.nl, lost its rankings on December 15th last year; e.g. a search for "nieuwspion" shows the site at about #75. I can't figure out what could be wrong... no paid links, and it's not the Farmer/Panda algo, which only came out this year... Google Webmaster Tools is saying I get 18,052 links from zappen.blog.nl (anchor "nieuwsspion" to a sub-page) and 7,910 links from linkspagina.eu, but I don't see those in Open Site Explorer, and they shouldn't harm me as far as I understand? Any input very much appreciated, Elgoog

    | elgoog
    0
