
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi there, I am in the process of formulating a listing policy for my site and I'm not sure whether I should add something in there about swear words. My site is an adult site and swear words come with the territory, unfortunately. Will user-generated content with swear words affect my ranking? Thank you

    | Mulith
    0

  • I would think it would make the most sense to optimize the homepage for 'NYC apartments', then have two pages, one for 'apartment rentals' and another for 'apartment sales', directly underneath in the site's hierarchy. Is this how you would do it? The competitors for these keywords mostly have 'NYC Apartment Rentals' and 'NYC Apartment Sales' bunched together on their homepage. Is it possible to have 3 separate pages from the same domain rank on the first SERP? Also, the company I work for is called Platinum Properties. If we fail to include 'Platinum Properties' in the title tag, would that negatively affect our current position for 'Platinum Properties' in the search results? What would be an effective way to get listed for both the keyword and the company name?

    | platinumseo
    0

  • Hi, We have a directory of 25,000-odd companies who use our site. We have a strong PR site and want to rank a page for each company name. Some initial testing on one or two company names brings us to #2, after the company's own website, in the format "Company Name Reviews and Feedback", so it works well. We want to do this for all 25,000 of our members; however, we do not wish to make it easy for our competitors to scrape through our member database! e.g. using www.ourdomain.com/randomstring/company-name-(profile).php. Unfortunately, with the above, performing a search on Google for site:domain.com/()/()(profile).php would bring up all records. Are there any tried and tested ways of achieving what we're after here? Many thanks.

    | sssrpm
    0

  • The following scenario:
    We have a domain with .info (good Domain Authority) which ranks well in google.at and google.com but not in google.de.
    We want to rank well in google.de, too
    (same language in .at = Austria and .de = Germany, though). The paths of the website are: www.example.info/de/keyword.html --> so we already use the path for Germany to target that country. What strategy would you suggest?

    | petrakraft
    0

  • Hello all. Firstly, I'm new to SEOmoz, but what a fantastic resource, good work! I help run a platform at ethical community (dot) com (I have phrased it like that so Google doesn't pick up this thread, hope that's OK). We seem to have something glaringly obvious wrong with the SEO ability of our product pages. We now have over 7,000 products on the site and would like to think we have done a pretty good job in terms of optimising them: lots of nice keywords, relevant page titles, good internal links, and we have even recently reduced the loading speeds a fair amount. We have a sitemap set up feeding URLs to Google, and some of them are now nearly a year old. The problem: when doing an EXACT Google search on a product title, the product pages don't show up for the majority of the 7,000 products. HOWEVER, we get fantastic ranking in Google Products, and get sales through other areas of the site, which seems even more odd. For example, if you type in "segway" you'll see us ranking on the first page of Google in Google Products, but the product page itself is nowhere to be seen. For example, "DARK CHOCOLATE STRANDS 70G CAKE DECORATION" gets no results on Google (aside from Google Products) when we have this page at OURDOMAIN/eco-shop/food/dark-chocolate-strands-70g-cake-decoration-5592. Can anyone help identify if there is a major bottleneck here? Our gut feeling is that there is one major factor causing this.

    | ethicalcommunity
    0

  • My client has a classified ads website with hundreds of thousands of classified ads. These ads expire quite fast. When an ad expires it gets removed. At the moment this results in a 404 page, and thus hundreds of thousands of 404 errors in Webmaster Tools. From what I know this damages SERP results due to slower indexing of important pages, and 404s are just plain bad SEO. I suggested doing a 301 from the expired ads to an upper category, but this feels like cheating; the content hasn't actually moved, it has been removed. What would you suggest? (A rough sketch of the options follows below.)

    | PanuKuuluvainen
    0
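
A minimal sketch of the two ways this is usually handled: explicitly telling crawlers the ad is gone, versus the 301-to-category idea from the post. The Flask app, the `ADS` lookup, and the field names are purely illustrative assumptions, not details from the question.

```python
# Hypothetical sketch: handling URLs of expired classified ads.
from flask import Flask, abort, redirect

app = Flask(__name__)

# Stand-in for a database lookup; in reality this would query the ads table.
ADS = {"12345": {"expired": True, "category_url": "/cars/"}}

@app.route("/ad/<ad_id>")
def show_ad(ad_id):
    ad = ADS.get(ad_id)
    if ad is None:
        abort(404)  # URL never existed
    if ad["expired"]:
        # Option A: signal that the ad is permanently gone.
        abort(410)
        # Option B (the idea from the post): 301 to the parent category.
        # return redirect(ad["category_url"], code=301)
    return "ad page"
```

Whether the "gone" response or the 301 is the better signal is exactly the judgment call the question is asking about; the sketch only shows where that decision would live in code.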

  • I've noticed nofollow links showing up in my Google Webmaster Tools "links to your site" list. If they are nofollow, why are they showing up here? Do nofollow links still count as backlinks and transfer PR and authority?

    | NoCoGuru
    1

  • Are there any advantages or disadvantages to running a static homepage as opposed to a blog-style homepage? I have been running a static page on my site with the latest posts displayed as links after the homepage content. I would like to remove the static page and move to a more visually appealing homepage that includes graphics for each post, with the posts dropping down the page like normal blogs do. How will this affect my site if I move from a static page to a more dynamic blog-style page layout? Could I still hold the spot I currently rank for with the optimized index content if I turn to a more traditional blog format? Cheers,

    | NoCoGuru
    0

  • We have a client that provides 5 different services all under one brand and one site. The challenge is getting good web popularity to each major service. It looks like this: Brand (no real Internet traffic to brand keywords) > Service A, Service B, Service C, Service D, Service E. All the services are displayed on the home page. The SEOmoz scores are now (in general) as high as or higher than the competition for the service keywords, but, of course, we are only ranking at the top for one of the services. It's a local business. We are just finishing a re-design of the site so that each service has an internal "micro site" within the domain. Each of the services is linked from the home page to its micro site's anchor page. The anchor page is linked to 2 to 4 other pages, all directly related to that service's information. So my question is: should we continue to work on building the home page's popularity, focus on a mix between the home page (brand) and the micro sites' anchor pages, or just go for popularity of the internal anchor pages for the services? Hope this makes sense, and thanks in advance for your thoughts.

    | MBayes
    0

  • They have a new site, no links, no content, and their page isn't optimized for this keyword (it doesn't even appear once on the page or in their page title)... They only have 5 incoming links with the keyword in them, but their competitors have way more. Can someone solve this mystery?

    | elcrazyhorse
    0

  • I'm thinking about purchasing http://employeesurveyresource.com/ for $300. It's showing a PR5, but I'm not seeing many backlinks in Y! Site Explorer or Open Site Explorer. Has anyone been successful in absorbing the PR from older domains / domains without current content? I know people can sometimes fake PR by 301'ing from a high-PR site to sell the domain, then cancelling the 301 and leaving you with a dud. Can someone shed some light on this please? Thank you!

    | CareerBliss
    0

  • We are rolling out our canonicals now, and we were wondering: what happens if we decide we did this wrong and need to change where the canonicals point? In other words, how bad is it to have a canonical tag point to page A for a while, then change it to point to page B? I'm just curious how permanent a decision we are making, and how bad it will be if we screwed up and need to change it later. Thanks!

    | CoreyTisdale
    0

  • Hi All, I am setting up a new site and I want to make use of subdomains to target multiple countries as follows: uk.mydomain.com, us.mydomain.com, australia.mydomain.com, etc. Now I know what you're all going to say: why not use folders, as they are more effective? Well, I did think of this but decided against it because I would like to make the best of a low-competition industry. I want to push my competitors as far down in the SEs as possible, and I plan to do this by targeting generic, non-locational search terms with both sites so I can hog the top 4 spots, as follows: www.mydomain.com, www.mydomain.com/keyterm, uk.mydomain.com, uk.mydomain.com/keyterm-in-the-UK. What steps can I take to ensure rank passes to my subdomains? Is it better to start the site with folders like www.mydomain.com/us/keyterm and then 301 them to subdomains at a later stage, or should I start with the subdomains?

    | Mulith
    1

  • We have been hosting our website with a provider (their design and CMS) and we are now moving to a new design, with better content focusing on keywords, in a different CMS platform on different servers, but we want to retain the link juice from the old site. We have used an Open Site Explorer report to determine all the links to the old site and the pages they link to. What is the best strategy to keep the link juice flowing to the new site? Example: this page (http://www.dogslifedownunder.com/what-is-worse-then-going-to-the-v-e-t) links to this page on the old site (http://www.sydneyanimalhospitals.com.au/ourstaff/thevets/tabid/19105/default.aspx). We will have a similar page on the new site with the same staff members. How do we ensure that we retain the link juice? Any thoughts most welcome; a sketch of one way to check the redirects follows below.

    | Peter.Huxley59
    0
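
The usual way to keep that equity flowing is a one-to-one 301 map from each linked-to old URL to its closest equivalent on the new site, and then a quick check that every mapped URL really answers with a 301 to the intended target. A small sketch of that check, assuming the third-party `requests` library; the new-site path in the map is a made-up placeholder, not the real new URL.

```python
# Hypothetical sketch: confirm that old URLs 301-redirect to their new
# equivalents after the migration. Requires "requests" (pip install requests).
import requests

# Old URL -> expected new URL, built from the Open Site Explorer export.
# The target below is a placeholder; substitute the real new-site pages.
REDIRECT_MAP = {
    "http://www.sydneyanimalhospitals.com.au/ourstaff/thevets/tabid/19105/default.aspx":
        "http://www.sydneyanimalhospitals.com.au/our-team/the-vets/",
}

for old_url, expected in REDIRECT_MAP.items():
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    status = "OK " if resp.status_code == 301 and location == expected else "FIX"
    print(f"{status} {old_url} -> {resp.status_code} {location}")
```

Using HEAD requests keeps the check cheap enough to run over every URL in the export.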

  • I've seen the cross-domain canonical not work at all in my test cases. And an interesting point was brought to my attention today: that in order for the canonical tag to work, the page that you are referencing needs to have the exact same content, and that this was the whole point of the canonical tag, not for it to be used as a 301 but for it to consolidate pages with the same content. I want to know if this is true. Does the page you reference with a canonical tag have to have the exact same content? And what have been your experiences with using the canonical tag to reference a page on a different domain that has the exact same subject matter but not exactly duplicate content?

    | GearyLSF37
    2

  • Hi, I won't bore you with all the details, but we may have to temporarily move part of an existing domain onto a separate domain for a couple of months. The content being moved includes most of our key branded and organic SERP pages. We've owned the new domain for years but it's never been live or indexed. After a couple of months, all the content will move back to the original domain, but it will move to a slightly different structure with different page names. Most of the page content will remain largely the same. I'm concerned, but don't really have any experience with this kind of thing. Can anyone shed some light? Perhaps on a scale of 1 to 5 you could give me your thoughts: 1 = should be fine, as long as you set up all the redirects properly; 5 = do everything in your power not to do it! Using the new domain and other factors will be problematic. Thanks for any help you may be able to provide!

    | rfjc
    0

  • I had 12 to 15 first-page Google rankings in the iPhone, iPad, and app review vertical. As of 04/26/11 I have lost all rankings, and traffic has gone from 1,000 to 1,200 visits a day to 150 to 350 a day. I was using a plugin for automatic press releases, but I have removed this and deleted the URLs. I have also changed themes and hosting over the last 3 weeks. I have been trying to get SEO help, but cannot seem to get anyone to help me. Thank you, Mike

    | crazymikesapps
    1

  • We're working on revamping the URL structure for a site from the ground up. This firm provides a service and has a library of case studies to back up their work. Here are some options for the URL structure:
    1. /cases/[industry keyword]-[service keyword] (for instance: /cases/retail-pest-control). There is some search traffic for the industry/service combination, so that would be the benefit of using both in the URL. But we'd end up with about 70 pages with the same service keyword at the end.
    2. /cases/[industry keyword] (/cases/retail). Shorter, less spam potential, but we'd have to optimize for the service keyword, the primary one, in another way.
    3. /cases/clientname (/cases/wehaveants). No real keyword potential but better usability.
    We also want the service keyword to rank on its own on another page (so, a separate "pest control" page), so we don't want to dilute that page's value even after we chase some of the long-tail traffic. Any thoughts on the best course of action? Thanks!

    | kdcomms
    1

  • I know there has been discussion on using expired domains in the past. This is not so much a question of how to do it or whether it works; rather, I would love to see how many of you use this in your backlink strategy. I have a domain in a low-to-moderately competitive niche that ranks really well, mostly on the power of a couple of expired domains. I bought the domains, created a quick WordPress site, and pointed some anchor-text links to the site. It took some time for the expired domains to regain their PR, but when they did, the benefit was great. I'm considering whether I want to do this with another domain of mine. On one hand, it's a relatively inexpensive way to get some good-quality anchor-text links. But, on the other hand, something in it feels "immoral" or "sneaky" to me. What do you think?

    | MarieHaynes
    0

  • I usually do SEO myself, but now it's time to move on to getting on with running the business. I have found a fantastic PPC company who ONLY focus on PPC and am looking for the same but for SEO. They must be based in the UK and have a great portfolio of mid/large-tier companies with some real-life stats to back them up. Pricing must be clear and transparent. Results must be measurable. How would you find such a company? Ironically, searching on Google doesn't seem to produce the right results 😞 Alastair

    | alastairc
    0

  • We are building a mobile site that will be launching in another month. I'm concerned that the mobile site will start cannibalizing our traditional rankings. Is there a way to keep this from happening? Should we utilize the cross-domain canonical tag and point back to the traditional site URLs?

    | SEO-Team
    0

  • Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in a round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately directed to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). In order to avoid being penalized for duplicate content, any page that is viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address where applicable. Two questions: (1) Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? (2) If the answer to question 1 is yes, would it be better for SEO to do that, or to leave it how it is? (A small sketch of the meta-tag option follows below.)

    | EasyStreet
    0
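
For reference, a minimal sketch of what the robots-meta variant of option 1 could look like, assuming each agent subdomain serves the same pages as the corporate host; the helper name and the listing path are illustrative, and the corporate host name is taken from the post.

```python
# Hypothetical sketch: per-host <head> tags so agent subdomains stay out of
# the index while still consolidating signals on the corporate pages.
CORPORATE_HOST = "www.corporatedomain.com"

def head_tags(request_host: str, path: str) -> str:
    """Canonical tag for every page, plus noindex on agent subdomains."""
    canonical = f'<link rel="canonical" href="http://{CORPORATE_HOST}{path}">'
    if request_host != CORPORATE_HOST:
        # Agent subdomain: don't index the duplicate, but keep links followable.
        return canonical + '\n<meta name="robots" content="noindex, follow">'
    return canonical

print(head_tags("agentsmith.corporatedomain.com", "/listings/123"))
```

A robots.txt Disallow would also keep crawlers off the subdomains, but a blanket block would stop them from seeing the canonical tags on those pages, which is the usual argument for the meta-tag route.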

  • Hi guys, We've just bought a 3-letter .co.uk domain to replace our current 20-character old domain. Our existing domain is PR5 with quite a few links (that we can modify, no problem). We're currently 301 redirecting the new domain to the old domain. I was looking at the procedure in one of the guides, but as it's slightly different, is this the correct procedure?
    1. Prep the duplicate site on the new domain and prep the individual htaccess 301 redirects
    2. Add the new domain to Google Webmaster Tools and Bing Webmaster Center
    3. On the switchover date, modify all possible incoming links from external sites
    4. On the switchover date, apply the 301 redirects and make the new site live
    5. On the switchover date, submit the new sitemaps to Google & Bing
    6. On the switchover date, fill out the change of address form in Webmaster Tools
    7. Do the happy dance?
    Many thanks in advance, Tony.

    | posh_tiger
    0

  • As we all know, Google Local Business/Places now has significant real estate for many searches. What I find hard to understand is what makes the difference between the different positions. Is it solely based on the content in Google Places itself, or is it regular ranking factors? I am (like everybody) on a hell-for-leather search to try and rank above my competition, but having studied their Places information I do not think there is much more I can do. Suggestions that have actually worked for you?

    | kdaly100
    0

  • Google and Yahoo have very little market penetration in the Korean markets. Instead, the popular search engines are Naver and Daum. Naver seems to like keeping traffic within its own network of sites. Does anyone have tips for what things might work to increase search visibility in Korea?

    | art-boy
    0

  • So, Google Places showing up in search results is a great feature... but how can we get our results to the top? I mean, I can see some terrible websites appearing at the top of Google Places with their Places page having no activity whatsoever. Is there a trick to this at all? What can we do to increase our ranking on Google Places, because our old GOOD rankings are now appearing BELOW the map results? Cheers

    | kayweb
    0

  • Hi All, I have been working busily on my Google Places listing to try and improve my rankings, and here are some updates (with no visible progress, unfortunately). I added reviews; only one of my competitors has more reviews than me, and these reviews are genuine. Based on other posts in this forum I looked closely at how my address was formatted, as Google Maps does not recognize my street, so format-wise my address is identical to the top 3 (street number, street name, city, county); this is in Ireland, so that is a typical address. I am still below the fold, and I have analysed the links of the competition. To be honest, I see that they have been in business longer and their links come from the many websites that they have created over the years. They have more links than me by a factor of 3x and 4x. Am I screwed? As you all appreciate, being above the fold is so important, and I am currently below the fold, and it is not an ultra-competitive keyword combination (1600-2000 searches): 'website design Cork' on google.ie. As a website person, even getting 2-3 extra contacts per month can mean a HUGE difference. I have read long and hard on this topic for the past 12-18 months. I have not put any serious attention into link building myself; should I get emailing people to try and get those quality links to raise my profile? The problem is getting the time to do this. Does the forum think that if I sent (for example) 20-30 emails per day to a range of related businesses asking for links, for say 30 days, I would get some links without actually having to pay for them? Is link building the critical item that I should focus on, considering the problems above? All help appreciated, and a bottle of Jameson for the person with the idea that works (really!) 🙂

    | kdaly100
    0

  • Hello everybody! I am the owner of a price comparison website and have been running it successfully for over two years now. However, since February the news and articles section of our website has lost a great deal of its traffic. We did not completely lose traffic, but only for items that were posted after February 2011. We have skilled content writers who do good research on the topics covered in our news section; I can honestly say we write our content for our visitors and not just for the search engines. We have investigated every part of our source code but we did not find anything there that was violating any guidelines. So my next guess was that maybe some incoming links could be harming our news section. Most of the backlinks we receive are directed to our news and article section. These links are generally put on sites which use our RSS feed. There is just one website that we think could be the reason. It had included our RSS feed on each page, which resulted in over 2,500,000 backlinks from a single domain which hosts a very poor-quality website. We never considered it to be harmful, so we never did anything about it. My question is whether this could be the reason for the drop in traffic. Kind regards, Jeroen from the Netherlands

    | jeroenpf
    0

  • My first website crawl indicated many issues. I corrected the issues, requested another crawl and received the results. After viewing the Excel file I have some questions.
    1. There are many pages with missing titles and meta descriptions in the Excel file. An example is http://www.terapvp.com/threads/help-us-decide-on-terapvp-com-logo.25/page-2. That page clearly has a meta description and title. It is a forum thread, and my forum software does a solid job of always providing those tags. Why would my crawl report not show this information? This occurs on numerous pages.
    2. I believe all my canonical URLs are properly set. My crawl report has 3k+ records, largely due to there being 10 records for many pages. These extra records are various sort orders and style differences for the same page, i.e. ?direction=asc. My need for a crawl report is to provide actionable data so I can easily make SEO improvements to my site where necessary. These extra records don't provide any benefit. If the crawl report determined there was not a clear canonical URL, then I could understand, but that is not the case. An example is http://www.terapvp.com/forums/news/ - if you look at the source you will clearly see the canonical tag. Where is the benefit in including the 10 other records in the crawl report which show this same page in various sort orders? Am I missing anything?
    3. My robots.txt appropriately blocks many pages that I do not wish to be crawled. What is the benefit of including these many pages in the crawl report?
    Perhaps I am over-analyzing this report. I have read many articles on SEO, but now that I have found SEOmoz, I can see I will need to "unlearn what I have learned". Many things, such as setting meta keyword tags, are clearly not helpful. I wish to focus my energy, and I was looking to the crawl report as my starting point. Either I am missing something, or the report design needs improvement.

    | RyanKent
    0

  • I added 2 comments on 2 questions and my points are still at the same level. Does anyone know why? I don't.

    | mosaicpro
    0

  • I've been trying to run several truck accessory affiliate websites for quite a while now. I've recently decided to combine all of my affiliate websites into a single community website. This way I'll be able to focus all my energy and link building on a single place and build up a single brand. My question is: how many websites do I try to redirect to the new website at a time? Do I need to spread this out, or is it OK if I move all of my content and websites at a single time? I have around 30 websites that I could move to this new domain. Thanks! Andy

    | daenterpri
    0

  • Hello SEOmoz, As a new member I first want to thank you guys for your service; SEOmoz is by far the best resource and toolbox I have ever found. I have a question, or more of a request, if you could advise me on what I am doing wrong.
    I have a website, www.letsflycheaper.com, with a Domain Authority of 80, and my target keywords are keywords like: cheap business class, business class flights.
    My target page is www.letsflycheaper.com/business-class.php. For all my keywords I am on page 2 and I have a real hard time getting onto the first page, but if I look at competitors like www.wholesale-flights.com, with a Domain Authority of 'just' 50, crappy backlinks and so on, they are all on the first page for almost all of the keywords that I want to target. What am I doing wrong? Can you maybe give me a couple of tips on what I should focus on more? Hopefully you guys can help me... Kind regards, Ramon van Meer

    | DennisForte
    0

  • I've read previous Q&A threads where people have been a bit dismissive of the ranking significance of exact-match domains, but my recent experience using the keyword competitiveness tool is that exact-match domains seem to outrank other sites regardless of domain or page authority or other on-site/on-page optimisation. I'm interested in other people's opinions and experiences.

    | bjalc2011
    0

  • I have a site, PricesPrices.com, where I'm steadily building inbound links and PageRank. I have about 4,600 pages on the site, most of which are baby products in the baby gear sector. There are many outdated items that aren't really my focus, but they do pop up in long-tail search queries from time to time. My question is a pretty basic one. Theoretically, if a site has, say, 28/100 link juice, then as you go deeper and deeper into the site, the link juice is divided more and more. My question: is this really true or just a concept? (A toy calculation is sketched below.) My thought is to hide many of the products that I don't really need to focus on, thereby passing more link juice to the products that remain, but I also don't want to do that if it won't necessarily make the remaining pages rank higher or have more link juice. I also have to keep in mind the merchandising aspect of the site and providing a good user experience. If I only have 300 products on the site, there will be a ton of unhappy people who can't find the products they are looking for. Any thoughts and/or pointers in the direction of funneling that PageRank down into my site would be much appreciated. Thanks!

    | modparent
    0
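
To make the "divided more and more" intuition concrete, here is a deliberately simplified toy calculation that assumes every page splits its equity equally across its outgoing links and applies a standard 0.85 damping factor. Real PageRank is computed iteratively over the whole link graph (and internal links flow back up as well), so treat this only as an illustration of why deep pages tend to receive a small share.

```python
# Toy illustration only: equity reaching a page at a given depth if every page
# splits its equity equally among its links. Not how Google actually computes
# PageRank, which is iterative over the full link graph.
DAMPING = 0.85

def equity_at_depth(home_equity: float, links_per_page: int, depth: int) -> float:
    equity = home_equity
    for _ in range(depth):
        equity = equity * DAMPING / links_per_page
    return equity

home = 28.0  # the "28/100" figure from the post, used as an arbitrary unit
for links in (50, 100):
    for depth in (1, 2, 3):
        value = equity_at_depth(home, links, depth)
        print(f"{links} links/page, depth {depth}: {value:.5f}")
```

Under these assumptions a depth-3 product page sees a tiny fraction of the homepage's equity, which is the reasoning behind flattening the architecture or pruning pages; whether hiding old products actually helps rankings is a separate question.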

  • Question about changing URLs from dynamic to static to improve SEO, but concerned about hurting the progress made so far.

    | h3counsel
    0

  • Hi, Mozzers. I have a client that has a bunch of pretty nice keyword-rich domain names. Their traffic and rankings are good. They provide legal services in the Chicago area. I have lots of good content that I could use to start a blog using a domain like keyword,keyword-blog.com. Good idea? Currently I have a resources area on their website, but I feel like this area could be getting a little bloated, and some news-related stuff isn't really appropriate. Two questions: Should I use one of the decent domains for a blog and build up the rankings and traffic, and link to the main site? Or is this lots of work for little payout? Both sites would be hosted in the cloud. Some of the domain names are related to their name, others are keyword- or geo-targeted. Would it be wise to set up 301 redirects going to their website? Pros/cons? If you need additional info, please PM me for details. Thank you, friends! LHC

    | lhc67
    0

  • Does anyone have a good local search "best practices" resource for advising a client who is changing business addresses (aside from crossing your fingers)? For example, the order for updating local citations (website first, Google Places, others), the time frame for updates to take effect, and other issues folks have faced in updating addresses. I regularly follow David Mihm, Mike Blumenthal, & Andrew Shotland; I was just curious what the Moz community might be able to add. Thanks.

    | Gyi
    0

  • Your feedback here is definitely appreciated, but I'm also doing a public study and would be honored and humbled if you answered the 5 questions in my survey as well. For those who do not wish to participate, I'd appreciate your general feedback on permalink structure best practices based on what Amazon.com and eBay.com have done to their URLs in recent times. Thanks!

    | stevewiideman
    0

  • I've recently added a campaign within the SEOmoz interface and received an alarming number of errors, ~9,000, on our eCommerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content / titles due to URLs like domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut bots off after the ".html?"? (A small sketch of one approach follows below.) Any help would be much appreciated 🙂

    | MonsterWeb28
    0
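
One commonly used pattern for faceted and sorted variants like these (offered as a general approach, not Magento-specific advice) is to detect the filter/sort parameters and serve those variants with a noindex,follow robots meta, leaving only the clean .html URL indexable; pointing their canonical at the clean URL is the other usual route. A small sketch, with the parameter names taken from the example URLs in the post and the function name purely illustrative:

```python
# Hypothetical sketch: flag faceted/sorted URL variants so they can be served
# with a noindex,follow robots meta, leaving the clean .html page indexable.
from urllib.parse import urlparse, parse_qs

FILTER_PARAMS = {"brand", "cat", "dir", "order", "price"}

def is_filtered_variant(url: str) -> bool:
    query = parse_qs(urlparse(url).query)
    return any(param in query for param in FILTER_PARAMS)

url = "http://domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1"
if is_filtered_variant(url):
    # Emit this in the <head> of the filtered/sorted variant.
    print('<meta name="robots" content="noindex, follow">')
```

A robots.txt Disallow on the parameters would also keep crawlers out, though crawlers then cannot see any canonical or meta tags on those URLs.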

  • Hi, Recently I've been trying to tackle an issue on one of my websites. I have a site with around 400 products and 550 pages total. I've been pruning some weaker pages and pages with shallow content, and it's been working really well. My current issue is this: there are about 20 store brands of 6 products on my site that each have their own page. They are identical products, just re-branded. Writing content for each of these pages has been difficult, as it's a fairly dry product too. So I have around 120 pages of dry content that is unique but not much different from one another. I want to consolidate but I am not sure how yet. Here is what I am thinking:
    1. 301: I pick one product page as the master, 301 all the other duplicate products to it and then make one page of great content that encompasses all of them. If the 301 juice gets diluted over time I might miss out on some long tails, but I could also gain a lot more from a great content page with 500+ words of really good content, as opposed to pages with 150-250 words of just so-so content.
    2. Canonical: similar to above. I pick a master page and canonical the other pages to it. Then I could use the great content on all the pages, and still have pages for the specific products. The pages might not show up in search engines but would still be searchable on my site.
    3. Coded solution: in my CMS I could always make a workaround where the products still appear on the brands page (just their name with a link to the product page) but all the links direct to a master page.
    I realize all the solutions are fairly similar, although I am not sure which is ideal. Option 3 is the most expensive/time-consuming, but it would drop my page total down to around 450 pages. For a while now (dating back to before Panda) I've been trying to get rid of the low-quality and outdated product pages so I could focus on the more popular and active pages. Dropping my page total would also help the SEO efforts, as the sheer volume of pages that need links right now is high, and obviously the fewer pages I have, the more time I can spend on each page (content and link building). So what do you think? Should I do any of the 3, a combination of the 3, or something different? Cheers, Vinnie

    | vforvinnie
    0

  • So today I was taking a look at the http://www.seomoz.org/top500 page and saw that the AddThis page is currently at position 19. I think the main reason for that is that their plugin creates, through JavaScript, linkbacks to the page where their share buttons reside. So any page with AddThis installed would easily have 4-5 linkbacks to their site, creating the huge number of linkbacks they have. OK, that pretty much shows that Google doesn't care if the link is created in the HTML (on the backend) or through JavaScript (frontend). But here's the catch. Suppose someone creates a free plugin for WordPress/Drupal or any other huge CMS platform out there with a feature that links back to the page of the creator of the plugin (that's pretty common, I know), but instead of inserting the link in the plugin source code they put it somewhere else, which is then loaded with JavaScript (exactly how AddThis works). This would allow the owner of the plugin to change the link shown at any time he wants. The main reason for that would be, I don't know, a URL update for his blog or business or something. However, that could easily be used to link to whatever the hell the owner of the plugin wants to. What are your thoughts about this? I think this could easily be classified as white or black hat depending on what the owners do. However, would Google think the same way about it?

    | bemcapaz
    0

  • Hi! I am not very familiar with the canonical tag. The thing is that we are getting traffic and links from affiliates. The affiliate links add something like this to the end of our URL: www.mydomain.com/category/product-page?afl=XXXXXX. At this moment we have almost 2,000 pages indexed with that code at the end of the URL, so they are all duplicated. My other concern is that I don't know if those affiliate links are giving us some link juice or not. I mean, if an original product page has 30 links and the affiliate copies have 15 more... are all those links being counted together by Google? Or are we losing all the juice from the affiliates? Can I fix all this with the canonical tag? (A small sketch follows below.) Thanks!

    | jorgediaz
    0
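
A canonical tag that drops the affiliate parameter is the usual fix for this kind of duplication: every ?afl= variant then points at the one clean URL, so whatever equity those variants do earn is consolidated there (whether and how much they pass is ultimately Google's call). A minimal sketch of building that canonical, assuming afl is the only tracking parameter, as in the example:

```python
# Hypothetical sketch: build a canonical URL that strips the affiliate
# tracking parameter, so every ?afl=XXXXXX variant points at one page.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TRACKING_PARAMS = {"afl"}

def canonical_url(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse((parts.scheme, parts.netloc, parts.path,
                       parts.params, urlencode(kept), ""))

url = "http://www.mydomain.com/category/product-page?afl=XXXXXX"
print(f'<link rel="canonical" href="{canonical_url(url)}">')
# -> <link rel="canonical" href="http://www.mydomain.com/category/product-page">
```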

  • On a blog, from an SEO perspective, how do you choose keywords to use in the "meta keywords tag" vs. "post tags"? Will it be different based on the search volume/competition of the keywords targeted?

    | saravanans
    0

  • I have seen several sites that put a div at the bottom of a page to hide content. If you click on the button, it will extend the page down, loaded with paragraphs of text rich with keywords. Does anyone know if this is viewed as a negative by Google?

    | netmkting
    0

  • I run a site that was a WordPress blog with eDirectory software for a directory on the back end. I've scrapped the eDirectory and built the entire site on WordPress. After the site change I'm seeing about 700 404 Not Found crawling errors, which appear to be old eDirectory pages that no longer exist. My understanding is that they'll cycle out eventually. What troubles me is the linking data I'm seeing. In the "Links to your site" area of Webmaster Tools, I'm seeing 4,430 links to the "About" page, another 2,900 to an obscure deleted directory listing page, and only 2,050 to the home page. I show 1,700 links to a terms-and-conditions PDF and other strange data. To summarize, I'm showing huge numbers of links to obscure pages. Any help would be greatly appreciated.

    | JSOC
    0

  • I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas. Here are a few details to give you some background. The site is a very nice-looking (2.0) website with good content. Basically they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place. I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues. The only thing I can come up with is that it is either something off-page related to links, or something related to the font descriptions; maybe they are getting copied and pasted from other sites and they don't look like unique content to Google. If anyone has ideas or would like more info to help, please send me a message. I greatly appreciate any feedback. Thank you, friends! LHC

    | lhc67
    0

  • I am reorganizing the data on my informational site into a drilldown menu. So, here's an example. On the home page are several different items. Let's say you clicked on "Back Problems". Then you would get a menu that says: Disc problems, Pain relief, Paralysis issues, See all back articles. Each of those pages will have a list of articles that fit. Some articles will appear on more than one page. Should I be worried about these pages being partial duplicates of each other? Should I use rel=canonical to make the root page for each section the one that is indexed? I'm thinking no, because I think it would be good to have all of these pages indexed. But then, that's why I'm asking!

    | MarieHaynes
    0

  • We have a client who has two sites for different countries, 1 US, 1 UK, and redirects visitors based on IP. In order to make sure that the English site is crawlable, we need to know where Googlebot crawls from. Is this a US IP or a UK IP for a UK site / server? (A sketch of one way to handle this is below.)

    | AxonnMedia
    0
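
Googlebot has historically crawled mostly from US IP addresses, which is exactly why IP-based country redirects can keep it from ever seeing the UK version. One commonly discussed way to handle this (a general pattern, not something from the post) is to exempt verified search-engine crawlers from the geo-redirect, where verification is a reverse-DNS lookup plus a forward-confirm rather than trusting the user-agent string. A small sketch with the standard library:

```python
# Hypothetical sketch: verify a visitor claiming to be Googlebot via
# reverse DNS + forward-confirm, so it can be exempted from IP-based
# country redirects instead of always being bounced to one country's site.
import socket

def is_verified_googlebot(ip: str) -> bool:
    try:
        hostname = socket.gethostbyaddr(ip)[0]       # reverse DNS
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip  # forward-confirm
    except OSError:
        return False

# If this returns True, serve the requested country's site without redirecting.
print(is_verified_googlebot("66.249.66.1"))
```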

  • Hi! OK, I am semi-new to SEOmoz but have been self-teaching for 3 years. However, I am stuck. I have been operating my e-commerce site from www.shopadornonline.com for the past 3 years. I just purchased www.shopadorn.com. Right now shopadorn.com redirects to www.shopadornonline.com, because all my products and links go to shopadornonline.com/productblahblahblah. I guess I am stuck; I'm not sure what to tell my web designer to do. Do I give up on having shopadorn.com, OR do I start redirecting customers and doing 301 redirects? I think, from what I have read, that it is bad to have traffic going to both shopadorn and shopadornonline as they compete for rankings? Where should I start?

    | Shopadorn
    0

  • Can anyone point me to resources that help website owners balance these two issues? Or how to SEO a site meant for disabled users, or how to make an SEO'd site more accessible? Thanks!

    | mjcarrjr
    0
