
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • We're renaming all of our product URLs (because we're changing eCommerce platforms), and I'm trying to determine the best strategy. Currently, they are all based on product SKUs. For example, Bacon Dental Floss is: http://www.stupid.com/fun/BFLS.html. Right now, I'm thinking of just using the product name, so Bacon Dental Floss would become: http://www.stupid.com/fun/bacon-dental-floss.html. Is this the best strategy for SEO? Any better ideas? Thanks!

    | JustinStupid
    0

  • Notice how Amazon has the reviews for the Kindle showing up right in the organic results: http://www.google.com/search?q=kindle&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a. Is this an example of a rich snippet? If you have an ecommerce store and lots of reviews, how do you go about getting the same thing? Is Google only going to let Amazon and other huge brands do this, or is it fair game for everyone?

    | iAnalyst.com
    0

  • I am working on a website that has a regularly updated WordPress blog, and I am unsure whether the category and tag pages should be indexable. The blog posts are often outranked by the tag and category pages, which are ultimately leaving me with a duplicate content issue. With this in mind, I assumed that the best thing to do would be to remove the tag and category pages from the index, but after speaking to someone else about the issue, I am no longer sure. I have tried researching online, but couldn't find anything that provided any further information. Can anyone with experience of dealing with issues like this, or with any knowledge of the topic, help me resolve this annoying issue? Any input will be greatly appreciated. Thanks, Paul

    | PaulRogers
    0

  • Fellows, we are deciding whether we should include our category description on all pages of the category listing: page 1, page 2, page 3, and so on. The category description is currently a few paragraphs of text that sits on page 1 of the category only at present. It also includes an image (linked to a larger version of it) with appropriate ALT text. Would we benefit from including this introductory text on the rest of the pages in the category, or should we leave it on the first page only? Would it send duplicate content signals? Ideas please! Thanks.

    | Peter264
    0

  • Hello, I've been reading all about Google's author biography tag, but I am not sure how I can use this in my business. Can anyone explain, in plain simple English, how I can leverage this tag? Are there any implications for SEO and higher rankings? Just trying to wrap my head around this concept and why it's important... or not. Thanks, Bill

    | wparlaman
    0

  • I am looking for a referral to a great US-based SEO company. I try to do as much as I can on my end, but the more I learn, the more I find I need help to do a great job. Lately, running the business takes most of my time, and we need a team who really specializes in SEO. Can anyone recommend an SEO company with a great reputation, someone who has done great work for you in the past? I get many people contacting me with SEO promises, but I need some advice from someone with more experience than me. I appreciate all of your insights.

    | fertilityhealth
    0

  • Our website is currently in DotNetNuke (DNN), and we are planning on moving it to a different platform, possibly Sitecore. How will switching CMS affect our SEO efforts? Thank you

    | Unidev
    0

  • Hi there. A while back I asked whether I should move my established WordPress blog over to my main website, http://www.gardenbeet.com. The overwhelming response was to move it. Two months later, I still have not seen any benefit, only problems. Maybe something is not quite right? Here is a list of issues; any insights would be welcome. I have installed the Yoast SEO plugin. 1. A blog category (garden art) ended up outranking my main website for the term "garden art", so I did a 301 on the garden art category. The main website has now regained its top-3 ranking (it was number 2 before the move). 2. I have removed the categories from the blog, as a 301 on the garden art category would not make sense to a user, and decided to use tags as a navigational tool instead. I figured that, because I have over 500 tags, no single tag could outrank my main website for any key term. So far, correct. But now I have warnings from SEOmoz: 500 missing meta descriptions (for the tags); do I really have to write over 500 of them? Plus over 160 long URLs and titles; when I started the blog I had no idea that URLs and headings were linked in WordPress (my developer does not think I should rename the URLs and use 301s, as I already have a tonne due to a site rebuild). Is it OK to leave the long URLs and fix the titles only? On WordPress I used to have between 400 and 900 users on my blog a day (using WordPress analytics); now it's only 200 (using GA). Yes, I now have increased links to my website, but I have seen no improvement in my SERPs. Will I see an improvement in my rankings? My WordPress site used to have a PageRank of 3.

    | GardenBeet
    0

  • Hello, my company sells used cars through a website. Each vehicle page contains photos and details of the unit, but once the vehicle is sold, all the content is replaced by a simple text like "this vehicle is no longer available". The title of the page also changes to a generic one, while the URL remains the same. I doubt this is the correct way of doing it, but I cannot work out what method would be better. The improvement I am considering for pages of no-longer-available vehicles is this: keep the page alive but with reduced vehicle details, a text like "this vehicle is no longer available", and automatic recommendations for similar items. What do you think? Is this good practice, or do you suggest anything different? Also, should I put a NOINDEX tag on the expired vehicle pages? Thank you in advance for your help.

    | Darioz
    0
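One common way to implement the "keep the page live for visitors, but drop it from the index" option described above is a robots meta tag. A minimal sketch (the exact wording and page structure are assumptions, not taken from the site in question):

```html
<!-- In the <head> of an expired vehicle page: the page stays live
     for visitors (showing similar-item recommendations), but engines
     are asked to drop it from the index while still following links -->
<meta name="robots" content="noindex, follow" />
```

Using `follow` rather than `nofollow` lets any remaining internal links (e.g. to the similar-vehicle recommendations) continue to be crawled.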

  • Just received my first crawl report from SEOmoz for my blog. I've received a number of warnings/errors about having too many outbound links on my pages. These are simply comments from people (some pages have 300+), and the links are nofollowed. It seems like you guys must have a reason why this warning is in place, so I would love your theories...

    | ViperChill
    0

  • Here's my problem -- which is actually a pretty good problem to have. My client is a speciality service provider in an extremely competitive field. It charges 3 to 5 times what others do for providing a super-premium level of service. It doesn't have -- nor does it want -- many customers. I can't go into details, but let's just say the business model is a bit like the charity or premium newsletter publishing model. It is extremely hard to recruit new members -- but once recruited, members tend to stay for a long time at high price points. Personal referral is key. As a result of my efforts over the last 90 days, the client's SEO results have skyrocketed. After a couple of false starts, we have focused on key terms the target demographic is likely to search, rather than the generic terms others in the industry use. We have also had great success with a social media strategy, since the few people likely to be interested in paying such high prices know like-minded folks. For the first time, my client is getting "walk-in" prospects. They are delighted! But they are not really walk-ins. They have already found the site, either through SERPs or Facebook or Twitter. Now we need to get to the next level. Here's the problem: the client's domain name sucks. It is short, but combines an acronym with one of the words in its long-version name. It uses the British spelling of the long name fragment, even though most Canadians now use American spelling. And it is a .ca, rather than a .com. So I think we have to bite the bullet and change to the long, .com version of the name, which is available and has the additional benefit of having a key search term embedded within it. I am basically an editorial/content guy and not a tech guy. The IT guys at my firm are strongly encouraging me to make the change... in very "colorful" language. We can certainly do 301 redirects at the page level. But I would like some additional validation before proceeding.
My questions are: how much link juice might we lose? I've seen the figure of 10% bandied around; is it accurate? Might we see a temporary dip in results? If so, how long would it last? What questions did I forget to ask? What additional info do you need to offer informed advice?

    | DanielFreedman
    0

  • Has anyone run into SEO issues with sites utilizing load-balancing systems? We're running into some other technical complications (with using third-party tracking services), but I'm concerned now that the setup could have a not-so-good impact from an SEO standpoint.

    | BMGSEO
    0

  • Hello, I run an e-commerce website. I just realized that Google has "pagination" pages in the index that should not be there. In fact, I have no idea how they got there. For example: www.mydomain.com/category-name.asp?page=3434532. There are hundreds of these pages in the index, and there are no links to them on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in the design, and I am working on fixing it (301ing nonexistent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal? Also, should I 301, 404 or 410 these pages? Any help would be appreciated. Thanks, Alex

    | AlexGop
    0
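For the 410 option, one possible .htaccess sketch (Apache mod_rewrite; the category filename is taken from the example URL above, and the rule as written would hit every `page=` query string, so in practice you would need a condition that excludes your real pagination range):

```apache
# Hypothetical sketch: answer 410 Gone for requests to the category
# page that carry a "page=" query string the site never generated.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)page= [NC]
RewriteRule ^category-name\.asp$ - [G]
```

A 410 tells Google the page is intentionally gone, which is typically dropped from the index faster than a 404; a 301 to the clean category URL is the usual alternative if you want to keep any link value.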

  • SEOmoz community: after twice changing directory software, I have a ton of 404 errors in Webmaster Tools (over 3,000). I've decided to do 301 redirects, but can't manually enter each 404 URL. How can you redirect pages from the same folder on a mass scale? For example, mysite.com/autos has hundreds of pages associated with it (/autos/ford, /autos/toyota, etc.). How can you do a 301 that redirects all those pages without manually entering each URL? The site is built on WordPress.

    | JSOC
    0
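A folder-wide pattern redirect is usually done with a single regex rule rather than one line per URL. A minimal .htaccess sketch (Apache mod_alias; the target path is a made-up example, not from the question):

```apache
# Hypothetical sketch: 301 everything under /autos/ with one rule.
# $1 carries the old slug through, so /autos/ford -> /vehicles/ford.
RedirectMatch 301 ^/autos/(.*)$ http://mysite.com/vehicles/$1
# Or, if the old pages have no one-to-one replacement, send them all
# to a single relevant landing page instead:
# RedirectMatch 301 ^/autos/(.*)$ http://mysite.com/vehicles/
```

On WordPress, a redirect plugin that accepts regex rules can achieve the same thing without editing .htaccess directly.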

  • How are you supposed to add a rel="canonical" tag to a page with a query string that has already been indexed? It's not like you're serving that page from a CMS where you have an original page with content to add to the head tag. For example... Original page: http://www.example.com/about/products.php. Query string page: http://www.example.com/about/products.php?src=FrontDoorBox. Would adding the rel="canonical" tag to the original page, referencing itself, be the solution, so that the next time the original page is crawled, the bot will know that the previously indexed URL with the query string should actually be the "original"? That's the only solution I can come up with, because there's no way to find the query-string-rendered page to tag with the canonical...

    | Yun
    0
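The self-referencing canonical described above would look like this (placed in the shared head of the page, using the example URL from the question):

```html
<!-- In the <head> of /about/products.php: a self-referencing canonical.
     Because ?src=FrontDoorBox is served by the same script, every
     query-string variant emits this same tag, consolidating them all
     to the clean URL. -->
<link rel="canonical" href="http://www.example.com/about/products.php" />
```

This works precisely because the query-string page is the same physical page: tagging the "original" automatically tags every variant of it.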

  • Before answering this, let me explain my goals. I know that Google made a change a couple of years ago that discounts the amount of PageRank passed to dofollow links when there is a nofollow link present on the page. My goal is to keep the most PageRank possible on my home page and pass a specified amount of PageRank to 7 out of the 10 pages linked to from my home page. I realize that making 3 of the outgoing links nofollow is not going to increase the PageRank being sent to the other 7 pages. My question is: will my home page be able to retain the PageRank that would have been used by the three nofollow links, or is that PageRank value just lost when I implement a nofollow?

    | MyNet
    0
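A toy sketch of the arithmetic behind the question above, based on the commonly cited description of the 2009 change (this is illustrative only: the numbers are made up and the real algorithm includes damping and iteration):

```python
# Sketch (NOT Google's actual algorithm): how much PageRank each
# followed link receives under the pre- and post-2009 treatment of
# nofollow. All figures here are illustrative assumptions.

def pr_per_followed_link(page_pr, followed, nofollowed, post_2009=True):
    """PageRank passed to EACH followed link on a page."""
    if post_2009:
        # nofollow links still consume their share of the split;
        # that share simply evaporates rather than being retained
        # by the page or redistributed to followed links
        return page_pr / (followed + nofollowed)
    # pre-2009: PageRank was divided among followed links only
    return page_pr / followed

# 10 outbound links, 3 nofollowed, 10 units of PR available to pass
old = pr_per_followed_link(10, followed=7, nofollowed=3, post_2009=False)
new = pr_per_followed_link(10, followed=7, nofollowed=3, post_2009=True)
print(old)  # ~1.43 units per followed link (10/7)
print(new)  # 1.0 unit per followed link; the other 3 units are lost
```

If this description is right, the answer to the question is "no": the home page does not retain the three nofollowed shares; they are simply discarded.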

  • Hi. My baseball blog (mopupduty.com) shows up as www.mopupduty.com in Google Webmaster Tools. This is an issue for me, as my WordPress sitemap plug-in will only publish the sitemap at http://mopupduty.com/sitemap.xml, not the www. version. Is there any way of changing the www. in Webmaster Tools without deleting my existing index? The website currently has sitelinks in search results, and I'm not too keen on giving them up via deletion. Thanks

    | mkoster
    0
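Whichever hostname is chosen as canonical (www or bare), the standard companion fix is a server-side 301 so only one version gets crawled. A minimal .htaccess sketch (Apache mod_rewrite, redirecting bare to www here; swap the two hostnames to go the other way):

```apache
# Hypothetical sketch: force a single canonical hostname via 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mopupduty\.com$ [NC]
RewriteRule ^(.*)$ http://www.mopupduty.com/$1 [R=301,L]
```

With the redirect in place, Webmaster Tools' preferred-domain setting and the sitemap URL can be aligned to the same host without deleting anything from the index.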

  • We're looking for a recommendation for a very good SEO agency that has experience with link building (white hat only). Any suggestions?

    | BruceMillard
    0

  • I searched the current questions and found similar questions but nothing as specific as what I wanted, so... We are a graphic design school in Melbourne Australia. We have a website - www.graphicschool.com.au - that ranks fairly well in Google for our particular search terms. We have rebranded the organisation and want to change the website domain name to the new branding - www.grenadi.vic.edu.au - but obviously do not want to lose our hard earned SEO and rankings. We only have two strategies so far, and are not sure what the pros and cons to either strategy are, or whether there is a better way. The two ideas we have are: Option 1) Just swap the domain name. We were thinking about swapping the domain name and setting up 301 redirects to tell Google that the old page that ranked well is now this page 'x' on the new site. I've read that you lose all your valuable links doing this because they are domain specific and the 301 doesn't forward your links. Option 2) Build a second website. This idea is that we would build a second website with our new domain name and branding and build up that site until it ranks as highly as the first site and then start to remove the first site. We're planning on completely redeveloping the current website anyway and changing and adding lots more content as well so this option is not out of the question. Any help, thoughts, suggestions or further options would be greatly appreciated. Feel free to discuss. Can I also please suggest that a new category is added under 'The SEO Process' - something like 'rebranding / migration' Cheers, Anthony

    | Grenadi
    0

  • I have searched and searched for the answer to this question and can't find it. We are going to be launching a WordPress blog on our domain shortly; however, we have a much larger site, mixed static and dynamic pages full of custom programming tied to databases, etc., that we are running alongside the blog and can't integrate into WordPress due to its complexity. What this means is we have to install WordPress on our servers somehow separately from the pages of our website. What I am wondering is: if we run WordPress in the /blog directory of our site as a separate installation, will it inherit the domain authority of our domain (currently around 60), or will it be viewed as a separate site and get no ranking? Also, will our main site inherit the additional link juice from the inbound links that we get from the blog, with it being separate from the main site? How does this need to be set up on our web servers to ensure the blog gets the authority of the domain, and the blog contributes maximum SEO value to the domain? Any help would be appreciated.

    | CodyWheeler
    0

  • Ok, I have a quick question about these; I keep seeing them. There has been talk of Google showing dual inline sitelinks (the extra links it shows under the number 1 result). It used to show 8 links under many number 1 results. Then it was reported it was showing 2. Now it's showing 3... for example, for Compare Store Prices, Compare the Market and PriceRunner (for a search on "compare"). How do I get these, or go about getting started with being able to attain them?

    | TomBarker82
    0

  • Our site's IP address is being indexed in addition to the canonical www.example.com domain. As soon as it was flagged a 301 was implemented in the .htaccess file to redirect the IP address to the canonical. Does this usually occur? Is it detrimental to SEO? In my time in SEO I've never heard of this being an issue, or being part of a list of things to be checked. It sounds more like a server that wasn't configured correctly when hosting was set up? It didn't seem to be affecting the site at all, but is it more common and I've just never heard of it? 😛 Should it be something I'm usually looking for in future? Responses are greatly appreciated!

    | mikeimrie
    0

  • So, AddThis just added a cool feature that attempts to track when people share URLs by cutting and pasting the address from the browser. It appears to do so by adding a URL fragment on the end of the URL, hoping that the person sharing will cut and paste the entire thing. That seems like a reasonable assumption to me. Unless I misunderstand, it seems like it will add a fragment to every URL (since it's trying to track all of 'em). Probably not a huge issue for the search engines when they crawl, as they'll, hopefully, discard the fragment, or discard the JS that appends the fragment. But what about backlinks? Natural backlinks that someone might post to, say, their blog, by doing exactly what AddThis is attempting to track: cutting and pasting the link. What are people's thoughts on what will happen when this occurs, and the search engines crawl that link, fragment included?

    | BedeFahey
    0

  • We're having a website designed for a car dealership and are deciding what attributes to use in the URL. Should I include the city name in the URL? Does that help for SEO purposes? Other ideas of what to research or try are appreciated too. Thanks 🙂

    | kylesuss
    0

  • We are launching a new site for the Australian market, and the URL will just be siteAU.com. Currently the tech team (before we came on board) has it set up with almost exactly the same content (including the site CSS/nav/structure etc.). Some product page content is slightly different, and category pages have different product orders, plus there are location pages that are specific to AU, but otherwise it's the same. The original site, site.ca, has been around for 6+ years, with several thousand pages and solid organic rankings (though they have dropped over the last few months). Will the new AU site create issues for the original domain? We also have siteUSA.com, which follows the same logic and has been live for a while.

    | BMGSEO
    0

  • Hey Everyone - I work for a company that is just getting into SEO.  We have had some successes, but one project lately has got us stumped.  We have been working hard, but have been unable to make an impact in Google rankings with the following site: http://stoneycreekinn.com/locations/index.cfm/DesMoines We are trying to optimize for the keyword phrase, "des moines hotel" This hotel is a branch location of a hotel chain in the Midwest. *Note we've already moved up some other branch locations for this hotel chain successfully. We've used several tools including the SEOmoz tool and seem to have higher marks than those sites that rank above us in Google surprisingly. Any idea what we're missing? Thanks!

    | markhope
    0

  • I'm looking to create a page about Widgets and all of the more specific names for Widgets we sell: ABC Brand Widgets, XYZ Brand Widgets, Big Widgets, Small Widgets, Green Widgets, Blue Widgets, etc. I'd like my Widget page to give a brief explanation about each kind of Widget, with a link deeper into my site that gives more detail and allows you to purchase. The problem is I have a lot of Widgets, and this could get messy: ABC Green Widgets, Small XYZ Widgets, many combinations. I can see my Widget page teetering on being a link farm if I start throwing in all of these combos. So where should I stop? How much do I do? I've read that more than 100 links on a page can be considered a link farm; is that a hard number or a general guideline?

    | rball1
    0

  • I just finished watching a documentary on Ray Kurzweil, Transcendent Man, and began to familiarize myself with his book, The Singularity Is Near. During Ray's explanation of prediction through data gathering and extrapolation, he predicts that by 2029 AI and humans will have merged. We can debate this at another time, but I was wondering whether any such statistics/data are used by SEO professionals to predict where SEO may be going in the next 12-48 months. It has been my experience that SEO is very reactionary, and very few put their neck out on a limb to predict where it is going. I was just hoping that some of you might share your thoughts on what you are focusing on and where you are steering your clients in order to be ahead of the curve without hurting current placement. Anyone care to share?

    | dignan99
    0

  • My site has about 50 pages, all of them unique 500-700 word articles. Almost every page ranks in the 4-8 position for its keyword in Google/Yahoo/Bing. I can add a lot of related unique pages to my site, with about 100-200 words of content per page. They will all be unique, with unique descriptions and titles, and I can make about 1,000+ pages. Would you suggest I do this? Will this action boost my SERP positions? Do more pages mean better SERP positions?

    | nycdwellers
    1

  • So, I have a website where a few pages are in directories and the rest are normal pages with extensions (i.e. example.com/blah.html instead of example.com/blah/). Now, the directory page isn't ranking yet for a target keyword (although I am still in the process of link building to the page with anchor text); could it be because it is the odd man out, being one of the only pages within a directory? Also, I would really like to move all my pages into directories, but some of the internal pages are ranking really well, and I do not want to lose that after switching. Has anyone had experience with using 301s to redirect to subdirectories without losing rankings?

    | Aftermath_SEO
    0

  • I posted this on my blog and wanted to get everyone's opinion on this (http://palatnikfactor.com/2011/06/07/seo-correlation-between-code-and-search-engine-rankings/). I'm always looking to see what top-ranking websites may be doing to get the rankings they do. One of the tasks of any SEO, I guess, is to really analyze competitors, right? I want to really stress that what I am writing here is completely opinion-based; I have not (due to time) validated this correlation enough, but would like to get the discussion started. Nevertheless, I did enough research to see that there may be a correlation between code validation and top-ranking websites, at least for certain queries where the number of real big players/brands is limited or non-existent. So, what do I mean? http://validator.w3.org/ validates the code on websites. This tool shows you errors and warnings that may be making it harder for search engines to crawl your website. Looking at top competitors in certain niches, I was surprised to find that top sites had very few errors compared to sites ranking on page 2 and beyond. That's not to say that all the sites on the first page had fewer errors (cleaner code) than websites on the 2nd page plus. However, the top-ranking websites for the keywords I was looking at had cleaner code, which may have a correlation with organic rankings. What's your take? Does this have any effect on SEO?

    | PaulDylan
    0

  • I realized the only way to get massive amounts of links to websites that do not actively blog is to create a free infographic or free software tool that anyone can use on their website. Here is my question: once we create the tool or infographic, what is the best way to spread it like wildfire? Contact people on Twitter? Mail popular blog owners and tell them about it directly? Comment on blogs and forums and post a link? Is there anything else I can do to push these infographics out? Also, I think custom software like jQuery calculators and tools is better than infographics. What do you think?

    | DanHenry
    0

  • We have thousands of pages on our website (news articles, forum topics, download pages, etc.), and at present they all reside in the root of the domain /. For example: /aosta-valley-i6816.html
    /flight-sim-concorde-d1101.html
    /what-is-best-addon-t3360.html We are considering moving over to a new URL system where we use directories. For example, the above URLs would become: /images/aosta-valley-i6816.html
    /downloads/flight-sim-concorde-d1101.html
    /forums/what-is-best-addon-t3360.html Would we gain any SEO benefit from using directories? Could our current system perhaps mean too many files in the root /, flagging as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory: /images/aosta-valley/6816/
    /downloads/flight-sim-concorde/1101/
    /forums/what-is-best-addon/3360/ If so, what would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/? Just looking for some clarity on our problem! Thank you for your help guys!

    | Peter264
    0
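If a move like the one described above goes ahead, the old flat URLs can usually be mapped to the new directory scheme with a handful of pattern redirects, keyed off the -i/-d/-t suffix convention in the example URLs. A hypothetical .htaccess sketch (Apache mod_alias):

```apache
# Hypothetical sketch: 301 the flat root URLs into directory URLs,
# preserving the slug and numeric ID from the old filename.
# /aosta-valley-i6816.html -> /images/aosta-valley/6816/
RedirectMatch 301 ^/(.+)-i(\d+)\.html$ /images/$1/$2/
RedirectMatch 301 ^/(.+)-d(\d+)\.html$ /downloads/$1/$2/
RedirectMatch 301 ^/(.+)-t(\d+)\.html$ /forums/$1/$2/
```

Three rules cover thousands of pages, so no per-URL redirect list is needed.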

  • Hi, I have two sites; let's call them Site A and Site B. Both are subdomains of the same root domain. Because of a server config error, both got indexed by Google, and Google reports millions of inbound links from Site B to Site A. I want to get rid of Site B, because it's duplicate content. First I tried removing the site in Webmaster Tools and blocking all content in the robots.txt for Site B. This removed all content from the search results, but the links from Site B to Site A stayed in place, and increased (even after 2 months). I also tried changing all the pages on Site B to 404 pages, but this did not work either. I then removed the blocks, cleaned up the robots.txt and changed the server config on Site B so that everything redirects (301) to a landing page for Site B. But the links in Webmaster Tools to Site A from Site B are still increasing. What do you think is the best way to delete a site from Google, and to delete all the links it had to other sites, so that there is NO history of this site? It seems that when you block it with robots.txt, the links and juice do not disappear; only the "blocked by robots.txt" count in WMT increases. Any suggestions?

    | JacoRoux
    0

  • What is the best practice for having multiple locations in Google Places? Does having a Google Places listing set up for each location have a big effect on local rankings in the individual areas? Should I list the home page as the website on each listing, or is it better to have a specific landing page for each Google Places listing? Any other tips? Thanks, Daniel

    | iSenseWebSolutions
    0

  • I've seen some good threads developed on this topic in the Q&A archives, but feel this topic deserves a fresh perspective, as many of the discussions are almost 4 years old. My Webmaster Tools preferred domain setting is currently non-www. I didn't set the preferred domain this way; it was like this when I first started using WM Tools. However, I have built the majority of my links with the www, which I've always viewed as part of the web address. When I put my site into an SEOmoz campaign, it recognized the www version as a subdomain, which I thought was strange, but now I realize it's due to the www vs. non-www preferred domain distinction. A look at site:mysite.com shows that Google is indexing both the www and non-www versions of the site. My site appears healthy in terms of traffic, but my sense is that a few technical SEO items are holding me back from a breakthrough. QUESTION to the SEOmoz community: What the hell should I do? Change the preferred domain setting? 301 redirect from the non-www domain to the www domain? Google suggests this: "Once you've set your preferred domain, you may want to use a 301 redirect to redirect traffic from your non-preferred domain, so that other search engines and visitors know which version you prefer." Any insight would be greatly appreciated.

    | JSOC
    1

  • The company I work for had an old SEO company that did a lot of reciprocal links with websites that are not what we want to be associated with. Does anyone know of a tool that might be able to tell us if there are still reciprocal links to our site? I want to try and find them, but the old pages we had with the outgoing links have been deleted.

    | b2bcfo
    0

  • Hello, we are designing a very large site with hundreds of landing pages that will need to get some of the PageRank and trust our homepage has, so we are trying to make sure our navigation architecture is set up correctly from the beginning. I'm curious, though, whether I need left-side CSS dropdown navigation (I know, no JavaScript) like www.adventurebound.com, or if we can just use a top-style dropdown like www.adventurefinder.com has. I know straight HTML links would be best, but unfortunately our site will be too large and complex for that. Thanks in advance!

    | iAnalyst.com
    0

  • One of my clients, who manages their own server and website, recently moved servers, which broke their custom 404 page. Instead of fixing this or putting the site back on the old server, they redirected the 404s to the home page. I've been working for a month or two on getting their 404s appropriately handled and their old URLs redirected using 301s. I read the HTTP status codes best practices, but it just discusses usability. What technical SEO backlash can happen?

    | triveraseo
    0

  • I have a site that has been indexed in Google since 2002. Back then, I secured all of the highly recommended links of the time, like DMOZ and the Yahoo Directory, and got just a couple of very high PR links from highly relevant sites. That was enough to get us top listings on our best "niche" keywords and many long-tail searches. Once we got to that point, we got lazy and have just relied upon our original links and any natural links that came our way. We also have a very highly detailed AdWords campaign in which we bid on almost any keyword that has ever resulted in an organic conversion. A few months ago, I decided to kick our SEO efforts up a notch and hired a company to do an aggressive link-building campaign and target some very high search volume terms that we had previously given up on. The campaign has been very successful in getting high rankings for several targeted terms. However, I am seeing zero impact on our site traffic or sales. I am beginning to wonder if Google's algorithms are so efficient that all of this extra SEO work is to no avail. Is there a point of diminishing returns where it is not productive to optimize a site's organic listings any further? Between our AdWords campaign, our already pretty good organic results, and Google's ability to divine a searcher's intent and lead them to the most relevant results, how do you decide when there is little benefit to further optimization? It is an important question for me, because I have been considering putting a lot of work into adding content to our ecommerce site, and I would hate to do all that work for nothing.

    | mhkatz
    0

  • I don't mean to be calling any site out; I'm just scratching my head on this one. I can't see any signal that would make this page worthy of ranking on the first page for the keyword "loose diamonds": http://www.jewelryexchange.com/DiamondResults.aspx. Am I missing something? All tools seem to say the site isn't worthy. All help appreciated.

    | DavidKonigsberg
    0

  • Can someone recommend an excellent SEO who can perform a full site audit of my fairly large WordPress site? The site receives about 14,000 visits per month, but traffic is waning one month after a recent change. I need analysis of some funky stuff in my Webmaster Tools and an overall site review.

    | JSOC
    0

  • Hello... I just started with a new client this week. Before working with us, their last domain/hosting/web-dev provider cancelled their account, took down the entire site, left them with a nice "under construction" page (NOT), and added the noindex, nofollow tags. Four weeks after that, we came onto the scene, and of course our client expects us to get the site back into the index, at least for branded terms, and he wants it done in a matter of hours... I tried my best to explain that that's not possible and that we are doing everything we can. Now I ask you guys: I already created the GWT account, created a well-structured sitemap and submitted it to Google and Bing, and did the on-page optimization, at least the basics. Is there a way to speed up the process? Kind of like, "hey you! Googlebot, forget about the noindex nonsense and come crawl again"? Any help would be great. Daniel

    | daniel.alvarez
    0
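Once the replacement pages are live, it's worth confirming that the old provider's noindex tag is actually gone from the page source before asking Google to recrawl. A minimal sketch in Python (the `has_noindex` helper and the HTML snippets are illustrative, and the regex assumes the `name` attribute appears before `content` in the tag):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page source carries a robots meta tag
    whose content includes 'noindex'.
    Assumes name="robots" appears before the content attribute."""
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return bool(meta) and "noindex" in meta.group(1).lower()

# Page still blocked by the old provider's tag:
print(has_noindex('<meta name="robots" content="noindex, nofollow">'))  # True

# Page after the tag has been removed or reset:
print(has_noindex('<meta name="robots" content="index, follow">'))      # False
```

Running this against the homepage and a few branded-term landing pages gives a quick yes/no before resubmitting the sitemap.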

  • I have a wordpress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content or am I hurting the site by taking eligible pages out of the index?

    | JSOC
    0

  • I have a client who wants to forward their website traffic to a campaign on Facebook for two weeks. I think it's a horrible idea on so many levels, but I need a solid reason why. My gut says that their Google rankings will suffer, but I can't find any research/articles that state such. Help?

    | Axis41
    0
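If the client insists despite the risks, a temporary 302 redirect (rather than a permanent 301) at least signals to search engines that the move is short-lived and the site's own URLs should stay indexed. A sketch in Apache .htaccess (assuming an Apache host; the Facebook URL is a placeholder):

```apache
# Temporarily send all visitors to the Facebook campaign page.
# 302 = temporary: search engines should keep the original URLs indexed.
RewriteEngine On
RewriteRule ^(.*)$ https://www.facebook.com/yourcampaignpage [R=302,L]
```

The rule would need to be removed after the two weeks; a 301 here would tell search engines the move is permanent, which is exactly the outcome to avoid.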

  • We recently took the decision to consolidate three domains: .com.au, .eu and .us. This decision was made before I arrived here, and I'm not sure it's the right call. The proposal is to use a brand-new .co domain (the .com isn't available). The main reason is to try to build domain strength toward one domain instead of trying to grow three domains.  We re-sell stock similar to hotel rooms (different industry) and our site is heavily search-based, so duplicate content is an issue that we hope to improve on with this approach. One driver was that we found, for example, our Australian site outranking our European site in European searches. We don't want to hold certain inventory only on certain sites either, because this doesn't work with our business rules. Anyway, if we are to go about this, what would be the best practice? Should we suddenly just close one of the domains and do a wildcard 301 redirect, or should we redirect each page individually? Someone has proposed using robots.txt for a phased approach, but to my knowledge this isn't possible with robots.txt, though a phased, page-by-page 301 using .htaccess may be possible? In terms of SEO, is one domain generally better than three? Is this a good strategy? What's the best 301 approach? Any other advice? Thanks J

    | Solas
    0
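On the mechanics: page-level 301s and a wildcard catch-all can coexist in the same .htaccess, with the specific rules listed first, which supports a phased migration. A sketch assuming Apache with mod_rewrite (example.com.au and example.co are placeholders for the real domains, and the URL mapping is illustrative):

```apache
RewriteEngine On

# Phase 1: map the most important old URLs to their closest new equivalents
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com\.au$ [NC]
RewriteRule ^rooms/sydney$ https://www.example.co/au/rooms/sydney [R=301,L]

# Phase 2: catch everything not individually mapped with a wildcard 301
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com\.au$ [NC]
RewriteRule ^(.*)$ https://www.example.co/$1 [R=301,L]
```

Because rules are evaluated top to bottom and `[L]` stops processing, individual page mappings can be added over time while the wildcard handles the remainder.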

  • We're currently working on an SEO project for http://www.gear-zone.co.uk/. After a crawl of their site, tons of duplicate content issues came up. We think this is largely down to the use of their brand filtering system, which works like this: by clicking on a brand, the site generates a URL with the brand keywords in it. For example: http://www.gear-zone.co.uk/3-season-synthetic-cid77.html filtered by the brand Mammut becomes: http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html?filter_brand=48 This was done by a previous SEO agency in order to prevent duplicate content. We suspect that this has made the issue worse, though, as removing the dynamic string from the end of the URL displays the same content as the unfiltered page. For example, http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html shows the same content as: http://www.gear-zone.co.uk/3-season-synthetic-cid77.html Now, if we're right in thinking that Google is unlikely to crawl the dynamic filter, this would seem to be the root of the duplicate issue. If this is the case, would rewriting the dynamic URLs to static on the server side be the best fix? It's a Windows Server/asp site. I hope that's clear! It's a pretty tricky issue and it would be good to know your thoughts. Thanks!

    | neooptic
    0
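Since the site runs on Windows/ASP, the server-side rewrite could be sketched with the IIS URL Rewrite module: the static-looking brand URL is internally rewritten to the dynamic filtered URL, so the filter string never needs to appear in links. A hypothetical web.config fragment (the rule name and the hard-coded mapping are illustrative; in practice the brand name and filter ID would be captured generically rather than listed per brand):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Serve the brand-filtered listing for the static brand URL -->
      <rule name="MammutBrandFilter" stopProcessing="true">
        <match url="^3-season-synthetic-Mammut-cid77\.html$" />
        <action type="Rewrite" url="3-season-synthetic-cid77.html?filter_brand=48" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Pairing a rewrite like this with a rel="canonical" tag on the filtered pages would also mop up any filter URLs search engines have already crawled.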

  • Are article submissions still relevant after the Panda update? Many of these sites (e.g. EzineArticles) are still suffering from the Panda update.

    | qlkasdjfw
    0

  • OK, I'm looking to try to generate traffic for people looking for accommodation.  I'm a big believer in the quality of the domain being used for SEO, both in terms of the direct benefit of it having keywords in it and the effect a good domain can have on CTR. So I'm considering these options: 1) Build a single site using the best, broad keyword-rich domain I can get within my budget.  This might be something like CheapestHotelsOnline.com Advantages: just one site to manage/design; one site to SEO/market; better potential to resell the site for a few million bucks. 2) Build 5 sites, each catering to a different region, using 5 matching domains within my budget.  These might be domains like CheapHotelsEurope.com, CheapHotelsAsia.com etc. Advantages: can use domains that are many times 'better' by adding a geo-qualifier, which should help with CTR and search; can be more targeted with SEO & marketing. So hopefully you see the point.  Is it worth the dilution of SEO & marketing activities to get the better domain names? I'm chasing the longtail searches whatever I do, so I'll be creating 5K+ pages, each targeting a specific area.  These would be pages like CheapestHotelsOnline.com/Europe/France/Paris or CheapHotelsEurope.com/France/Paris to target search terms for hotels in Paris. So with that thought, is SEO even diluted at all? Say a link to the homepage of the first option would end up passing 1/5000th of its value through to the Paris page, while a link to the second option would pass 1/1000th of the link juice through to the Paris page.  So by that logic, one only needs to do 1/5th of the link-building work for each of the 5 sites... which implies the total SEO work would be the same? Thanks as always for any help! David

    | OzDave
    0
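The dilution arithmetic at the end of the question can be sketched with a toy model (this deliberately ignores PageRank damping, internal linking structure, and domain authority; `juice_per_page` is a made-up helper, not a real metric):

```python
def juice_per_page(homepage_link_value: float, pages: int) -> float:
    """Toy model: link value flowing into the homepage is split
    evenly across all deep pages on the site."""
    return homepage_link_value / pages

# Option 1: one site with 5000 pages earns 100 units of link value
single_site = juice_per_page(100.0, 5000)

# Option 2: five regional sites of 1000 pages each; the same total
# effort split evenly gives each site 20 units
regional_site = juice_per_page(100.0 / 5, 1000)

print(single_site, regional_site)  # 0.02 0.02 -- identical per-page value
```

Under this naive model the per-page value comes out the same either way, which matches the post's intuition that the total SEO work is unchanged; the real difference would come from the factors the model leaves out.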

  • I love this SEO tactic, but it seems hard to get people to adopt it. Has anyone seen a successful badge campaign for a B2B site? Please provide examples if you can.

    | DavidKonigsberg
    0
