
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • What will I need to do to get amormensagens.com.br to position 1 in Google for the word "mensagens"? Will anchor text alone be enough?

    | tibtos
    0

  • Howdy 🙂 My client has a .com site they are looking at hosting via Akamai. They have offices in various locations, e.g. UK, US, AU, RU and some Asian countries. If they used Akamai, would the best approach be to set up separate sites per country: .co.uk, .com, .com.au, .ru, .sg, etc.? Although my understanding is that Googlebot crawls from the US, so if it crawled any of those sites it would always get a US IP address? So is the answer perhaps to go with Akamai for the .com only, which should target the US market, and use different / separate C-class hosts for the others? Thanks! Woj

    | wojkwasi
    0

  • Hello Mozers, We're currently undergoing quite a large infrastructure change to our website and I wanted to hear your thoughts on the kind of things we should be careful of. We currently have close to 4,000 individual products, each with their own page. The SEO work is then driven behind certain pages which house a catalog display of groups of products. The groups are done by style. An example: we have a page called "Style A" which displays 8 different colours of Style A. We then SEO the Style A page, and the individual items receive minimal SEO work. The change would involve having one individual product page for each style, but on that page the user would have the ability to purchase the different colours/variations via menus. This will result in approximately a 70% reduction in the size of our site (as several products will no longer be published). The things we are currently concerned with are:
    1. The loss of equity to those unwanted 'Style A' pages - I think a series of carefully planned 301s will be the solution (see the sketch below).
    2. Possible loss of long-tail traffic to the individual products, which might not be caught by one individual page per style.
    3. Internal link structure will need to be monitored to make sure we're still highlighting the most important pages.
    Sorry for the long post, it's a difficult change to explain without revealing the client's name - any other things we should be thinking about would be greatly appreciated! Thanks Nigel
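
    A minimal .htaccess sketch for point 1, assuming Apache and hypothetical paths; each retired colour-variant URL points at the consolidated style page it now lives on:

        # Sketch only (Apache mod_alias, hypothetical paths): send each retired
        # colour-variant URL to the consolidated style page.
        Redirect 301 /products/style-a-red  /products/style-a
        Redirect 301 /products/style-a-blue /products/style-a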

    | NigelJ
    0

  • We work with a number of pharmaceutical sites that, under FDA regulation, must include an "Important Safety Information" (ISI) content block on each page of the site. In many cases this duplicate content is not only provided on a specific ISI page, it is quite often longer than what would be considered the primary content of the page. At first blush a rel=canonical tag might appear to be a solution to signal search engines that there is a specific page for the ISI content and avoid being penalized, but the pages also contain original content that should be indexed, as it has user benefit beyond the information contained within the ISI. Is anyone else running into this challenge with regulated duplicate boilerplate, and have you developed a workaround for handling duplicate content at the paragraph level rather than the page level? One clever suggestion was to serve it as a graphic; however, for a pharma site this would be a huge graphic.

    | BlooFusion38
    0

  • How do I set up a 301 redirect if the default settings for our web servers create multiple URLs for the same page, even though the server treats them all as a single page?
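
    One hedged .htaccess sketch, assuming Apache and that the duplicates are default-document variants (e.g. /index.html vs. /); adjust the pattern to whatever duplicate URLs your server actually produces:

        # Sketch: 301 any direct request for .../index.html to the folder URL,
        # so only one version of each page gets crawled and linked to.
        RewriteEngine On
        # Only act when the client literally asked for index.html; this avoids
        # loops with the server's own DirectoryIndex sub-requests.
        RewriteCond %{THE_REQUEST} /index\.html [NC]
        RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]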

    | ibex
    0

  • Recently I switched servers and was very happy about the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot" (see sample below). Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to shut down the SEOmoz service if I can't resolve this immediately. I guess my question is: would there be a possibility to make sure this doesn't happen, or that the site doesn't time out like that because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it. Here are the error lines that caused it:
    216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
    216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
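
    If a stop-gap is needed while this gets investigated, here is a hedged .htaccess sketch (Apache) that refuses rogerbot requests outright; the crawler also reads robots.txt, so a Crawl-delay there may be the gentler option if it honors it:

        # Sketch: temporarily return 403 to any request whose user agent
        # contains "rogerbot".
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
        RewriteRule .* - [F,L]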

    | Jury
    0

  • Are there any specific guidelines that should be followed for setting up a mobile site to ensure it isn't counted as duplicate content?

    | nicole.healthline
    0

  • Could profanity such as swear words or cursing decrease the trust Google has in a website?  For example, abusive comments... etc?  I'm not talking about the SafeSearch filter, I'm talking about the overall trust of the website. Your thoughts and suggestions are greatly appreciated.

    | Peter264
    0

  • Regarding traffic from Twitter, I want to track this URL - http://www.ultraseo.com/white-hat-vs-black-hat/ - and generated the following URL using the URL builder: http://www.ultraseo.com/white-hat-vs-black-hat/?utm_source=Twitter&utm_medium=Social%2Bmedium&utm_campaign=seo Should I now pass it through a URL shortener (bit.ly or Google's own) and tweet it? Is this what I should do? Please reply... and where will I see the report in GA (under which heading)?

    | seoug_2005
    0

  • It's the same old story, we all know it well. I have a client that has a site with 20k+ pages (not too big) and traffic levels around 450k/month. We have identified 15 pages with various conversion points/great backlink metrics etc. that we are going to explicitly target in the first round of recs. However, we are looking at about 18,000 duplicate title tags that I'd like to clean up. The site is not on a CMS, and in the past I've had the dev team write a script to adopt the h1 tag or the name of the page etc. as the title tag. This can cause a problem when some of these pages that are being found in long-tail search lose their positions. I'm more hesitant than ever to make this move with this current client because they get a ton of long-tail traffic spread over a ton of original content they wrote. How does everyone else usually handle this? Thoughts? Thanks in advance Mozzers!

    | MikeCoughlin
    0

  • Hi everyone, for a few years now I've allowed school clients to pipe their news RSS feed to their public accounts on my site. The result is a daily display of the most recent news happening on their campuses that my site visitors can browse. We don't republish the entire news item; just the headline and the first 150 characters of the article, along with a "Read more" link for folks to click if they want the full story over on the school's site. Each item has its own permanent URL on my site. I'm wondering if this is a wise practice. Does this fall into the territory of duplicate content even though we're essentially providing a teaser for the school? What do you think?

    | peterdbaron
    0

  • I have used OSE to look at the links of a competitor's site and noticed they have dozens of links from Google Notebook pages, e.g. http://www.google.pl/notebook/public/05275990022886032509/BDQExDQoQs8r3ls4j This page has a PA of 48. Is this a legitimate linking strategy?

    | seanmccauley
    0

  • With our e-commerce store, we can customize the URL for the product categories, so we could have: http://www.storename.com/product-category-keywords/ or http://www.storename.com/product-category-keywords.html From an SEO standpoint (or even from a "trying to get links" standpoint), which would be better to have? I feel like having a *.html category page would be easier for link building, but that's just my personal feeling. Side note: our product pages are: http://www.storename.com/product-name.html Thanks in advance

    | fenderseo
    0

  • Hi! In looking at my site's crawl diagnostics, I came across 2 pages that were flagged as duplicate content. I can't quite figure out why. The only difference in the URLs is an uppercase "B" vs. a lowercase "b" following the "~". Here are the URLs:
    lowercase b example: http://www.admissionsquest.com/~boardingSchoolNotes/ShowArticle.cfm/ArticleID/142/ArticleTypeID/12/Topic/What-Makes-a-Progressive-Boarding-School
    uppercase B example: http://www.admissionsquest.com/~BoardingSchoolNotes/ShowArticle.cfm/ArticleID/142/ArticleTypeID/12/Topic/What-Makes-a-Progressive-Boarding-School
    Is that the problem? Any advice is very much appreciated. Thanks! Peter
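
    Yes, the case difference alone makes these two distinct URLs serving the same content. A hedged .htaccess sketch, assuming both variants are served from the same Apache document root, that folds the capitalised path onto the lower-case one:

        # Sketch: 301 the "~BoardingSchoolNotes" form of any URL to the
        # "~boardingSchoolNotes" form so only one version remains crawlable.
        RewriteEngine On
        RewriteRule ^~BoardingSchoolNotes/(.*)$ /~boardingSchoolNotes/$1 [R=301,L]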

    | peterdbaron
    0

  • Ann Smarty mentions in a post (http://www.searchenginejournal.com/200-parameters-in-google-algorithm/15457/) that adding Google Analytics adds SEO value. We use a different analytics tool; do you think it is necessary to add Google Analytics? How important do you think it is?

    | nicole.healthline
    0

  • I was wondering if there is still effectiveness in finding directories via searches like: plastic surgery "submit site". Below are examples of the directories I found that offer paid listings. How do you measure what a good price is versus too expensive? How do you evaluate which directories are worth it? Alexa rank, PR and inbound links? If so, what are the metrics you use? Obviously we are looking purely from a rankings/SEO perspective, because nobody actually uses these directories, right?
    Aviva Directory, PR 5 - the featured page your link would be on is PR 4: http://www.avivadirectory.com/Health/Cosmetic-Surgery/ A permanent link is $149 and an annual link is $49.
    Findelio, PR 4 - however, the page where my link would be is PR 0: http://www.findelio.com/5981/Cosmetic_and_Plastic_Surgery/ One-time fee is $39. So this is much cheaper, but the PR is zero; in this instance do you not buy the submission?
    I do notice some of my competition in some of these directories. Should that be my indicator? I thought that maybe these companies bought into these directories a long time ago and wouldn't still do so today. Are there more effective uses of my $50, $100 or $150, or whatever? Interested to see what people's thoughts are on this type of link building in today's world. Or does that even matter, and will this always be beneficial? Thanks!

    | PEnterprises
    0

  • Let's say I want to branch out from the normal ".com" genre: can you rank a ".ly" domain in Google search results?

    | HCGDiet
    0

  • Some sites (including 1 or two I work with) have a legitimate reason to have duplicate content, such as product descriptions. One way to deal with duplicate content is to add other unique content to the page. It would be helpful to have guidelines regarding what percentage of the content on a page should be unique. For example, if you have a page with 1,000 words of duplicate content, how many words of unique content should you add for the page to be considered OK? I realize that a) Google will never reveal this and b) it probably varies a fair bit based on the particular website. However... Does anyone have any experience in this area? (Example: You added 300 words of unique content to all 250 pages on your site, that each had 100 words of duplicate content before, and that worked to improve your rankings.) Any input would be appreciated! Note: Just to be clear, I am NOT talking about "spinning" duplicate content to make it "unique". I am talking about adding unique content to a page that has legitimate duplicate content.

    | AdamThompson
    0

  • I know this has been discussed, but I was wondering what would be the best approach from an SEO perspective. I quite like the idea of setting up websites on domains without www, but I always worry that domains without www are at a disadvantage because users are used to referring to sites with the www included. Thus one of my fears is that users would link back using the www version, which means that even if you do a 301 redirect, some of the link juice would be lost. I know some famous sites have used this convention, such as http://searchenginewatch.com/, so I think it would be possible, but I'm still concerned that for new sites it would be better to stick to convention. What are your opinions about this?
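
    For reference, the site-wide redirect that keeps the two hostnames from splitting is only a few lines; a minimal .htaccess sketch, assuming Apache, with example.com standing in for the real domain:

        # Sketch: 301 every www request to the bare-domain version of the same URL.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]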

    | SABest
    0

  • I'm looking for a tool that can do the following - organize a keyword universe and its data/metrics:
    - track keyword data over time (search volumes/trends, relative competition metrics, rankings, etc.)
    - sort keywords into buckets/silos/ad groups
    - allow you to assign individual keywords to multiple silos/groups and show the relationships between groups based on keyword relationships
    - incorporate a site map
    - tie keyword targets to static pages, informational content (SEO) and landing pages (PPC)
    - help with KW and/or competitive research (optional)
    - tie into web analytics / marketing on-demand software (optional)
    I know that this is a lot of functionality, but for enterprise search marketing, this could be a game changer for my strategy (if it exists currently) or for the industry (if it doesn't exist). Please share your solution suggestions here...

    | PTC4SEO
    0

  • What would be the best SEO meta description tag for forum topics on a forum-type website? I can think of a few options so far: a snippet of the first post; the title of the topic with templated trailing text; or removing the description tag completely. Your thoughts and suggestions are greatly appreciated.

    | Peter264
    0

  • Here's a technical question. Suppose I have a page relevant to the term "mobile phones". I have a piece of text on that page talking about "mobile phones", and within that text is the term "cell phones". Now, if I link the text "cell phones" to the page it is already placed on (i.e. the parent page), will the page gain more relevancy for the term "cell phones"? Thanks

    | James77
    0

  • I have been using multiple XML sitemaps for products for over 6 months and they are indexing well in GWT. I have been having them manually amended when a product becomes obsolete or we no longer stock it. I now have the option to automate the sitemaps from a SQL feed, but using .asp sitemaps that I would submit the same way in GWT. I'd like your thoughts on the pros and cons of this. The plus for me is real-time updates; the con is that I perceive GWT to prefer XML files. What do you think?

    | robertrRSwalters
    0

  • Not sure if this has been discussed before, but if you make certain searches in Google, i.e. "sports authority", you will find that the site ranked #1 has extra links below it, like a "library" into their site. How do you get this "library" to appear under your listing when your site is #1? Attached is a URL to Flickr with a screenshot of the search: http://www.flickr.com/photos/65949972@N07/6006729328/in/photostream/

    | HCGDiet
    0

  • If my audience searches for both "theatre" and "theater", how can I optimize for both terms while maintaining consistency on my site? Does Google see these as interchangeable at all, or are they treated as completely different words? [city] theater and [city] theatre are our 3rd and 6th best performing keywords, with theater getting roughly twice as many hits.

    | RyanWhitney15
    0

  • I am new to SEO but have been hired to handle the SEO for a martial arts school. They had previously attained top-three rankings primarily by using nofollow on every link on the homepage except footer links, which had keyword anchor text pointing to their second-tier pages. Each of those second-tier pages also had nofollow on everything except a single footer link with a keyword anchor text link going back to the home page. It seemed to work for them. I was going to keep it as is, as well as focus on creating about 10 separate WordPress blogs. They want to give each blog to a student who will post daily and, from each post, link to their site via anchor text. Anything wrong with this? Thanks Wiliam

    | whorneff310
    0

  • Hello, I've been trying for several days to understand how keyword research works for a multi-purpose website. I've read guides, articles, even some chapters from the book "The Art of SEO" by O'Reilly, and still no luck. It seems I can't wrap my head around keyword research. Let's say I have a social gaming community website and I'm trying to rank it first on some low-competition keywords plus some long-tail keywords. The website has functions like leaderboards, profiles, events, competitions, etc., so it's not actually a news-related website, but it will have a blog.
    My website being in the games niche would imply that I should target words that contain the word "games", but this word generates millions of searches globally, so ranking first is nearly impossible if the website is brand new. This made me pursue generic keywords formed with 2-3 words like "fresh games", "new games", "mmorpg games", "fps games", etc., which still generate, let's say, 30,000 searches globally each. Due to the different areas of the website, like latest game events and latest games competitions, I'm confused about whether I should instead pursue site-specific keywords like "latest games events", "fresh games events", "latest games competitions", "upcoming games competitions" - but these too generate 30,000 global searches each. So: should I use generic keywords or keywords that include site features?
    Let's say I decide to pursue generic "games" keywords. Due to the high competition on a keyword, I decide to go a layer deeper, and for the keyword "fresh games" I obtain keywords like "fresh games 2011", "top fresh games 2011", "upcoming fresh games", thus building a list of 30 keywords that contain "fresh games". If I do this for the rest of the keywords - new games, mmorpg games, fps games, etc. - I end up with a list of 10,000 keywords or more, since each keyword generates other keywords. Is this the correct approach? Generating 10,000 keywords sounds like a lot and I'm getting the feeling that it's not how it's supposed to be done - like, where would I insert 10,000 keywords?
    So how do I know which keywords to pick and aim for in order to try to get a no. 1 ranking, and why those? How many keywords should I use, and where should I put them, since it's not a news website and writing a lot of articles isn't an option? Should I focus on 2-word keywords with around 10,000-30,000 searches, or 2-word keywords plus long-tail keywords with less traffic, like 100-5,000? Is there a guide for the Keyword Analysis Tool? If I enter "fresh new games" I get a 39% keyword difficulty - is that hard to rank for? And I don't know what all the colors mean, since some of them have higher numbers than others that are found at the top, and how can I beat a website that has a rank of 10? So hopefully with your help (and by some miracle) I will finally be able to build a keyword list. Thank you!

    | arching
    0

  • I have a situation where a site that was ASP.NET has been replaced with a WordPress site. I've performed an Open Site Explorer analysis and found that most of the old pages, e.g. www.i3bus.com/ProductCategorySummary.aspx?ProductCategoryId=63, are returning an HTTP status of NO DATA and, when followed, end up at the 404 catch-all page. Can I code the standard 301 redirects in the .htaccess file for these ASP URLs? If not, I'm open to suggestions... Thanks Bill
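
    Yes: once those old URLs resolve to the Apache server running WordPress, they can be matched in .htaccess even with their query strings. A hedged sketch using the category ID from the example above; the target path is hypothetical:

        # Sketch: 301 one old ASP.NET category URL to its new WordPress path
        # (hypothetical target path).
        RewriteEngine On
        RewriteCond %{QUERY_STRING} (^|&)ProductCategoryId=63(&|$)
        RewriteRule ^ProductCategorySummary\.aspx$ /product-category-63/? [R=301,L]
        # The trailing "?" drops the old query string from the redirect target.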

    | Marvo
    0

  • Hi, We recently moved our community website (around 50K web pages) to our main domain. It now resides as a sub-domain of our main website. e.g. Before, we had www.mainwebsite.com and www.communitywebsite.com; after, we have www.communitywebsite.mainwebsite.com. This change took place on July 19th. After a week, we saw a 16% drop in organic traffic to mainwebsite.com. Our ranks on most of the head keywords, including brand keywords, have dropped. We had created 301 redirects from pages on www.communitywebsite.com before this change was made. Has anybody seen this kind of impact when domains are merged? Should we expect that within 3-4 weeks Google will be able to re-index and re-rank all the pages? Is there anything else we could do to rectify the situation? Any feedback/suggestions are welcome!

    | Amjath
    0

  • Hello again... last blog question for a while, I promise! 🙂 The annoying folk behind my website say that the only way for my blog to be at http://www.celynnenphography.co.uk/blog would be to frame-forward it, because of how they are hosting and managing it, etc. Is this an acceptable and useful thing regarding SEO? (I want my website to benefit from my blog's content.) Thanks a lot guys! Ioan

    | IoanSaid
    0

  • I have a few URLs... Is there any benefit for me to frame forward these empty domains?

    | IoanSaid
    0

  • Since we weren't the ones who designed the above-mentioned website, there is something we really don't understand. They have replaced text with images using CSS, as in the example below.
    CSS:
    div#logo { background: #fff url(../images/logo.gif) no-repeat 20px 0px; margin-bottom: 30px; }
    div#logo a { height: 148px; text-indent: -1000em; display: block; }
    HTML: (the markup snippet did not survive in this listing)
    My question is: on a scale of 1 to 100, how much does this affect SEO? And what if we keep the H1 tags blank, without putting any text between the tags?

    | Osanda
    0

  • I've never seen a SERP like this; has anyone else? (The attached image is no longer available.)

    | ATShock
    1

  • I have a client's site where both the www and non-www versions are being indexed. The non-www version has roughly 1,000 or so links, while the www version has over twice as many pointing back to the site. In addition, the www version has higher domain authority. Their programmer has suggested that they can't implement permanent 301 redirects across the site, for a few reasons. My question is: what would be the best alternative to keep the non-www version from being indexed yet still pass link juice?

    | VidenMarketing
    0

  • Hi there, We're currently looking into integrating a new internal search function on our site, which will involve housing the search results on a sub-domain. We have no intention of these search result pages becoming landing pages for organic traffic, but would the inclusion of a sub-domain affect the optimization of the main domain? i.e. could it affect our authority? Nige
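
    A hedged sketch of one common precaution, assuming the sub-domain has its own document root on Apache with mod_headers: mark every response from it noindex so the internal search results never compete with, or dilute, the main domain:

        # Sketch: keep internal search results out of the index while still
        # letting crawlers follow the links they contain.
        <IfModule mod_headers.c>
            Header set X-Robots-Tag "noindex, follow"
        </IfModule>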

    | NigelJ
    0

  • This is just a general theoretical discussion to provoke some thought. Suppose I have 2 synonym keywords which mean identical things, e.g. "golf holiday" and "golf break" - you can probably think of better ones, but you get the idea. Consider the following assumptions: Google knows these synonyms have identical meaning; Google wants to provide the searcher with the "best possible result set"; by definition there can only be 1 "best possible result set". If the above is true, then Google should produce identical result sets for either of these terms - so why don't they?

    | James77
    0

  • What is the reason that a 301 is preferred over the rel=canonical tag when it comes to implementing a redirect? PageRank will be lost in both cases, so why prefer one over the other?

    | seoug_2005
    0

  • I have a real estate website that has a city hub page. All the homes for sale within a city are linked to from this hub page. Certain small cities may have one home on the market for a month and then not have any homes on the market for months or years. I call them "Ghost Cities". This problem happens across many cities at any point in time. The resulting city hub pages are left with little to no content. We are throwing around the idea of 301 redirecting these "Ghost City" pages to a page higher up in the hierarchy (Think state or county) until we get new homes for sale in the city. At that point we would remove the 301. Any thoughts on this strategy? Is it bad to turn 301s on and off like that? Thanks!
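
    One alternative to toggling a 301 on and off is a temporary redirect while a city has no listings; a hedged .htaccess sketch (Apache mod_alias, hypothetical paths):

        # Sketch: 302 (temporary) the empty city hub up to its county page, so
        # the city URL can come back cleanly once new listings appear.
        Redirect 302 /homes/ghost-city/ /homes/example-county/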

    | ChrisKolmar
    0

  • I have an interesting problem, SEOmozers, and wanted to see if I could get some good ideas as to what I should do for the greatest benefit. I have an ecommerce website that sells tire sensors. We just converted the old site to a new platform and payment processor, so the site has changed completely from the original, just offering virtually the same products as before. You can find it at www.tire-sensors.com We're ranked #1 for the keyword "tire sensors" in Google. We sell sensors for Ford, Honda, Toyota, etc. - and tire-sensors.com has all of those listed. Before I came along, the company I'm working for also had individual "mini ecommerce" sites created with only 1 brand of sensors and a URL to match that maker. Example: www.fordtiresensors.com is our site, only sells the Ford parts from our main site, and ranks #1 in Google for "ford tire sensors". I don't have analytics on these old sites, but the Google Keyword Tool says "ford tire sensors" gets 880 local searches a month, and other brand-specific tire sensor terms are receiving traffic as well. We have many other sites doing the same thing: www.suzukitiresensors.com (ranked #2 for "suzuki tire sensors") only sells our Suzuki collection from the main site's inventory, etc. We need to get rid of the old sites because we want to shut down the payment gateway and various other things those sites are using, and move to one consolidated system (aka www.tire-sensors.com). Would simply making each maker-specific URL (i.e. fordtiresensors.com) 301 redirect to our main site (www.tire-sensors.com) give us the most benefit in rankings, traffic etc.? Or would that be detrimental to what we're trying to do - capturing the tire sensors market for all car manufacturers? Suggestions? Thanks a lot in advance! Jordan
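
    A hedged sketch of the host-level 301, assuming the old domains stay pointed at an Apache host you control; the /ford/ target path is hypothetical, and ideally each old product URL maps to its closest equivalent on tire-sensors.com rather than everything landing on one page:

        # Sketch: forward everything requested on the old Ford-only domain to
        # the matching section of the consolidated store.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?fordtiresensors\.com$ [NC]
        RewriteRule ^(.*)$ http://www.tire-sensors.com/ford/$1 [R=301,L]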

    | JordanGodbey
    0

  • Since starting our business back in 2006 we've gone through a lot of branding and, as a result, URL and architectural migrations. This has always been driven by usability, brand awareness and technical efficiency reasons, while knowing that there would be SEO hits to take from it... but ultimately hoping to have a much stronger foundation from an SEO perspective in the long run. Having just gone through our most recent (and hopefully final) migration, we are now about 15% down on traffic (although more like 35% - 40% in real terms when seasonality is stripped out). Below is a timeline of our structural history:
    2007 - 2009: We operated as a network of individual websites which started as 1, www.marbellainfo.com, but grew to 40, with the likes of www.thealgarveinfo.com, www.mymallorcainfo.com, www.mytenerifeinfo.com, www.mymaltainfo.com etc.
    2009 - 2010: We decided to consolidate everything onto 1 single domain, using a sub-domain structure. We used the domain www.mydestinationinfo.com and the subdomains http://marbella.mydestinationinfo.com, http://algarve.mydestinationinfo.com etc. All old pages were 301 redirected to like-for-like pages on the new subdomains. We took a 70% drop in traffic and SERPs disappeared for over 6 months. After 9 months we had recovered back to traffic levels and similar rankings to what we had pre-migration. Using this new URL structure, we expanded to 100 destinations and therefore 100 sub-domains.
    2011: In April 2011, having not learnt our lesson from before :(, we underwent another migration. We had secured the domain name www.mydestination.com and had developed a whole new logo and branding. With 100 sub-domains we migrated to the new URL and used a sub-directory structure. So this time what had gone from www.myalgarveinfo.com to http://algarve.mydestinationinfo.com was now www.mydestination.com/algarve. No content or designs were changed, and again we 301 redirected pages to like-for-like pages; we even made efforts to ask those linking to us to update their links to use our new URLs.
    The problem: The situation we find ourselves in now is nowhere near as bad as what happened with our migration in 2009/2010; however, we are still down on traffic and SERPs and it's now been 3 months since the migration. One thing we had identified was that our redirects were going through a chain of redirects, rather than pointing straight to the final URLs (something which has just been rectified). I fear that our constant changing of URLs has meant we have lost out in terms of the passing over of link juice from all the old URLs, and lost trust with Google for changing so much. Throughout this period we have grown the content on our site by 2x - 3x each year and now have around 100,000 quality pages of unique content (which is produced by locals on the ground in each destination). I'm hoping that someone in the SEOmoz community might have some ideas on things we may have slipped up on, or ways in which we can try and recover a little faster and actually get some growth, as opposed to working hard and waiting a while just for another recovery. Thanks Neil

    | Neil-MyDestination
    0

  • Howdy, I work on a fairly large eCommerce site, shop.confetti.co.uk. Our CMS doesn't allow us to have 1 product with multiple colour and size options, so we created individual product pages for each product variation. This of course means that we have duplicate content issues. The layout of the shop works like this: there is a product group page (here is our disposable camera group) and the individual product pages sit below it. We also use a Google Shopping feed. I'm sure we're being penalised, as so many of the products on our site are duplicated. So my question is this: is rel="canonical" the best way to stop being penalised, and how can I implement it? If not, are there any better suggestions? Also, we have targeted some long-tail keywords in some of the product descriptions, so will using rel=canonical affect this or the Google Shopping feed? I'd love to hear experiences from people who have been through similar things and what the outcome was in terms of ranking/ROI. Thanks in advance.

    | Confetti_Wedding
    0

  • Hello, I heard that a 301 redirect can be good for newly registered domain names. Can I buy an old domain name and put a 301 redirect on it to my newly registered niche-market domain name? Should I buy only 1 domain name and 301 redirect it to my newly registered domain, or can I do this with more than 1 old domain?

    | anand2010
    0

  • I discovered a while ago that we have quite a number of links pointing back to one of our customers' websites whose anchor text contains extremely offensive pornographic terms. These links originate from forums that seem to link among themselves and then throw my customer's web address in there at the same time. Any thoughts on this? I'm seriously worried that this may negatively affect the site.

    | GeorgeMaven
    0

  • Should I exclude the tag pages? Or should I go ahead and keep them indexed? Is there a general opinion on this topic?

    | NikkiGaul
    0

  • I am being advised by an SEO that each page of my ecommerce site must have a significant block of unique text "above the fold" to do well in Google post-Panda.  This recommendation is at odds with what my design/usability/conversion people want to see.  The current site design features eye-catching graphics just below the header and goes right into product listings, with SEO text near the bottom of the page. How important is it to have SEO text near the top of a page?

    | mhkatz
    0

  • I have a client whose site isn't necessarily penalized, since they still show for many terms in the SERPs; however, at one point they did an XRumer blast of 13,000 links for two anchor texts they were trying to rank for. They have purchased a new domain and have gone white hat, and they want to 301 some of the old site to the new one purely for the users' sake, so past visitors still find them at the new location. Will creating 301 redirects pass any bad karma from the old domain on to the new one in Google's eyes? Thanks for the help.

    | JoshGill27
    0

  • Hello, I am doing some SEO for a company that has a ranking of 63 in Open Site Explorer and over 600 mostly high-quality linking domains. A problem I can see is that there are no stand-out keywords for their business; all the keywords I have researched have at best 4,000 searches in google.com.au. They get around 50k visits per month from Google, and the 8 highest-performing keywords besides the business name range from 60-140 visits each - all long-tail results: 58,771 total visits via 44,879 keywords! It is a business directory and the site is broken up into 15 categories; each category has few or no backlinks pointing to it. The company has editorial staff and produces high-quality news on their site. I was thinking of article marketing - what sites are good for high-quality submissions? Also, the site has many links from Wikipedia articles but these don't show up in Open Site Explorer; why could this be? Could social bookmarking of their Wikipedia links help? Thanks for any help

    | adamzski
    0

  • Hi everyone, I'm implementing some fairly significant changes on a client's website and wanted to know if it is better to implement all the changes at once or gradually. The changes are:
    1. Amended information architecture
    2. Completely new URLs
    3. New metadata and some new on-page content
    4. Meta robots 'noindex, follow' on approximately 90% of the site
    Can I make all these changes in one go (that would be my preference), or should I implement them gradually? What are the risks? Many thanks James

    | jamesjackson
    1

  • So this website -> http://bailbondsripoffreport.com/ ranks on the first page for the term "bail bonds". It's the spammiest, crappiest piece of junk website ever! lol - how does this site rank so well? It's not even a year old and its link structure is crap. Can I report them and have them removed, lol? Any ideas would be appreciated. Thanks!

    | utahseopros
    0

  • When I type in a competitor name (in this case "buycostumes"), Google shows several related websites in its "Pages Similar to..." section at the bottom of the page. My question: can anyone tell me where the text that Google uses as the link comes from? Our competitors have nice branded links and ours is just a keyword. I can find nothing on-page that Google is using, so it must be coming from someplace off-page, but where?

    | costume
    0
