
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • After signing up to SEOmoz as a pro user and sorting out everything the site crawl flagged up on our website (http://www.whosjack.org), we jumped very slightly in search, only to continue going down again. We are a news-based site, we have no duplicate content, we have good writers and good organic links, etc. I am currently very close to having to call it a day. Can anyone suggest anything at all from looking at the site, or suggest a good SEO firm I could talk to who might be able to work out the issue? I am totally at a loss as to what to do now. Any help or suggestions greatly appreciated.

    | luwhosjack
    0

  • Dear colleagues, I have quite an unusual situation with one of my client's websites, and I could use advice from someone who has experienced the same circumstances: They are currently planning to launch a new site under the same domain (by September), at which point several key current pages will be replaced with new equivalent pages under new URLs. So far it's pretty simple, BUT - due to a merger with another company they will be migrating their entire website to a different domain within a year. My question is - what would be the optimal solution for redirects? We are considering a 301 from the current pages to the new pages under the same domain; then, once the new domain is activated, in addition to defining 301 redirects from the new pages on the current domain to the new domain, we would cancel the original 301s from the old pages to the new pages on the same domain, and instead define new 301s from those old pages directly to the new domain. What do you think? Is there a better solution - like using 302 redirects for the first stage? Has anyone tried such a procedure? Your input will be highly appreciated! Thanks in advance, Omer

    | Usearch
    0
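A sketch of the two-stage redirect plan described above, in Apache .htaccess terms (the paths and the final domain here are placeholders, not the real sites):

```apache
# Stage 1: old page -> new page on the current domain, permanent.
Redirect 301 /old-page /new-page

# Stage 2 (at domain migration): remove the rule above and point BOTH
# the old and the interim URLs straight at the final domain, so no URL
# ever passes through a chain of two redirects.
Redirect 301 /old-page https://new-domain.example/new-page
Redirect 301 /new-page https://new-domain.example/new-page
```

The key point the sketch shows is collapsing each hop into a single 301 once the final destination exists, rather than leaving a redirect chain in place.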

  • Hello, I have a client who has a large ecommerce website. Some category names have been created with commas in them - which means their software has automatically generated URLs containing commas for every page beneath the category in the site hierarchy. E.g. 1: http://shop.deliaonline.com/store/music,-dvd-and-games/dvds-and-blu_rays/ E.g. 2: http://shop.deliaonline.com/store/music,-dvd-and-games/dvds-and-blu_rays/action-and-adventure/ etc... I know that URLs with commas in them look a bit ugly! But is there 'any' SEO reason why URLs with commas are any less effective? Kind Regs, RB

    | RichBestSEO
    0

  • My site has suffered greatly since the recent Google update. I have done everything as suggested. I had all bad links removed over 2 months ago. I have lowered keyword density (not easy, since the keyword is in our company name!). I have rewritten various content and bolstered our existing content. What gives? What can I do? As an example, take the keyword "maysville plumber" - I rank about 40th for this keyword. The first three pages are filled with websites with literally NO content or no added value. Maysville is a town of about 1k residents - there is no competition. Before the update I was #1 for years on this particular keyword. And this is the case with 35 other cities (mostly small cities, but a few larger ones). Please help me understand, or suggest what I can possibly do at this point. We have hundreds of pages, with unique content on each and every page. We have zero duplicate content (I have run tests and crawlers). We have no fishy links. I have not gotten any messages from Google in Webmaster Tools. PLEASE HELP!! I asked a similar question a little while back and fixed everything suggested. My site is www.akinsplumbing.net.

    | chuckakins
    0

  • Hello everyone, I'm testing the pro software and recently I installed an SSL certificate on one of the websites I'm monitoring. I put an .htaccess directive in place to force all traffic to the secure version of the site (https), and I noticed that this raised a warning because my directive forces the traffic with a 302 redirect. These are the lines: RewriteCond %{SERVER_PORT} 80 and RewriteRule ^(.*)$ https://example.com/$1 [R,L]. I understand this is not good, so I figured that since I'm already redirecting all www traffic to non-www, I can force traffic that arrives on www to the secure version like so: RewriteCond %{HTTP_HOST} !^example.com$ and RewriteRule (.*) https://example.com/$1 [R=301,L]. But this is not 100% effective, because if someone visits the site directly on the non-www version that person won't get redirected, and hence won't be forced onto https. So my question is: does anybody know of an alternate way to force traffic to the secure socket using a 301 instead of a 302? Oh boy, just by writing the question I think I may have figured it out; I'll post it anyway because (1) I could be wrong and (2) it could help someone else. It just hit me that the directive forcing www to non-www specifies what type of redirect to do with [R=301,L]. So, to try to answer my own question before even posting it, this could probably do the trick: RewriteCond %{SERVER_PORT} 80 and RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]. I'll be testing it out ASAP, and I'm posting the question anyway in case it doesn't work, in case someone has a good suggestion, or to help someone in the same situation. If this turns out right I will need someone to slap me in the face 😐

    | stevenpicado
    0
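For reference, a common variant of the directive discussed above checks the %{HTTPS} variable rather than the server port, which also behaves well on setups where port 80 isn't the whole story; example.com is a placeholder:

```apache
# Force all plain-HTTP traffic to HTTPS with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

As the poster worked out, it is the [R=301,L] flags (rather than a bare [R], which defaults to 302) that make the redirect permanent.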

  • I have duplicate content and am trying to figure out which URL to remove. What should I take into consideration? Authority? How close to the root the page is? How clear the path is? Would appreciate your help! Thanks!

    | Ocularis
    0

  • Hi, I'm pretty new to SEO and I'm trying to figure out if I'm on the right track. In mid May I rebuilt a website completely from .asp to .php; the original site was completely makeshift and had some really un-pretty code in it, not to mention the design was completely out of date. Since I rebuilt it, the search traffic slowed, then picked up a little, and now it's slowed to almost a stop. I know it hasn't been that long, but am I doing something wrong? I know the thing I'm missing most is link-building, which we are working on now, but is there something else I'm missing that would make the search drop like this this past week? I feel like the site is so much better than before, but I'm not seeing any results and now I'm really second-guessing myself. Any suggestions? Please help! The site: http://www.moondoggieinc.com THANKS! KristyO

    | KristyO
    0

  • Hi, Trying to diagnose the fall of our site. We fell mainly with Panda 3.4 and then a little more with Penguin. We have a main site with 200 pages and an attached blog, e.g. domain.com/blog. The blog was really small, with only 7 posts. One keyword phrase, for example "ace widget software", has ranked #2 and #3 through the entire storm. The page that is ranking is in our main root site (not the blog). We used to rank for 200 phrases; now we only rank for about 10. Over the past week I stumbled on the fact that if I create a new post in my blog, those pages rank in 3 days. Good rankings - #2 on one, and at least first page on the other 5 pages. One page ranked #2 in 17 hours. The test I am conducting: I am now testing to see if maybe there is some coding issue on our site; we do not use a template but a 3-column design built in Dreamweaver using older-style tables etc. 1. Putting a new page on the old design. 2. Taking an existing page and putting it into the new design without side columns. 3. Already tested - adding a new page to the blog (success on this test). It seems that if it were a coding/design issue, the two or three keyword phrases that stayed steady through the storm would have fallen too. Our site: www.TranslationSoftware4u.com Has anyone else been adding new content and seeing it rank really well, but been unable to get the other pages to bounce back up in the rankings? Open to ideas on why this is happening. Thanks in advance! Force7

    | Force7
    0

  • I have a bunch of duplicate page/duplicate title issues because of Joomla's item/category/menu structures. I want to tell search engines not to crawl those directories, and also to de-index anything already in them, in order to solve the duplicate issues. I thought of disallowing them in robots.txt, but then I realized that might not remove the URLs if they've already been indexed. Please help me figure this out.

    | Ocularis
    0
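One note on the robots.txt concern above: a Disallow rule blocks crawling but does not remove URLs that are already indexed. The usual pattern is a robots meta tag on the duplicate pages, which requires those pages to stay crawlable so the tag can be seen. A generic sketch:

```html
<!-- On each duplicate item/category page: let bots follow the links,
     but drop the page itself from the index. -->
<meta name="robots" content="noindex,follow">
```

Once the pages have dropped out of the index, a robots.txt Disallow can then be added to stop the crawl waste.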

  • I found a smart search suggestion plugin that looks amazing and thought I'd see if anyone has experience with it.  www.suggest.io looks like a great usability improvement and even helps to do keyword research with what your visitors are searching on your site. I'm excited about it and feel like it is too good to be true.  Does anyone have experience with this plugin or something similar?

    | TheDude
    0

  • Am I jumping the gun on expecting results? I recently switched our ecommerce website from .asp to .php (the new site went up May 15th), but we did not switch domain names. We seemed to be doing better until a few days ago, when traffic took a steep drop... (not that we were getting that much in the first place). I was wondering if I'm doing something completely wrong, or do I need to wait longer? Are big swings normal when relaunching a website? Am I just being too anxious? The internal pages are getting no juice and I don't know why... I know I need to build on the links to the site, but am I doing something else completely wrong to see the organic search results drop down to almost nothing? I'm really new to SEO and would love if I could get another opinion on http://www.moondoggieinc.com. Thanks! Kristy

    | KristyO
    0

  • We have multilingual websites with some content variations, but 60% of the content on each site remains the same. Is it still advisable to use the rel="alternate" hreflang option on ccTLDs, when ccTLDs are in themselves a strong signal for Google to display the result in the respective countries? 1. example.com 2. example.co.uk 3. example.co.jp 4. example.de

    | CyrilWilson
    0
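For reference, hreflang annotations across the ccTLDs listed above would look roughly like this, placed in the <head> of each equivalent page, with every version listing all the alternates including itself (the URLs and language codes here are illustrative):

```html
<link rel="alternate" hreflang="en"    href="http://example.com/page" />
<link rel="alternate" hreflang="en-gb" href="http://example.co.uk/page" />
<link rel="alternate" hreflang="ja"    href="http://example.co.jp/page" />
<link rel="alternate" hreflang="de"    href="http://example.de/page" />
```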

  • Hi Mozzers, I have a WordPress site with Relevanssi, the search engine plugin (free version). Questions: Should I let Google index my site's SERPs? I am scared the page quality is too thin, and then the Panda bear will get angry. This plugin (or my previous search engine plugin) created many of these "no-results" URIs: /?s=no-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Akids+wall&cat=no-results&pg=6 I have added a robots.txt rule to disallow these pages and did a GWT URL removal request, but links to these pages are still being displayed in Google's SERPs under the "repeat the search with the omitted results included" results. So will this affect me negatively, or are these results harmless? What exactly is an omitted result? As I understand it, it means Google found a link to a page but can't display it because I block Googlebot. Thanks in advance, guys.

    | ClassifiedsKing
    0

  • Should I change my WordPress permalinks to include the keyword? For example, at the minute my URL is http://www.musicliveuk.com/home/wedding-singer. Would it be better as http://www.musicliveuk.com/live-bands/wedding-singer? 'home' is not relevant, so surely 'live-bands' would be better? If I change the URLs, won't I lose 'link juice', as external links will all point to a URL that no longer exists? Or will WordPress automatically redirect the old URL to the new one? Finally, if I should change the URL as described, how do I do it in WordPress? I can only see how to edit the last bit of the URL, not the middle bit.

    | SamCUK
    0

  • I'd love to hear what you guys do.  I think the game has changed now in the era of Penguin.  Gone are the days where you can build a few quick links and suddenly your new site is ranking fairly well. Obviously, the first steps are to get the on page SEO right - titles, good keyword use, etc. Eventually, the goal may be to build some content that will attract natural links.  But, we all know that is going to take time. So, here's the scenario: You've got a new client with a website in a fairly non-competitive niche, say, a piano mover in Seattle.  The site is indexed, with no external backlinks.  You're happy with the current on page SEO.  The site is currently ranking on page 3 for "Seattle Piano Mover".  What would you do next?

    | MarieHaynes
    0

  • Hi, I have various pages set up such as http://www.musicliveuk.com/home/wedding-singer. Is there any benefit in SEO terms to changing it to http://www.musicliveuk.com/live-music/wedding-singer? I would presume that a keyword would be better for SEO than 'home', which is irrelevant? Also, if I were to change it, would all the external links I have on other sites pointing to http://www.musicliveuk.com/home/wedding-singer be lost, as the URL no longer exists? I suppose I could set up a manual redirect from http://www.musicliveuk.com/home/wedding-singer to http://www.musicliveuk.com/live-music/wedding-singer - or would WordPress automatically redirect from the old to the new? I understand that some 'link juice' is lost along the way when redirecting, so is including the keyword in the URL of enough benefit to warrant losing some link juice? Finally, if I do change the URL to include the keyword, how do I do it in WordPress? I can only see how to change the page title using the 'edit' button when editing a page.

    | SamCUK
    1
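On the redirect question above: stock WordPress does not reliably 301 a changed parent slug on its own, so an explicit rule is the safe route. A minimal .htaccess sketch using the URLs from the question:

```apache
# Permanent redirect from the old permalink to the keyword-rich one,
# so existing external links keep passing (most of) their value.
Redirect 301 /home/wedding-singer http://www.musicliveuk.com/live-music/wedding-singer
```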

  • We have the following scenario: Our main website - www.esedirect.co.uk - gets around 1,800 visits a day, with around half of those from organic search. It's been around since 2004. Our original website - www.ese.co.uk - gets around 30 visits a day and really is nothing more than a doorway page with links to the above site and a couple of other sites that belong to the same company. This is an old domain that's had content since 1997 and has good domain authority with some good links. We are considering doing a 301 redirect from www.ese.co.uk to www.esedirect.co.uk to redirect the link juice. I'd welcome opinions on any possible negative effects this could have and how beneficial doing this would be. Thanks, Lee

    | ese
    0

  • Hi, Just a general discussion really, what sort of thing have you been up to regarding social media and it helping SEO. One thing that does interest me is generating tweets/likes and also using Google plus profile to help SEO.

    | activitysuper
    0

  • Posting new PDF.  Is there anything beyond maintaining the PDF metadata, filename and URL I need to do to keep its current top ten search ranking on Google for a specific generic noun which it currently ranks for? Many thanks ahead of time for your help.

    | nabarro
    0

  • Hey everyone, We are designing a new website with over 50 pages, and I have a question regarding the layout. Should I target my long-tail keywords via blog pages? It will be easier to manage, list, and link out to similar articles related to my long-tail keywords using a WordPress blog. For this example, let's suppose the website is www.orange.com and we sell 'Oranges'. Am I going about this in the right way? Main Section: Main Section 1: Home Page - Keyword Targeted - Orange Main Section 2: Important Conversion Page - 'Buy oranges' Long Tail Keyword (LTK) 1: www.orange.com/blog/LTK1 Subsections (SS): www.orange.com/blog/LTK1/SS1 www.orange.com/blog/LTK1/SS1a www.orange.com/blog/LTK1/SS1b Long Tail Keyword (LTK) 2: www.orange.com/blog/LTK2 Long Tail Keyword (LTK) 3: www.orange.com/blog/LTK3 Subsections (SS): www.orange.com/blog/LTK1/SS3 www.orange.com/blog/LTK1/SS3a www.orange.com/blog/LTK1/SS3b All these long-tail pages and the subsections under them are built specifically for hosting content that targets these specific long-tail keywords. Most of my traffic will initially come via the subsection pages, and it is important for me to rank well for these terms initially. E.g. if someone searches for the keyword 'SS3b' on Google, my corresponding page www.orange.com/blog/LTK1/SS3b should rank well on the results page. For ranking purposes - will using this blog/category structure hurt or benefit me? Should I instead build static pages? Also, we are targeting more than 50 long-tail keywords - and building quality content for each of these keywords - and I assume we will be doing this continuously. So which is more beneficial in the long term? Do you have any suggestions on whether I am going about this the right way? Apologies for using these random terms - oranges, LTK, SS etc. - in this example; however, I hope the question is clear. Looking forward to some interesting answers on this! Please feel free to share your thoughts. Thank you!
Natasha

    | Natashadogres
    0

  • Has anyone noticed a significant drop in indexed pages within the Google Webmaster Tools sitemap area? We went from 1,300 to 83 from Friday, June 23 to today, June 25, 2012, and no errors or warnings are showing. Please let me know if anyone else is experiencing this, and any suggestions for fixing it.

    | datadirect
    0

  • It appears that the free online tools limit the number of URLs they'll include.  What tools have you had success with?

    | NaHoku
    1

  • I'm using the SEO on-page tool from this site, but I get no help on how to fix the problems. I'm looking for an all-in-one tool that tells me everything about my site (description, keyword density, what to improve, what to add, what to remove, etc.). If someone would be kind enough to review it for me, that would be great!

    | 678648631264
    0

  • If my keyword is dog training and I wanted to make 5 posts on a blog, do I target all the posts with the keyword of dog training or what?

    | 678648631264
    0

  • Hi there, Quick overview of a client who came to us after having the vast majority of their link value slashed by Penguin. Client only has a limited 'recovery budget' Client had been outsourcing SEO to a foreign company The website was 'keyword stuffed' when we arrived Links were of poor quality, the company clearly majoring on quantity rather than quality. Client was ranking #4 or  #3 for a keyword which was bringing in sales. Post penguin, dropped to page 5 and then out of the top 100 for that keyword, losing 70% of sales. The client, under our supervision has Rewritten spammy content so they work for human beings (so it now reads well) Gone through the website and is removing old/duplicate and low-quality content De-emphasised other pages for the target keyword so that only one page majors on it. After doing the above has submitted a reconsideration request (about 2 weeks ago, so I know there's time). We are focussing on ensuring her content is written well and on building decent links to the site (i.e. to put some good-uns where the bad ones were). We're into month 2 of the 'clean-up exercise' and the site is still only ranking #90 for the keyword. Given the client's budgetary limitation, could it be more beneficial to consider a new brand identity and domain name to start afresh (without a 301 redirect) or should we just continue along the track we are doing with this client? Thanks!

    | Nobody1560986989723
    0

  • Hello, all SEOers. Today I would like to get some ideas about handling multiple domains. I have a client who bought numerous domains to prevent abuse of their brand name, and at the same time for future use. This client bought more than 100 domains. Some domains are paused, some parked, some live, and some redirected to other sites. I don't worry too much about the parked and paused domains. However, what I am worried about is that around 40 different domains are now redirected to a single domain, and meta refresh was used for the redirections. As far as I know, this can raise a red flag for Google. I asked the client to clean up unnecessary domains, yet they want to keep them all. So now I have to figure out how to handle all the domains which redirect to a single domain. So far, I have come up with the following ideas: 1. Build a gateway page which lists my client's sites, and redirect all domains to the gateway page. 2. Implement a robots.txt file on all the different domains. 3. Delete the redirects and leave them as parked domains. Could anyone share other ideas for handling the current situation? Please people, share your ideas with me.

    | Artience
    0
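On the meta-refresh concern above: if the client insists on keeping the spare domains pointed at the main site, a server-side 301 per domain is the conventional replacement. A sketch for one Apache virtual host (the domain names are placeholders):

```apache
<VirtualHost *:80>
    ServerName spare-domain.example
    # 301 every path on this spare domain to the main site,
    # replacing the meta-refresh redirect.
    Redirect 301 / http://main-site.example/
</VirtualHost>
```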

  • Hi All, Quick question: are we correct in thinking that for any given URL it's not possible to do a 301 redirect AND a canonical tag? Thanks, Sarah

    | SarahCollins
    0
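One way to see why the two don't combine: a 301 is answered at the HTTP layer before any HTML is served, so a canonical tag on the redirecting URL can never be seen. An illustration with placeholder paths:

```apache
# If this rule fires, the response is a 301 status plus a Location
# header - no HTML body is returned, so a <link rel="canonical">
# in /old-page's markup would never reach a crawler.
Redirect 301 /old-page /new-page
```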

  • Hello, I recently changed our website's URL structure, removing the .html at the end. I had about 55 301s set up from the old URLs to the new. Within a day all the new URLs were listed in Google, but the old .html ones still have not been removed a week later. Is there something I am missing, or will it just take time for them to get de-indexed? Also, so far the Page Authority hasn't transferred from the old pages to the new - is this typical? Thanks!

    | SeanConroy
    0
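For comparison with the 55 individual redirects mentioned above, this migration is often handled with a single pattern rule (a sketch, assuming Apache and that the extensionless URLs resolve):

```apache
RewriteEngine On
# 301 any request ending in .html to the same path without it.
RewriteRule ^(.*)\.html$ /$1 [R=301,L]
```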

  • Hi guys, This might be more of a Joomla thing than an SEO thing, but it is related, as I need to SEO this page and I can't find it. Please help if you can while my developer is on holiday - this is driving me nuts!! I can find the article sections in Joomla 2.5 to edit all the text on my other pages, but for some reason I cannot find the text for the home page!!?? Any ideas? Please...?? He set a lot of it up using CSS and jQuery / PHP etc., so I'm a little confused as to why I can't find the HTML to edit... aaahhhhhhhh. Thanks guys, I'm sure it's quite easy!! Thanks in advance. Craig

    | craigyboy
    0

  • Hello. I have a burning question which I have been trying to answer for a while. I keep getting conflicting answers and I could really do with your help. I currently run an animal fancy dress (onesie) company in the UK called Kigu, through the domain www.kigu.co.uk. We're the exclusive distributor for a supplier of Japanese animal costumes and we've been selling directly through this domain for about 3 years. We rank well across most of our keywords and get about 2,000 hits each day. We're about to start selling a kids' range - miniature versions of the same costumes. We're planning on doing this through a different domain, which is currently live - www.kigu-kids.co.uk. It's been live for about 3-4 weeks. The idea behind keeping them on separate domains is that it is a different target market, and we could promote the kids' site separately without having to bring people through the adult site. We want to keep the adult site (or at least the homepage) relatively free from anything kiddy, as we promote fancy dress events in nightclubs and at festivals for over-18s (don't worry, nothing kinky) and we wouldn't want to confuse that message. I've since been advised by an expert in the field that we should set up a redirect from www.kigu-kids.co.uk and house the kids' website under www.kigu.co.uk/kids, as this will be better from an SEO perspective, and that if we don't, we'll only be competing with ourselves. Are we making a big mistake by not using the same root domain for both, and thus not getting the most of the link juice for the kids' site? And if we do decide to switch to www.kigu.co.uk/kids, is it a mistake to still promote www.kigu-kids.co.uk (redirecting) as our domain online? Would these be wasted links, or would we still see the benefit? Is it better to combine, or are two websites better than one? Any help and advice would be much appreciated. Tom.

    | KIGUCREW
    0

  • SEOmoz recommends a bunch of directories and some cost money. How much influence do these directories have? Is it worth investing in some where the category makes sense or all where the category makes sense?

    | SEODinosaur
    0

  • On this site http://www.austintenantadvisors.com/ I have my main landing pages listed in the navigation under "Types". The reason why I did this is because I am not sure where to insert those on the home page where it does not look spammy to Google and looks natural for users. Obviously they need to appear somewhere on the home page for Google to be able to continue crawling and indexing them. Any thoughts or suggestions would  be appreciated.

    | webestate
    0

  • Hi, Any assistance would be greatly appreciated. Basically, our rankings and traffic etc. have been dropping massively recently, and Google sent us a message stating "Googlebot found an extremely high number of URLs on your site". This first highlighted the problem to us: for some reason our eCommerce site has recently generated loads (potentially thousands) of rubbish URLs, giving us duplication everywhere, which Google is obviously penalizing us for in terms of dropping rankings etc. Our developer is trying to find the root cause of this, but my concern is: how do we get rid of all these bogus URLs? If we use GWT to remove URLs it's going to take years. We have just amended our robots.txt file to exclude them going forward, but they have already been indexed, so I need to know: do we put a 301 redirect on them, or return an HTTP 404 code to tell Google they don't exist? Do we also put a noindex on the pages? What is the best solution? A couple of examples of our problems are here: In Google, type site:bestathire.co.uk inurl:"br" - you will see 107 results. This is one of many lots we need to get rid of. Also, site:bestathire.co.uk intitle:"All items from this hire company" shows 25,300 indexed pages we need to get rid of. Another thing to help tidy this mess up going forward is to improve our pagination work. Our site uses rel="next" and rel="prev" but no canonical. As a belt-and-braces approach, should we also put canonical tags on our category pages where there is more than one page? I was thinking of doing it on page 1 of our most important pages, or on the view-all page, or both. What's the general consensus? Any advice on both points greatly appreciated. Thanks, Sarah.

    | SarahCollins
    0
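On the question above of how to make the bogus URLs go away: one common approach is to answer them with 410 Gone rather than a 301 or 404, since 410 is an explicit signal that the URLs were removed deliberately. A sketch - the /br/ pattern here is a placeholder for whatever the rubbish URLs actually share:

```apache
RewriteEngine On
# Answer the junk URL pattern with 410 Gone ([G] flag) so crawlers
# drop these URLs from the index; no redirect target is needed.
RewriteRule ^br/ - [G,L]
```

Note that robots.txt must not block these URLs while this is in place, or Googlebot will never see the 410.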

  • Hi! I'm kinda new to the SEO game and struggling with this site I'm working on: http://www.moondoggieinc.com I set the preferred domain to www in GWT, but I'm not seeing it reroute to that. I can't seem to get any of my internal pages to rank, and I was thinking it's possibly because of a duplicate content issue caused by this problem. Any help or guidance on the right way to set the preferred domain for this site, and on why I can't get my internal pages to rank? THANKS! KristyO

    | KristyO
    0

  • On my recent crawl, there were a great many duplicate content penalties. The site is http://dailyfantasybaseball.org. The issue is: there's only one post per page. Therefore, because of WordPress's (or Genesis's) pagination, a page gets created for every post, thereby leaving basically every piece of content I write as a duplicate. I feel like the engines should be smart enough to figure out what's going on, but if not, I will get hammered. What should I do moving forward? Thanks!

    | Byron_W
    0

  • Hi everybody, after some discussions I decided to keep my page on the old domain for better SEO rankings; however, the new third-level domain sounds better: poltronafraubrescia.zenucchi.it. The question is: I'm going to receive a high-value link and I don't know if I should link directly to the old address (www.zenucchi.it/ITA/poltrona-frau-brescia.it), where the page is located, or to the new one with a 301 redirect to the previous. What's best? And second question: what's the way to keep the page at this address (www.zenucchi.it/ITA/poltrona-frau-brescia.it) but show poltronafraubrescia.zenucchi.it as the URL? Thank you, Guido

    | guidoboem
    0

  • Hey guys, I tried to search across the Q&A for an answer but came up with nothing on this. I'm competing well for our keyword rankings with 20 keywords, 13 of which are on the first page and 9 in the top 3. Which is great! But what we are interested in doing is taking over the rankings. Currently, competitors reside around us in the SERPs, taking up the remainder of the traffic. We are considering creating new websites and competing on the same keywords, to rank and eventually take over the Google rankings for our products - so that if you were to look to buy 'purple buttons', the top 5 websites selling them would all be owned by ourselves. The question is: has anyone else done this? What are Google's views? Are there any traps we could run into? As far as I can see, the only real issues we could run into with Google are on a moral basis. Thoughts?

    | AdenBrands
    0

  • I had a communication problem with my writer, and she used original unspun content and posted it to Unique Article Wizard. All UAW does is take each paragraph and mix them up. So I searched for a sentence on my site where the content came from and got back a bunch of returns for that sentence - and my site wasn't the first result returned. I'm wondering how bad that is going to be for me. The links from UAW go back to an anchor layer that then links back to this site. Can anyone tell me if I need to rewrite the content on the original site? That is the only way I can think of to make this a non-issue. Thanks

    | mtking.us_gmail.com
    0

  • I have read many different opinions on what links to make do follow on a wordpress website versus which ones to leave as no follow. (internal and external) There does not seem to be any consensus among the inputs to date. Any perspectives on this would be appreciated. thanks

    | casper434
    0

  • We have a client with a very large site that would like to put a login on each page; however, that would require the entire site be put behind a secure connection (changing http:// to https:// on every page).  They rank for a ton of keywords and rank well. Would the change impact their rankings at all?  Could it possibly help them?

    | dknewmedia
    0

  • Hi there, This seems a bit of a strange one. I have a particular keyword which I am trying to rank for. All internal links with the appropriate anchor text point to the page I want to rank for this keyword, and all external links point to that page as well; however, Google is ranking another page on my website for this keyword, and the bizarre thing is that the page being ranked is a .PDF. I am really not sure what else to do to give Google the hint that they are ranking the wrong page. Any ideas? Kind regards

    | Paul78
    0

  • I really am lost as to what to do these days.. The problem with my industry is the whole idea of link bait isn't very lucrative. There are no bloggers either, so guest blogging also isn't a very good option. Seems to me like the best thing I can do is just publish content! So, publish a lot of quality content? LOL, sounds like that's right up Google's alley. Where do you publish your content, and what would you say has shown the best results for you personally? We called an SEO company, Arteworks, a few days ago (Friday), and they really didn't go into any details about how they build links. We called them because I saw a post that you commented on, here, and it recommended a few companies at the bottom of the post. (Arteworks being one of them) Really, this is where I get so dang confused... The goal is to build links like the old days, except only use unique content, diversify your pages, and anchor text? Sound about right? Or, should I only create content on my site? Thanks in advance for your time and advice!! Sincerely, Tyler Abernethy

    | TylerAbernethy
    0

  • Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules: basically, only highly ranked websites are allowed to be indexed - all other URLs are tagged noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure this is a good strategy, and so we are asking: what should a website with a lot of similar content do? 1. Don't noindex anything - let Google decide what's worth indexing and what's not. 2. Noindex most content, but allow some popular pages to be indexed. This is our current approach; if you recommend this one, we would like to know what we can do to improve it. 3. Noindex all the similar content - in our case, only let overview pages, blog posts etc. with unique content be indexed. Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy? References: https://zh.greatfire.org https://en.greatfire.org https://www.google.com/search?q=site%3Agreatfire.org

    | GreatFire.org
    0
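The rule-based approach described above can be sketched as follows. The `POPULAR_SITES` data and the `robots_meta` helper are hypothetical illustrations (not GreatFire's actual implementation); the idea is to gate indexability on a popularity-rank lookup. Note the sketch uses `noindex,follow` rather than `noindex,nofollow`, so crawlers can still follow internal links from excluded pages.

```python
# Sketch of a rule-based noindex decision. POPULAR_SITES stands in for a
# real popularity list (e.g. a top-sites ranking); the rank data here is
# purely illustrative.

POPULAR_SITES = {"facebook.com": 2, "imdb.com": 48}  # hypothetical ranks

def robots_meta(domain: str, rank_cutoff: int = 1000) -> str:
    """Return the robots meta tag for a per-URL test page."""
    rank = POPULAR_SITES.get(domain)
    if rank is not None and rank <= rank_cutoff:
        return '<meta name="robots" content="index,follow">'
    return '<meta name="robots" content="noindex,follow">'

print(robots_meta("facebook.com"))      # popular site -> indexable
print(robots_meta("example-blog.net"))  # long tail -> noindex
```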

  • When I am signed in with Google and searching sites, the snippets do not display the "cached" link. Not good, since I am trying to see when a particular page was crawled. If I log in to another server that I never use to browse and search from there, the "cached" link does show up. Assumption: Google knows who I am on my machine and is "helping" me... but is there an easy way to turn this help off?

    | Eyauuk
    0

  • My website was hit by the Penguin update in April. I checked my anchor text and found that I had 1,790 anchor links for "air conditioning nyc". I erased them almost a month ago, but they still show up in the report. How long does it take for SEOmoz to see the changes, and what about Google? If anyone has any idea how I can bring my site back to the top pages, thank you.

    | eoberlender
    0

  • We just submitted a new sitemap to Google for our new Rails app (http://www.thesquarefoot.com/sitemap.xml), which has over 1,400 pages; however, Google is only seeing 114. About 1,200 are in the listings folder, 250 are blog posts, and 15 are landing pages. Any help would be appreciated! Aron

    | TheSquareFoot
    0
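Before concluding that Google is dropping pages, it's worth verifying that the sitemap file itself actually contains the expected number of `<loc>` entries; if the count there is already low, the problem is in sitemap generation rather than indexing. A minimal standard-library check (the sample sitemap below is illustrative):

```python
# Count the <loc> entries in a sitemap, using only the standard library.
# Comparing this count against Search Console's figure shows whether the
# gap is in the sitemap itself or in Google's indexing decisions.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str) -> int:
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{NS}url/{NS}loc"))

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.thesquarefoot.com/</loc></url>
  <url><loc>http://www.thesquarefoot.com/search</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # -> 2
```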

  • At the beginning of our life-cycle, we were just a WordPress blog. However, we just launched a product created in Ruby. Because we did not have time to put together an open source Ruby CMS platform, we left the blog in WordPress and the app in Rails. Thus our web app is at http://www.thesquarefoot.com and our blog is at http://blog.thesquarefoot.com. We set up redirects such that if a URL does not exist at www.thesquarefoot.com, it automatically forwards to blog.thesquarefoot.com. What is the best way to handle sitemaps? Create one for blog.thesquarefoot.com and one for http://www.thesquarefoot.com and submit them separately? We had landing pages like http://www.thesquarefoot.com/houston in WordPress that ranked well for "find Houston commercial real estate"; they have been replaced with landing pages in Ruby, so that URL works well. The URL that was ranking well for this term is now at blog.thesquarefoot.com/houston/. Should I delete this page? I am worried that if I do, we will lose ranking, since that was the actual page ranking, not the new one. Until we are able to create an open source Ruby CMS and move everything over to a sub-directory so everything lives in one place, I would love any advice on how to mitigate damage and not confuse Google. Thanks

    | TheSquareFoot
    0
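One detail worth keeping in mind for the two-sitemap question: a sitemap should only list URLs on its own host, so with the blog on a subdomain the usual pattern is a separate sitemap per host, each submitted separately and each referenced from that host's own robots.txt. A sketch (paths are illustrative):

```
# robots.txt served at http://www.thesquarefoot.com/robots.txt
Sitemap: http://www.thesquarefoot.com/sitemap.xml

# robots.txt served at http://blog.thesquarefoot.com/robots.txt
Sitemap: http://blog.thesquarefoot.com/sitemap.xml
```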

  • We just launched a new Ruby app (it used to be a WordPress blog) at http://www.thesquarefoot.com. We have not had time to create an auto-generated sitemap, so I went to a few different websites with free sitemap generation tools. Most of them index up to 100 or 500 URLs. Our site has over 1,000 individual listings and 3 landing pages, so when I put our URL into a sitemap creator, it should find all of these pages. However, that is not happening; only 4 pages seem to be seen by the crawlers: http://www.thesquarefoot.com/, http://www.thesquarefoot.com/users/sign_in, http://www.thesquarefoot.com/search, and http://www.thesquarefoot.com/renters/sign_up. This worries me that when Google comes to crawl our site, these are the only pages it will see as well. Our robots.txt is blank, so there should be nothing stopping the crawlers from going through the entire site. Here is an example of one of the 1,000s of pages not being crawled: http://www.thesquarefoot.com/listings/Houston/TX/77098/Central_Houston/3910_Kirby_Dr/Suite_204. Any help would be much appreciated!

    | TheSquareFoot
    0
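A common cause of this symptom: if the listing pages are only reachable through JavaScript-driven search rather than plain `<a href>` links, a crawler parsing the raw HTML will never discover them. A minimal standard-library sketch (the HTML below is illustrative) of what a link-following crawler actually sees on a page:

```python
# Extract the <a href> links a crawler can actually see in raw HTML.
# Links injected later by JavaScript are invisible to this kind of pass,
# which is what most sitemap generators (and crawlers) do.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = """<html><body>
<a href="/users/sign_in">Sign in</a>
<a href="/search">Search</a>
<div id="listings"><!-- listings injected by JavaScript --></div>
</body></html>"""

collector = LinkCollector()
collector.feed(html)
print(collector.links)  # only the hard-coded links are discoverable
```

If individual listings only appear after a client-side search, adding a plain crawlable index (e.g. browse-by-city pages linking to each listing) makes them reachable.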

  • WARNING: The following question is about an adult website. If you are at work, near children, or are offended by such material, DO NOT CLICK. Hey guys, this one has had me stumped for a while. I operate www.deviantclip.com. It's a very old domain, trusted by Google, with loads of history. However, in the past year, Google has been giving me the cold shoulder. One major problem I've noticed is that I've lost all long-tail traffic. It's even gotten to the point where aggregators are outranking me in Google for my own custom titles and content. Example A: Google Link 1. This search has my own site name in the title, and my site ranks somewhere on page 2 or further. Example B: Google Link 2. This content originated from our site and has a unique title, yet we're dead last in the SERPs. I submitted my site for reconsideration a few times, and the outcome every time is that Google tells me they have not applied any manual penalty. There are a TON of issues to address with this site, but obviously, getting my own content to rank first is the primary problem I would like to fix. Your time and advice are greatly appreciated. If you need further info, don't be afraid to ask.

    | CrakJason
    0
