Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello. As I am an SEO beginner, I wonder whether the order of the elements on the page will make ranking better or worse. I ask this because my boss asked me to change the order of the elements: he wants to put the Supplies and Accessories section first, and then the additional information. http://www.theprinterdepo.com/hp-9500mfp-refurbished-printer Does it affect my SEO for worse or for better?

    | levalencia1
    0

  • Hey all, Rarely, but sometimes, we need to take down our site for server maintenance, upgrades or various other system/network reasons. More often than not these downtimes are unavoidable, but we can redirect or eliminate the client-side downtime. We have a 'down for maintenance - be back soon' page that is client facing, and outages are often no more than an hour tops. My question is: if the site is crawled by Bing/Google at the time of the site being down, what is the best way of ensuring the indexed links are not refreshed with this maintenance content? (i.e. this is what the pages look like now, so this is what the SE will index). I was thinking of adding a no-crawl rule to the robots.txt for the period of downtime and removing it once back up, but will this potentially affect results as well?

    | Daylan
    1
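A widely recommended alternative to the robots.txt idea (not stated in the thread) is to serve the maintenance page with an HTTP 503 status plus a Retry-After header, so crawlers treat the downtime as temporary rather than re-indexing the maintenance content; a robots.txt change risks being cached past the outage window. A hypothetical Apache .htaccess sketch (paths and the flag-file convention are illustrative, requires mod_rewrite and mod_headers):

```apache
# While the flag file exists, answer every request with a 503 that
# renders the maintenance page, instead of a 200 with maintenance text.
ErrorDocument 503 /maintenance.html
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/maintenance.flag -f
RewriteCond %{REQUEST_URI} !=/maintenance.html
RewriteRule ^ - [R=503,L]

# Hint crawlers when to retry (seconds). In this simplified sketch the
# header is emitted on every response; scope it to the 503 in production.
Header always set Retry-After "3600"
```

Because the status is 503 rather than 200, search engines keep the previously indexed copy and simply come back later.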

  • Hi! We are desperately needing to overhaul our site navigation setup, and we have so many categories that we think our site could really benefit from a drop-down navigation similar to what these sites have: http://www.paychex.com/ http://www.bmc.com/ We've held off doing this type of navigation in the past because we were only seeing people use Flash to create it and we knew that it wouldn't be good for link juice. But these two sites are using HTML and CSS - which seems like a much better style and good for SEO. Do you agree? We want to make the switch but are worried about losing linking power by nesting our navigation in <ul>'s and CSS styling.

    | sciway
    0

  • Hi Everyone, When you go into Google Maps and look at the satellite view, you'll notice business names on some, but not all, businesses--like on the roof of the building. Where do I need to go to get this to work for my business? My Google Places pages are set up properly, but as far as I can tell there is no setting for this. Is it the Service Area setting? Currently I have it set to "Yes, this business serves customers at their locations." Perhaps I need to set this to "No." Many thanks, Robert

    | AC_Pro
    1

  • My freelance designer made some changes to my website and all of a sudden my homepage was showing the title I have in DMOZ. We thought maybe the NOODP tag was not correct, so we edited that a little and now the site is showing as "Untitled". The website is http://www.chemistrystore.com/. Of course he didn't save an old copy that we can revert to. That is a practice that will end. I have no idea why the title and description that we have set for the homepage are not showing in Google when they previously were. Another weird thing that I noticed is that when I do ( site:chemistrystore.com ) in Google I get the https version of the site showing with the correct title and description. When I do ( site:www.chemistrystore.com ) in Google I don't have the homepage showing up from what I can tell, but there are 4,000+ pages on the site. My guess is that if it is showing up, it is showing up as "Untitled". My question is.... How can we get Google to start displaying the proper title and description again?

    | slangdon
    0

  • As far as I know, to rank well the H1 tag should be present on all pages, it should be one of the first things in the page, and it should include the keywords. I was checking my site and Magento generates the H1 with an image, www.theprinterdepo.com. I don't know if this is wise? The markup comes out roughly as: <h1 class="logo"><a href="http://www.theprinterdepo.com/" title="The Printer Depo" class="logo"><img width="377px" src="https://www.theprinterdepo.com/skin/frontend/default/MAG060062/images/logo.gif" alt="The Printer Depo" /></a></h1>

    | levalencia1
    0

  • I would like to track outbound links at http://bit.ly/yYHmbf 1. Shall I add the following code before `</head>` at the above page? What does 100 mean in the above code? 2. Then use this for each outgoing link: ``` <a href="http://www.example.com" onClick="recordOutboundLink(this, 'Outbound Links', 'example.com');return false;"> ``` http://www.example.com is the outbound link. Am I right on both counts? Where should I look for the report in GA?

    | seoug_2005
    0
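The snippet the poster refers to is Google's classic ga.js outbound-link pattern. A self-contained sketch follows; the `document` object and the `_gaq` command queue are stubbed here so the flow is visible outside a browser (on a real page the GA snippet creates `_gaq`), and the `100` is the millisecond delay that gives the tracking request time to fire before the page unloads:

```javascript
// Minimal stand-ins for the browser globals so this sketch runs anywhere.
var document = { location: { href: '' } };
var _gaq = []; // ga.js command queue; on a real page the GA snippet creates it

// Classic outbound-link helper: record an event, then navigate after
// ~100 ms (the "100" the poster asks about) so the tracking request
// has a chance to reach Google before the page unloads.
function recordOutboundLink(link, category, action) {
  _gaq.push(['_trackEvent', category, action]);
  setTimeout(function () {
    document.location.href = link.href;
  }, 100);
}

// Usage mirrors the post's anchor markup:
// <a href="http://www.example.com"
//    onClick="recordOutboundLink(this, 'Outbound Links', 'example.com'); return false;">
recordOutboundLink({ href: 'http://www.example.com' }, 'Outbound Links', 'example.com');
```

The recorded clicks then show up under GA's Events reports, filtered by the category name you passed ('Outbound Links' here).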

  • Good morning from 2 degrees C, mostly cloudy Wetherby UK. Imagine this situation... I've got two versions of a website, a desktop version & a mobile version; here are their respective URLs: Desktop: http://www.innoviafilms.com Mobile: http://www.innoviafilms.com/m/home.aspx With Blackberry Bold in hand I type "Innovia films" into Google search, I click on the search snippet and it renders the desktop version. But when I type the URL directly into my Blackberry Bold 9700 it returns the mobile version 😞 This confuses me. Having read this: http://googlewebmastercentral.blogspot.com/2011/02/making-websites-mobile-friendly.html I thought the desktop version would render on smartphones, which the Blackberry Bold is. So my question is... "Why does the mobile version only appear when you type in www.innoviafilms.com on a Blackberry (and iPhone 4), and the desktop version appears only when you enter "Innovia films" via Google search?" Any insights welcome 🙂

    | Nightwing
    0

  • Good Morning from dull & overcast 2 degrees C Wetherby UK 😞 Whilst I've changed markup for SEO purposes on desktop versions, I would like to know if the principles of optimising on-page content, i.e. modifying <title> and <h1>, are exactly the same for http://www.innoviafilms.com/m/Home.aspx. Whilst the desktop version of Innovia Films ranks well for the terms the client requested some time back, their attention is now focusing on the mobile site, but I feel a bit confused and I'll try my best to explain... Is it not totally redundant to "optimise" mobile site content, as when I search via Google on a smartphone I'm seeing the SERPs from the desktop version, and when I click on a snippet the mobile site just piggybacks on the back of the listing anyway? Put another way, is it not a royal waste of time tinkering with mobile site on-page content for as long as Google's SERPs on a smartphone are exactly the same as on a desktop, i.e. they are not two separate entities? Or am I totally wrong, and could you optimise a mobile site for a completely different term to its parent desktop version? Tried to explain this the best I can, my head hurts... 😞 Any insights welcome 🙂

    | Nightwing
    0

  • Hi all, Any help with the following. We built a new site for a customer in June of last year. We then cracked on with the on-page and off-page SEO. All white hat... good quality. 3 months in, the site was still not ranking with Google and indeed had been sandboxed. All working fine with Bing and Yahoo. We followed all the steps to get recognised by Google but to no avail. In December we took the drastic step of providing the customer with a completely new site... new content, design, structure etc. In Jan we went back and fixed all the external linking sources to link to the new pages on the new site. Now 7 months in... the site is STILL sandboxed. All still fine with Bing and Yahoo. Thoughts, anyone?

    | SEOwins
    0


  • It looks like although eHow was hit by Panda 2.0, its traffic has increased back to previous levels. Does anyone know of an article/study that goes over what eHow did?

    | nicole.healthline
    0

  • For a website having more than 10,000 pages: as per the Google guidelines, if I restrict the page links to 100 for sitemap.asp, then I have to generate 100 pages. Any idea to shorten the process? Please advise.

    | younus
    0

  • Hi I am a huge fan of the SEOmoz site and this great community, which has helped me learn the current SEO skills I have now, which are still very basic compared to the pros on the forum. I have tried to follow best practice regarding onsite and technical SEO when developing my new site www.cheapfindergames.com and I would really appreciate it if experts on the forum could spare a minute to critique the site from a search perspective please. This will show which elements of onsite and technical SEO I have done well and which aspects still need work. I am currently trying to build quality links and social mentions into the site, which will take time, and the site has been designed around usability and conversions. Many Thanks Ian

    | ocelot
    0

  • Hey guys, I’ve been reading the blogs and really appreciate all the great feedback. It’s nice to see how supportive this community is to each other. I’ve got a question about near-duplicate content. I’ve read a bunch of great posts regarding what duplicate content is and how to fix it. However, I’m looking at a scenario that is a little different from what I’ve read about, and I’m not sure if we’d get penalized by Google or not. We are working with a group of small insurance agencies that have combined some of their back office work, and work together to sell the same products, but for the most part act as what they are: independent agencies. So we now have 25 different little companies, in 25 different cities spread across the southeast, all selling the same thing. Each agency has their own URL, their own Google local places registration, their own backlinks to their local chambers, own contact us and staff pages, etc. However, we have created landing pages for each product line, with the hope of attracting local searches. While we vary each landing page a little per agency (the auto insurance page in CA talks about driving down the 101, while the auto insurance page in Georgia says welcome to the peach state), probably 75% of the landing page content is the same from agency to agency. There is only so much you can say about specific lines of insurance. They have slightly different titles, slightly different headers, but the bulk of the page is the same. So here is the question: will Google hit us with a penalty for having similar content across the 25 sites? If so, how do you handle this? We are trying to write creative, unique content, but at the end of the day auto insurance in one city is pretty much the same as in another city. Thanks in advance for your help.

    | mavrick
    0

  • Within some of my campaigns I can see issues regarding 404 pages. Then when I export the data to a CSV, sometimes the referring pages that lead to the 404 are not shown. Am I missing something here?

    | 5MMedia
    0

  • I'm auditing a local business's sites (a spa) and I wanted to run my recommendations by everyone. There are 3 sites: www.sitename1.com -- main store location, used for Google Places listing #1 www.sitename2.com -- 2nd store location, used for Google Places listing #2 www.sitename3.com -- used for product sales for both locations Sitename1.com has the most ranking power. I'm going to recommend that they move sitename2.com and sitename3.com to sitename1.com as subfolders, 301 redirecting each page to the corresponding page on sitename1.com/subfolder. Google Places listing #2 would be changed from www.sitename2.com to www.sitename1.com/location2. Any risks or problems with this strategy anyone can see?

    | 540SEO
    0

  • I can buy a domain from a competitor. What's the best way to make good use of its links for my existing website?

    | Archers
    0

  • Probably the complete opposite of all the questions asked on here before, hehe. One of my online stores is ranking in Google at #2 for a service we used to offer but no longer do, as we do not have the contract anymore. We need to derank the listing as fast as we can since we do not offer this service now. Normally we could just get Google to deindex the page, but it's the homepage that is ranking. It is ranking due to a good collection of exact-match backlinks. Have been trying to get our page off the first page for a couple of months, but it appears that we did too good a job of getting it up there... it's not moving! Anyone have a suggestion please on how we can get off page one asap for this keyword without deindexing the homepage for our store? Note: the keyword in question has not appeared on the site for a couple of months; removing it has had no impact on the ranking. Thanks

    | Grumpy_Carl
    1

  • Hi All, I run an e-commerce website with thousands of products.
    On each product page I have a link to a PDF guide of that product. Currently we link to it with a "nofollow" tag. Should we change it to window.open in order not to lose link juice? Thanks

    | BeytzNet
    0
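For reference, the two options the poster contrasts look roughly like this (the PDF path is hypothetical):

```html
<!-- Option 1: a normal crawlable link, with the nofollow currently in use -->
<a href="/guides/product-123.pdf" rel="nofollow">Product guide (PDF)</a>

<!-- Option 2: a JavaScript-only open, which most crawlers will not follow
     and which passes no link equity at all -->
<a href="#" onclick="window.open('/guides/product-123.pdf'); return false;">
  Product guide (PDF)
</a>
```

The trade-off in the question is between a link crawlers can see but are told not to credit, and one they effectively cannot see.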

  • Hi Everyone, Hope you can help me out. We have customer reviews on our product pages and are also ranking highly for those keywords for those product pages. To improve our CTR for those keywords, I think showing those reviews alongside the organic results using rich snippets would be a good way of improving CTR. I have asked my programmer to integrate this; he came back and said that after the rich snippets have been integrated we then have to ask Google to include them in their search results. Is this true? I hope this makes sense. Kind Regards

    | Paul78
    1
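For context: review rich snippets are driven by structured markup in the page itself; there is no manual "ask Google" step, though Google decides at its own discretion whether to display them (a testing tool was available to preview the markup). A minimal microdata sketch with hypothetical product and rating values:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> / 5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

Once Google recrawls pages carrying valid markup, the stars may appear in the snippet, but display is never guaranteed.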

  • Hi. I have 3 campaigns in SEOmoz. 2 are working OK, but one I set up about 3 weeks ago and I still have no details about it; everywhere is '0' and no results at all. Google Analytics shows links etc., but in SEOmoz nothing is moving above 0. The domain is quite new, about 2 months, but I should get some results, shouldn't I? Strange. In Google I was on position 16 for my keywords and now I am 89! Anyone know what could be the reason for no stats in SEOmoz? Thanks

    | sever3d
    0

  • My company is purchasing another company's website. We are moving their entire site onto our CMS and the IT guys are working hard to replicate the URL structure. Several of the category pages are changing slightly and I am not sure if it matters: Old URL - http://www.DOMAIN.com/products/adults New URL - http://www.DOMAIN.com/products/adults/ Notice the trailing slash? Will Google treat the new page as the same as the old one, or as a completely different (i.e. new) page? P.S. - Yes, I can set up 301s, but since these pages hold decent rankings I'd really like to keep it exactly the same.

    | costume
    0
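Google does treat /products/adults and /products/adults/ as two distinct URLs, so a 301 (or a consistent rel=canonical) is the usual safeguard when the trailing-slash form becomes the live one. A hypothetical mod_rewrite sketch for an Apache .htaccess, with an illustrative path pattern:

```apache
# 301 extension-less category URLs to their trailing-slash twins so
# only the new form is indexed (.htaccess context: no leading slash
# in the pattern; skip requests that match a real file).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(products/[^/.]+)$ /$1/ [R=301,L]
```

With the redirect in place, the old no-slash URLs consolidate their signals onto the new pages instead of competing with them.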

  • SEOmoz PRO is giving me back duplicate content errors. However, I don't see how I can get a list of the pages that are duplicates of the one shown. If I don't know which pages/URLs cause the issue, I can't really fix it. The only way would be placing canonical tags, but that's not always the best solution. Is there a way to see the actual duplicate pages?

    | 5MMedia
    0

  • Yes... It iz I again ;o) Here's one for you savvy techies out there: So, I've got a primary domain which is live, optimized and running smooooth. And then I've got a couple of misspelled domains as well (17 to be exact). Will it have an effect if I 301 those misspelled domains? What's best practice for several domain aliases? Example.
    Primary domain:  bryghusprojektet.dk
    Alias domain 1: bryghusprojekt.dk (301 redirects to primary domain)
    Alias domain 2: bryghus-projekt.dk (Hosting company infopage)
    Alias domain 3: bryghus-projekter.dk (Not activated) Regards.

    | nosuchagency
    1

  • Firstly, apologies for the very brief question, as I am mainly looking for your thoughts as opposed to specific help about a specific problem. I am working on a site which has two separate domains and, within one domain, two subdomains. The two different sites both have a high PageRank, PR6 each; one is the corporate site and the other is the company blog. There are also two subdomains within the corporate site; again, both have high PR and tons of content. My question is: would it be better to consolidate all the assets under one domain, or is it better to keep the sites separate, from an SEO perspective that is?

    | LiquidTech
    0

  • Hi, Please excuse me for being too simplistic or dumb. I haven't had any experience in event tracking so far, so please help me out. I want to track how many people have clicked on the "Subscribe for Newsletter" button on the website - http://bit.ly/w7iwdh Please can anyone paste here the code to implement this?

    | seoug_2005
    0
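The poster asks for code outright. With classic ga.js (the current GA library at the time), button clicks are typically tracked with _trackEvent; a sketch, assuming the standard ga.js snippet is already installed on the page and using made-up category/action names:

```html
<!-- Pushes an event onto the _gaq queue created by the ga.js snippet.
     'Newsletter' / 'Subscribe Click' are illustrative labels. -->
<a href="#" class="subscribe-button"
   onclick="_gaq.push(['_trackEvent', 'Newsletter', 'Subscribe Click']); return false;">
  Subscribe for Newsletter
</a>
```

The counts then appear in GA's Events reports under the chosen category.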

  • If a domain loads on both the domain and the IP, is that a problem? So it loads on domain.com and 69.16.....com Thanks!

    | tylerfraser
    0

  • Hello, We have a very important keyword for our website (www.decorplanet.com) that hasn't moved in any direction in over a year. The keyword is "bathroom vanities". Our current SEO company has told us that the previous company "over-optimized/linked" for this particular keyword and now Google is penalizing us for it. They told us that we need to leave it alone for a while and concentrate on other keywords, and this one should naturally come back up (I should mention that at some point we were much higher than where we are right now for that particular keyword). Many of our other high-profile keywords have been moving nicely to the first page. Keywords like "modern bathroom vanities", "antique bathroom vanities", "contemporary bathroom vanities", but the one that we really want ("bathroom vanities") hasn't budged at all. Does anyone have any thoughts on this? Is it possible that Google put us in some sort of "sandbox"? I mean, we are on page 2, so it doesn't sound like that's the case. It's just very bizarre that we can't seem to do anything with this keyword. Really appreciate any thoughts or input on this.

    | steven1133
    0

  • We were specifically targeting the Capital Region of New York and environs, and now our goal is to broaden the net to reach potential clients. Should I drop the location terms we already have baked into the copy and add broader location terms, or just add newer terms? We're developing a new design that sharpens our focus, but here is what we have now: http://www.behancommunications.com Thanks for any suggestions

    | PatDowd
    0

  • On our main site (http://www.deeperblue.com) we've been syndicating posts (not the full posts, just a link and short extract) from a trusted partner of ours. These posts are listed as Diverwire Staff and point directly back to the original website. What I'm concerned about is the impact on SERPs - we don't want to be penalised by any of the search engines.

    | StephanWhelan
    0

  • Hello, I recently did an SEOmoz crawl for a client site. As is typical, the most common errors were duplicate page titles and duplicate content. The client site is a custom social network for researchers. Most of the pages showing as duplicates are simple variations of each user's profile, such as comment sections, friends pages, and events. So my question is: how can we limit duplicate content errors for a complex site like this? I already know about the rel=canonical tag and rel=next tag, but I'm not sure if either of these will do the job. Also, I don't want to lose potential links/link juice for good pages. Are there ways of using the "noindex" tag in batches? For instance: noindex all URLs containing this character? Or do most CMSs allow this to be done systematically? Anyone with experience doing SEO for a custom social network or forum, please advise. Thanks!!!

    | BPIAnalytics
    0

  • I have run SEOmoz software for a client's site. It's showing that virtually every single page has too many links. For instance this URL: http://www.golfthere.com/AboutUs Am I missing something? I do not see 157 links on this page.

    | ToddKing
    0

  • SEOmoz says that my site has 150 canonical URLs and lists that as a potential problem. It's a check box in the settings for Platinum SEO, and here is the description it provides: "Choose this option to set up canonical URLs for your Home page, Single Post, Category and Tag Pages." I have the option engaged. So I was trying to figure out the best thing to do. I have already instructed it to automatically make 301 redirects for any permalink changes and have instructed it to "noindex" tag archives, RSS comment feeds, and RSS feeds. I've only been doing this for about a year and am really confused right now. After reading most of your posts about the subject I have a much better understanding, but am still very confused. Help.. Please...

    | pressingtheissue
    0

  • Hi - wondering what one should do with a mobile theme in relation to Google. We are using WordPress and then a plugin that detects whether or not the user is on a mobile device. Should you nofollow the mobile theme since it is effectively the same content as the main site? Or just leave it be and not worry about it. Thanks!

    | leeg
    0

  • When we say "link juice", does it mean that a particular page has link juice (due to backlinks pointing towards the page), or that each link on that page has link juice which it passes to the target page? I suppose "link juice" is different from PageRank?

    | seoug_2005
    0

  • Hi, Just a quick note. I hope someone is able to advise. To cut a long story short, I have a page (or pages) that receives multiple syndicated feeds.
    We are using the content for its value to visitors. I am very happy to cross-domain rel=canonical the source and not incur any of Panda's wrath, but would like to know if one can add multiple rel=canonicals to one page to reference multiple sources. Appreciate your help. Many thanks Dave Upton

    | daveupton
    0
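Worth noting for context: a page can carry only one rel=canonical, and search engines may ignore the hint entirely if several are present, so one page cannot point at multiple sources this way. The cross-domain form the poster mentions looks like this (URL hypothetical):

```html
<!-- In the <head> of the syndicating page; only one such tag per page -->
<link rel="canonical" href="http://original-source.com/original-article" />
```

When a page aggregates several sources, the canonical approach only works if each source's content lives at its own URL on the syndicating site.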

  • Hi, my client has asked if we can SEO their .dk site. My question is: does all link building and article submission have to be in Danish?

    | Westernoriental
    0

  • Relative to Cyrus Shepard's article on January 4th regarding Google's Superior SEO strategy: if I'm the primary author of all blog articles and website content, and I have a link showing authorship going back to Google Plus, is a site-wide link from the home page enough, or should that show up on all blog posts and editorial comment pages etc.? Conversely, should the author's name appear in the meta description or title tag of my website, just as you would your key search phrase, since Google appears to be trying to make a solid connection with my name and all content?

    | lwnickens
    0

  • Has anyone else had the issue for a couple of months now where OSE is showing links as live when they are not? I am doing SEO for a big e-commerce website, so it is important that I can find out what the anchor text distribution is for the domain. How is this possible when OSE shows me links which are not live? There have been 2 or 3 updates and it is still showing expired links in our backlink profile. This is very frustrating and has led me to see if anyone else can recommend an alternative which is more up to date. And yes, before SEOmoz reply, I know you only update a certain percentage of the web.

    | pauledwards
    0

  • Hi On a Magento instance we manage there is an advanced search. As part of the ongoing enhancement of the instance we altered the advanced search options so there are fewer, more relevant ones. The issue is that Google has crawled and catalogued the advanced search with the now-removed options in the query string, and keeps crawling these out-of-date advanced searches. These stale searches now return a 500 error, and currently Google is attempting to crawl them twice a day. I have implemented the following to stop this: 1. Requested the URL be removed via Webmaster Tools, selecting the directory option using uri: http://www.domian.com/catalogsearch/advanced/result/ 2. Added Disallow to robots.txt: Disallow: /catalogsearch/advanced/result/* Disallow: /catalogsearch/advanced/result/ 3. Added rel="nofollow" to the links in the site linking to the advanced search. Below is the list of links it is crawling or attempting to crawl, 12 links crawled twice a day, each resulting in a 500 status. Can anything else be done?
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=94&category=55&color_layered=128&csize[0]=0&fabric=92&inventry_status=97&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=115&category=55&color_layered=130&csize[0]=0&fabric=0&inventry_status=97&length=116&price=3%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=94&category=55&color_layered=126&csize[0]=0&fabric=92&inventry_status=97&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=137&csize[0]=0&fabric=93&inventry_status=96&length=0&price=8%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=142&csize[0]=0&fabric=93&inventry_status=96&length=0&price=4%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=137&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=142&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=135&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=128&csize[0]=0&fabric=93&inventry_status=96&length=0&price=5%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=127&csize[0]=0&fabric=93&inventry_status=96&length=0&price=4%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=127&csize[0]=0&fabric=93&inventry_status=96&length=0&price=3%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=128&csize[0]=0&fabric=93&inventry_status=96&length=0&price=10%2C10
    http://www.domain.com/catalogsearch/advanced/result/?bust_line=0&category=55&color_layered=122&csize[0]=0&fabric=93&inventry_status=96&length=0&price=8%2C10

    | Flipmedia112
    0

  • Is it a bad SEO move to have your checkout process on a separate domain instead of the main domain for an ecommerce site? There is no real content on the checkout pages and they are completely new pages that are not indexed in the search engines. Due to the backend architecture it is impossible for us to have them on the same domain. An example is this page: http://www.printingforless.com/2/Brochure-Printing.html One option we've discussed is to not pass PageRank on to the checkout domain by iframing all of the links to the checkout domain. We could also move the checkout process to a subdomain instead of a new domain. Please ignore the concerns with visitor security and conversion rate. Thanks!

    | PrintingForLess.com
    0

  • Hi SEOMoz gang! Been a long-time reader and hanger-outer here, but now I need to pick your brains. I've been working on two websites in the last few days which are showing very strange behaviour with 301 redirects. Site A This site is an ecommerce site stocking over 900 products and 000's of motor parts. The old site was turned off in Feb 2011 when we built them a new one. The old site had terrible problems with canonical URLs, where every search could/would generate a unique ID, e.g. domain.com/results.aspx?product=1234. When you have 000's of products and Google can find them, it is a big problem. Or was. We launched the new site and 301'd all of the old results pages over to the new product pages and deleted the old results.aspx. The results.aspx page didn't index or get shown for months. Then about two months ago we found certain conditions under which we wouldn't get the right 301 working, so we had to put the results.aspx page back in place. If it found the product, it 301'd; if it didn't, it redirected to the sitemap.aspx page. We found recently that some bizarre scenario actually caused the results.aspx page to 200 rather than 301 or 404. Problem. We found this last week after our 404 count in GWMT went up to nearly 90k. This was still odd, as the results.aspx format was of the OLD site rather than the new. The old URLs should have been forgotten about after several months but started appearing again! When we saw the 404 count get so high last week, we decided to take severe action and 301 everything which hit the results.aspx page to the home page. No problem, we thought. When we got into the office on Monday, most of our product pages had been dropped from the top 20 placings they had (there were nearly 400 rankings lost) and on some phrases the old results.aspx pages started to show up in their place!! Can anyone think why old pages, some of which have been 301'd over to new pages for nearly 6 months, would start to rank?
Even when the page didn't exist for several months? Surely if they are 301s, then after a while they should start to get lost in the index? Site B This site moved domain a few weeks ago. Traffic has been lost on some phrases, but this was mainly due to old blog articles not being carried forward (what I'll call noisy traffic which was picked up by accident and had bad on-page stats). No major loss in traffic on this one, but again bizarre errors in GWMT. This time, pages which haven't been in existence for several YEARS are showing up as 404s in GWMT. The only place they are still noted anywhere is in the redirect table on our old site. The new site went live and all of the pages which were in Google's index and in OpenSiteExplorer were handled in a new 301 table. The old 301s we thought we didn't need to worry about, as they had been going from old page to new page for several years and we assumed the old pages had delisted. We couldn't see them anywhere in any index. So... my question here is why would some old pages which have been 301'ing for years now show up as 404s on my new domain? I've been doing SEO on and off for seven years, so I think I know most things about how Google works, but this is baffling. It seems that two different sites have failed to prevent old pages from cropping up which were 301'd for either months or years. Does anyone have any thoughts as to why this might be the case? Thanks in advance. Andy Adido

    | Adido-105399
    0

  • I've been viewing an SEO company's website that claims to get small business websites to Google page 1 for free, or starting at $150/mo. I've noticed that on all the websites this company has done work on, they include in the footer (usually as a watermark) all the keyword phrases. There don't appear to be any sites that have been penalized. Isn't this poor SEO practice? I've included a screenshot of what I'm talking about. I just want to be clear. Thank you for your input.

    | JulB
    0

  • Hello, This is my robots.txt file: http://www.theprinterdepo.com/Robots.txt However, I have 8000 warnings on my dashboard like the one below. What am I missing in the file? Crawl Diagnostics Report - On-Page Properties: Title: Not present/empty. Meta Description: Not present/empty. Meta Robots: Not present/empty. Meta Refresh: Not present/empty. URL: http://www.theprinterdepo.com/catalog/product_compare/add/product/100/uenc/aHR0cDovL3d3dy50aGVwcmludGVyZGVwby5jb20vaHAtbWFpbnRlbmFjZS1raXQtZm9yLTQtbGo0LWxqNS1mb3ItZXhjaGFuZ2UtcmVmdWJpc2hlZA,,/ 0 Errors: No errors found! 1 Warning: 302 (Temporary Redirect), found about 5 hours ago.

    | levalencia1
    0

  • What is this robots.txt telling the search engines? User-agent: * Disallow: /stats/

    | DenverKelly
    0

  • Hi everyone, Let's say we have a robots.txt that disallows specific folders on our website, but a sitemap submitted in Google Webmaster Tools that lists content in those folders. Who wins? Will the sitemap content get indexed even if it's blocked by robots.txt? I know content that is blocked by robots.txt can still get indexed and display a URL if Google discovers it via a link, so I'm wondering if that would happen in this scenario too. Thanks!

    | anthematic
    0

  • Hi, I have just done an On-Page grade report for my site www.in2town.co.uk and found that I had a number of H1s which were not doing my SEO any good. I have sorted most of the H1 problems out, but the report is still showing that I have two H1s and I cannot find them. I have found one, which is a short description of the site under the main banner, but I cannot find the second H1. Can anyone please let me know if there is a simple way of finding the other H1 so I can deal with it? Many thanks

    | ClaireH-184886
    0

  • Hey guys and gals, First Moz Q&A for me and really looking forward to being part of the community. I hope as my first question this isn't a stupid one, but I was just struggling to find any resource that dealt with the issue and am just looking for some general advice. Basically a client has raised a problem with 404 error pages - or the lack thereof - on non-existent URLs on their site; let's say for example: 'greatbeachtowels.com/beach-towels/asdfas' Obviously content never existed on this page, so it's not like you're saying 'hey, sorry this isn't here anymore'; it's more like 'there was never anything here in the first place'. Currently in this fictitious example, typing in 'greatbeachtowels.com/beach-towels/asdfas' returns the same content as the 'greatbeachtowels.com/beach-towels' page, which I appreciate isn't ideal. What I was wondering is how far do you take this issue - I've seen examples here on the seomoz site where you can edit the URI in a similar manner and it returns the same content as the parent page but with the alternate address. Should 404s be added across all folders on a site in a similar way? How often would this scenario be an issue, particularly for internal pages two or three clicks down? I suppose unless someone linked to a page with a misspelled URL... Also, would it be worth placing 301 redirects on a small number of common misspellings or typos, e.g. 'greatbeachtowels.com/beach-towles' to the correct URLs, as opposed to just 404s? Many thanks in advance.

    | AJ234
    0
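For the misspelling idea in the question's last paragraph, per-URL 301s are straightforward at the server level, and unknown paths should fall through to a genuine 404 rather than echoing the parent category's content. A hypothetical Apache sketch using the post's example paths:

```apache
# 301 a known common typo to the real category page (mod_alias).
Redirect 301 /beach-towles /beach-towels

# Anything that matches no real resource should return a true 404
# status with a helpful error page, not a 200 with duplicate content.
ErrorDocument 404 /404.html
```

Serving a real 404 status on unknown URLs also prevents the "soft 404" duplicate-content situation the poster describes.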
