
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • We just started an "In the News" section on our webpage. We are not sure what would be best for SEO purposes. Should we link to the news websites that have the stories about our company, even if they have no link back? Or should we just take screenshots of the news articles and only link to articles that link back to us (this is what we are currently doing)? Here is our news page: http://www.buyautoparts.com/News/

    | joebuilder
    0

  • My website focuses on movie posters. I'm having a little debate on the best way to name and link images. The current image locations are stored like this: /movie-name/poster-1.jpg
    /movie-name/poster-2.jpg Is it best to leave it like that or change it to: /movie-name/movie-name-poster-1.jpg
    /movie-name/movie-name-poster-2.jpg The reason I ask is that I read today that Google uses the image name to help detect what the image is about. At the same time, if the movie name is in the folder structure along with the image name... wouldn't it start to look like keyword stuffing?

    | thedevilseeker
    0

  • So I have several clients who have been blogging for a few years on Blogger and self-hosted WordPress. They also have their "main site" on a different URL. What is the current thinking on what to do with the content? The "main sites" could use a bit of a boost and I know the content would help, so I know I can 301 redirect everything from the current blogs to a new home on the main site. What I am thinking is to move most of the posts to the main site with redirects, but leave a few posts organized around a theme (and maybe write a few more) and keep that property up and "open for business" so the links from it have some value to the main site, we can get G-plus author attribution on several sites in their area of expertise, and maybe we can get some extra pages to rank in the top 7. Does this seem like a reasonable strategy?

    | dot-B-dot-B
    0

  • I want to know the best practices for search engine optimization in 2013, and I also need the best possible sources. Thanks

    | GM007
    0

  • Hi guys, We're planning to change the URL structure for our product pages (to make them more SEO friendly), and it's obviously something very sensitive given the 301 redirects we have to take care of... I have a doubt about Mister Google: if we make that change slowly (area by area, to minimize the risk of problems in case of a bad 301 redirect), would we lose rankings in the search engine? (I'm wondering if Google might consider our website "incoherent" -> not the same product page URL structure for all product pages for some time.) Thanks for your kind opinion 😉

    | Kuantokusta
    0

  • Hi all, I have an ecommerce client, and on their pages they have a drop-down so customers can sort by price, list view, etc. Naturally I want a canonical tag on these pages; here's the question. As they have different pages of products, the canonical tag on http://www.thegreatgiftcompany.com/occassion/christmas#items-/occassion/christmas/page=7/?sort=price_asc,searchterm=,layout=grid,page=1 points to http://www.thegreatgiftcompany.com/occassion/christmas#items-/occassion/christmas/page=7. Now, because page=7 is a duplicate of the main page, shouldn't the canonical just point to the main page rather than page=7, even when there is a canonical tag on the /Christmas/page=7 page pointing to the /Christmas page? Hope that makes sense to everyone!
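    For reference, a rel=canonical pointing the paginated/sorted variants at the main category page would look roughly like this (a minimal sketch using the example URL from the question; whether paginated pages should canonicalize to page 1 at all is a separate judgment call). Also worth noting: everything after the # in those URLs is a fragment, which crawlers generally ignore, so the canonical target should be the clean category URL.
        <!-- In the <head> of the paginated/sorted variants -->
        <link rel="canonical" href="http://www.thegreatgiftcompany.com/occassion/christmas" />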

    | KarlBantleman
    0

  • Hello here. I own an e-commerce website (virtualsheetmusic.com), and since we have implemented structured data on our product pages, our search results on Google now appear with pricing information, whereas most of our competitors don't have that information displayed (yet). I am wondering: do you think that is good? What side effects could it cause? Lower CTR? A lower bounce rate? Less traffic? Any thoughts on this issue are very welcome. Thanks!
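    For context, the pricing rich snippet comes from schema.org Product/Offer markup along these lines (a hedged, stripped-down sketch; the product name and price are placeholders, not values from the site in question):
        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Example Product Title</span>
          <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
            <span itemprop="price">9.99</span>
            <meta itemprop="priceCurrency" content="USD" />
          </div>
        </div>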

    | fablau
    0

  • I have lots of news on my website, and unlike other types of content, news posts quickly become obsolete and get a high bounce rate. I have reasons to think that the news on my website might be partly responsible for a Panda penalty, but I'm not sure. There are over 400 news posts on the blog from the last 4 years, so that's still a lot of content. I was thinking of isolating the news articles on a subdomain (news.mywebsite.com). If the news plays a part in the Panda penalty, would that remove the penalty from the main domain?

    | sbrault74
    0

  • Hello, I own Foodio54.com, which provides restaurant recommendations (mostly for the US). I apologize in advance for the lengthy questions below, but we're not sure what else to do. On May 8 we first noticed a dip in Google results; however, the full impact of this sudden change was masked by an increase in Mother's Day traffic and is only today fully apparent. It seems as though we've lost between 30% and 50% of our traffic. We have received no notices in Google Webmaster Tools of any unnatural links, nor do we engage in link buying or anything else that's shady, and we have no reason to believe this is a manual action. I have several theories and I was hoping to get feedback on them or anything else that anyone thinks could be helpful.
    1. We have a lot of pictures of restaurants; each picture has its own page, and these pages, aside from the image, are very similar. I decided to put a noindex,follow on the picture pages (just last night), especially considering Google's recent changes to image search that send less traffic anyway. Is there any way to remove these faster? There are about 3.5 million of them. I was going to exclude them in robots.txt, but that won't help the ones that are already indexed. Example photo page: http://foodio54.com/photos/trulucks-austin-2143458
    2. We recently (within the last 2 months) got menu data from SinglePlatform, which also provides menus to UrbanSpoon, Yelp and many others. We were worried that adding a page just for menus that was identical to what is on Urbanspoon et al. would just be duplicate content, so we added these inline with our listing pages. We've added menus on about 200k listings.
    A. Is Google considering this entire listing page duplicate content because the menu is identical to everyone else's?
    B. If it is, should we move the menus to their own pages and just exclude them with robots.txt? We have an idea for how to make these menus unique to us, but it's going to be a while before we can create enough content to make that worthwhile. Example listing with menu: http://foodio54.com/restaurant/Austin-TX/d66e1/Trulucks
    3. Anything else? Thank you in advance. Any insight anyone in the community has would be greatly appreciated. --Mike Van Heyde
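    On point 1, a noindex,follow meta tag only takes effect if Googlebot can still fetch the page, so the photo pages must not also be disallowed in robots.txt. A minimal sketch of the tag (the path in the comment is just the example photo page from the question):
        <!-- In the <head> of each photo page, e.g. /photos/trulucks-austin-2143458 -->
        <meta name="robots" content="noindex,follow">
    Apart from a directory-level removal request in Webmaster Tools, removal generally only happens as pages get recrawled, which for 3.5 million URLs will take a while.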

    | MikeVH
    0

  • Hi:
    I'm in the process of making sure I have a single URL that gets all the link juice for every page. As the first step, I'm implementing 301 redirects. My question is: after the 301 redirect is in place, will it still be necessary to update links on my site(s) that point to the old URLs? For example, I just 301-redirected mysite.com/folder/index.html to mysite.com/folder/ - and it works fine, but I still have many links on the same site and other sites that link to mysite.com/folder/index.html. Is it important to change those links to mysite.com/folder/ as well, or could I just leave them as they are, since the 301 is taking care of the redirect anyway? I mean, I will probably change them in time, just to keep things tidy. But if it's important I would definitely do it sooner rather than later. Thank you in advance.
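    For what it's worth, a server-level rule can also catch any stray index.html requests from links that never get updated. A rough Apache sketch (assumes mod_rewrite is available; adapt to the actual server):
        # .htaccess - 301 any /folder/index.html request to /folder/
        RewriteEngine On
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.*)index\.html\ HTTP [NC]
        RewriteRule ^ /%1 [R=301,L]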

    | romanbond
    0

  • We have about 20 testing environments blocked by robots.txt, and these environments contain duplicates of our indexed content. They are all blocked by robots.txt and appear in Google's index as "blocked by robots.txt" -- can they still count against us or hurt us? I know the best practice for permanently removing them would be to use the noindex tag, but I'm wondering whether, if we leave them the way they are, they can still hurt us.

    | nicole.healthline
    0

  • I was hit hard by Penguin on 4-24-12. I never received a bad-link warning from Google, but I do have lots of bad links from an SEO company years ago. I canceled with them and they kept sending automated links with the same anchor text. I even asked Google to remove the bad links years before the Penguin update. I have removed many bad links by contacting webmasters and have used the disavow tool twice. Six months later, still no improvement. Is it possible that Google needs more time to process the disavowed links? What about a 301 redirect to a new domain? Has anyone ever had long-term success with a 301? What about a 301 after using the disavow tool - any luck? What about changing my domain - will it take months for the site to move up? Anyone with experience trying this? Let me know any ideas. Thanks
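    For reference, the disavow file itself is just a plain-text list uploaded through the Disavow Links tool, with one URL or "domain:" entry per line. A small sketch with made-up domains (not real data from this case); it reportedly only takes effect once the disavowed pages have been recrawled, which can take weeks to months:
        # Links from the old SEO company's network - contacted webmasters, no response
        domain:spammy-directory-example.com
        domain:auto-link-network-example.net
        # Individual URLs where only one page links to us
        http://blog-example.org/spun-article-with-anchor-text/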

    | DWatters
    0

  • Hello All, I launched my site on May 1 and, as it turns out, another domain was pointing its A record to my IP. This site is coming up as malicious, but worst of all, it's ranking on keywords for my business objectives with my content and metadata, therefore I'm losing traffic. I've had the domain host remove the incorrect A record, I've submitted numerous malware reports to Google, and I've attempted to request removal of this site from the index. I've resubmitted my sitemap, but it seems as though this offending domain is still being indexed more thoroughly than my legitimate domain. Can anyone offer any advice? Anything would be greatly appreciated! Best regards, Doug
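    One server-level measure worth knowing about here: force a 301 to the canonical hostname for any request arriving with an unexpected Host header, so the offending domain stops serving the site's content under its own name. A hedged Apache sketch (assumes mod_rewrite; www.example.com is a placeholder for the real domain):
        # .htaccess - redirect any non-canonical Host header to the real domain
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]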

    | FranGen
    0

  • Hello, I seem to be having a bit of a dilemma with a crucial site architecture decision about which high-traffic keyword I should put in my primary navigation menu. I am the owner of a computer repair business that I am currently rebranding out of necessity for a few reasons. My existing business website has been established for 5 years now, I do all of the SEO, and I have been on the first page of Google for anything computer-repair related since day 1. However, since I am rebranding my company and migrating from Joomla to WordPress, it is a great time to make some positive and effective changes to my site architecture. I am going to be using silo site architecture on the new site, and I have a very firm working knowledge of the process, but I seem to have hit a snag with one of my primary navigation categories for the silo theme. My specific question is this: doing keyword research, the phrase "Computer Repair" is the most highly searched keyword phrase for people who have computer-related problems (naturally), and ideally "Computer Repair" should be one of my main menu navigation silo category themes. But here lies the problem: if I go with "Computer Repair" in the main nav menu, then although it gets 823,000 local monthly searches, I would be opening myself up to a potential problem, because most people associate the phrase "computer repair" with desktop computer repair. So in essence I would be forced to use an alternative to "Computer Repair" for the desktop repair structure in the silo theme (sidebar nav menu). The phrase "Desktop Repair" gets only 12,100 local monthly searches, so basically no one uses the search phrase "desktop repair" when they are looking to get their computer repaired. I hope that I did not just confuse you? Still confused? Continue reading and I will dissect my psychobabble for you.
    The semantic/historical logic: historically, a desktop has always been referred to as a computer. Hence the reason why, even today, when our desktop has problems and we need to get it fixed, we search for "computer repair". Why is that? Long before we had laptops, netbooks, tablets and smartphones, we had the all-encompassing and mighty "computer" that allowed us to connect to the rest of the world. It was not until laptops came about that there was a need for a classification system, and the almighty "computer" became a "desktop computer". So, there you have it. This is the reason why "computer repair" is synonymous with "desktop repair" and why no one searches for "desktop repair" when their desktop computer is broken.
    Actual examples below. If I go with Example A, I have the highest-traffic keyword phrase in my masthead (main nav menu) but would be forced to use "Desktop Repair" to classify desktop repair in the sidebar nav menu, instead of using the keyword phrase "Computer Repair".
    Example A:
    Main nav theme category = "Computer Repair" = 823,000 local monthly searches
    Child pages/categories:
    - Desktop Repair = 12,100 local monthly
    - Laptop Repair = 165,000 local monthly
    - Tablet Repair = 165,000 local monthly
    - Remote Desktop = 1,000,000 local monthly
    (I am using WordPress pages/child pages, not categories and posts.)
    So, as you can see from Example A above, not being able to use the keyword phrase "Computer Repair" to classify the "Desktop Repair" section opens me up for failure to a good extent, as most of my business is done on regular desktop computers and people generally think "computer repair" when they are searching to have their desktop repaired.
    Example B:
    Main nav theme category = "Computer Service" = 246,000 local monthly searches
    Child pages/categories:
    - Computer Repair = 823,000 local monthly
    - Laptop Repair = 165,000 local monthly
    - Tablet Repair = 165,000 local monthly
    - Remote Desktop = 1,000,000 local monthly
    (I am using WordPress pages/child pages, not categories and posts.)
    Now, with Example B, even though the keyword phrase "Computer Service" is not the more favorable item to have as the silo theme category in the main navigation menu, it is much more favorable in terms of local monthly searches than the just-about-never-searched-for phrase "Desktop Repair". So as you see, I have a bit of a dilemma that a more experienced SEO could counsel me on. The question is, in your experience, which scenario would you see as more favorable for the site architecture, Example A or Example B? This brings me to my next question, which also creates some confusion for me. If you think Example B would be my better bet, what would you recommend I do with the URL structure if "Computer Service" is the parent page for the silo theme? Example: I am using the /%category%/%postname%/ permalink structure for the silo site architecture for the blog section only, and am using WP pages and child pages for my silo content for my services (not posts). Would this URL be a problem in Google's eyes or a customer's eyes and be perceived as spammy: http://www.pcmedicsoncall.com/computer-services/computer-repair/ ? More than likely I would say yes, because it looks that way to me! My question in regards to the link structure above is: if I take the "Computer Service" page and change the slug to "services", yes it will look better, but... will that effectively work against me??? EDIT: ^^ Answered my own question on the services part directly above. ^^ Thank you for reading my very long-winded questions, but I am pretty detailed and I think the better I explain it, the less guessing there will be for everyone concerned. Thank you very much and I look forward to your insightful expertise and wisdom. Marshall
    [Attached: COMPUTER-SERVICE-MAST-HEAD.png]

    | MarshallThompson31
    0

  • Without getting into the debate/discussion about which server-side language should or should not be used, I am faced with the reality of moving an old ASP.NET site to a ColdFusion one with a different domain and different folder structure. Example: www.thissite.com/animals/lion.aspx --> www.thatsite.com/animals/africa/lion.cfm What is the best way to redirect individual .aspx pages to their .cfm counterparts, keeping in mind that, in many cases, the folder paths will be different? If it would mean less work, I am hoping this can be done at the server level (IIS 6) rather than by modifying the code on each now-defunct page. And on a related note, how long should any redirects be kept in place? My apologies if this has been answered in this forum in the past, but I did do a lot of searching first (both here and elsewhere) before posting this query.
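    IIS 6 has no built-in rewrite engine, so server-level redirects there are typically handled with a third-party rewrite filter (for example ISAPI_Rewrite, which uses Apache-style mod_rewrite syntax) or a custom handler. A hedged sketch of what the per-page mapping could look like under an Apache-syntax filter; the second rule is an invented example purely to show the pattern of one explicit rule per changed folder path:
        # Hypothetical rules for an Apache-syntax rewrite filter on the old IIS 6 site
        RewriteEngine On
        # One explicit rule per page whose folder path changed on the new site
        RewriteRule ^animals/lion\.aspx$  http://www.thatsite.com/animals/africa/lion.cfm [R=301,L]
        RewriteRule ^animals/tiger\.aspx$ http://www.thatsite.com/animals/asia/tiger.cfm  [R=301,L]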

    | hamackey
    0

  • I have duplicate content for all of the products I sell on my website due to categories and subcategories. Ex: http://www.shopgearinc.com/products/product/stockfeeder-af38.php http://www.shopgearinc.com/products/co-matic-power-feeders/stockfeeder-af38.php http://www.shopgearinc.com/products/co-matic-power-feeders/heavy-duty-feeders/stockfeeder-af38.php Above are 3 URLs with the same title and content. I use a third-party developer's backend system, so doing canonicalization seems difficult as I don't have full access. What is the best way to get rid of this duplicate content? Can I do it through Webmaster Tools, or should I pay the developer to do the canonicalization or a 301 redirect? Any suggestions? Thanks
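    If the third-party backend can't be changed but there is access to the web server configuration, plain 301s that collapse the category-path versions onto one URL would achieve much the same thing as a canonical tag. A sketch using the example URLs from the question (assumes Apache/mod_alias; the cleaner long-term fix is still having the developer add rel=canonical):
        # .htaccess - collapse the category-path duplicates onto one canonical URL
        Redirect 301 /products/co-matic-power-feeders/stockfeeder-af38.php /products/product/stockfeeder-af38.php
        Redirect 301 /products/co-matic-power-feeders/heavy-duty-feeders/stockfeeder-af38.php /products/product/stockfeeder-af38.php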

    | kysizzle6
    0

  • I have a client that is targeting some product-related keywords. They are on page one for them, but Amazon, OfficeMax and Staples are ranking in the top 3 spots for this specific product. Before I start targeting completely different words, do you have any advice on how to tackle big-name ecommerce sites that are ranking higher than you? Thank you!

    | TheOceanAgency
    0

  • Hi, Why would a website with good Moz stats such as DA/PA 45, mR/mT 5.0+ have 0 PageRank? Have these sites done something? I have seen some sites with similar Moz stats have PR3/4 and when I have checked a year later the PR has dropped to 0. Does that indicate Google has hit those sites and removed their PageRank? Thanks.

    | Bondara
    0

  • Sorry to ask a really dumb question. I want to sort out a load of old 404 errors. I've exported the list of URLs and I'm more than happy to go through it and work out what needs to go where. After that, my only option at the moment is to use the redirect function in my WordPress install and do all the work manually. There are loads to do, so I want to be able to upload all the redirects at once. I know I need to create an htaccess file and upload it, and I know where to upload it. This is where I get nervous: I need to get this file right. Is there a really obvious, idiot-proof example file I can use and then save as the correct file type? I've got all the URLs in a CSV at the moment. Sorry for being a bit thick. Hope you can help.
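    Since the old and new URLs are already in a CSV, the .htaccess lines don't need to be typed by hand - a few lines of script can generate them. A rough PHP sketch (the redirects.csv filename, the column order and the output filename are all assumptions; adjust to match the actual export, and keep a backup of the existing .htaccess before pasting anything in):
        <?php
        // generate-redirects.php - turn a two-column CSV into Redirect 301 lines for .htaccess
        $in  = fopen('redirects.csv', 'r');   // column 1: old path (e.g. /old-page/), column 2: new URL
        $out = fopen('redirects.txt', 'w');   // review this file, then paste it into .htaccess

        while (($row = fgetcsv($in)) !== false) {
            if (count($row) < 2) {
                continue;                     // skip blank or malformed rows
            }
            fwrite($out, 'Redirect 301 ' . trim($row[0]) . ' ' . trim($row[1]) . "\n");
        }

        fclose($in);
        fclose($out);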

    | GlobalLingo
    0

  • One of our shops got a Panda penalty back in September. We sell all our items with the same product name and same product description on amazon.com, amazon.co.uk, ebay.com and ebay.co.uk as well. Have you ever seen a case where such multichannel sales caused a Panda penalty?

    | lcourse
    0

  • Hi, I have recently installed WordPress and started a blog, but now loads of duplicate pages are cropping up for tags, authors, dates, etc. How do I do the canonical thing in WordPress? Thanks Ian
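    WordPress already outputs rel=canonical on single posts and pages; for tag, author and date archives the usual fix is to noindex them, either with an SEO plugin (Yoast, All in One SEO, etc.) or a small snippet in the theme's functions.php. A hedged sketch using core conditional tags (the function name is made up):
        <?php
        // Add noindex,follow to tag, author and date archives to keep them out of the index
        function myprefix_noindex_thin_archives() {
            if ( is_tag() || is_author() || is_date() ) {
                echo '<meta name="robots" content="noindex,follow" />' . "\n";
            }
        }
        add_action( 'wp_head', 'myprefix_noindex_thin_archives' );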

    | jwdl
    0

  • Our website, which had been ranking at number 1 in Google.co.uk for our 2 main search terms for over three years, was hacked into last November. We rebuilt the site but slipped down to number 4. We were hacked again 2 weeks ago and are now at number 7. I realise that this drop may not be just a result of the hacking, but it can't have helped. I've just accessed our Google Webmaster Tools accounts and these are the current results: 940 Access Denied errors, 197 Not Found. The 940 Access Denied errors apply to all of our main pages plus.... Is it likely that the hacking caused the Access Denied errors, and is there a clear way to repair them? Any advice would be very welcome. Thanks, Colin

    | NileCruises
    0

  • We have a website that is ~10 years old and a PR 6. It has a bunch of legitimate links from .edu and .gov sites. Until now the owner has never blogged or added much content to the site. We have suggested that to grow his traffic organically he should add a WordPress blog and get aggressive with his content. The IT guy is concerned about putting a WordPress blog on the same server as the main site because of security issues with WP; they have a bunch of credit card info on file. So, would it be better to just put the blog on a subdomain like blog.mysite.com, OR host the blog on another server but have the URL structure be mysite.com/blog? I want to pass as much juice as possible. Any ideas?
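    For the "another server but mysite.com/blog" option, the usual mechanism is a reverse proxy: the main server forwards /blog requests to the machine actually running WordPress, so the public URL keeps the main domain. A rough Apache sketch (assumes mod_proxy is enabled; the internal hostname is a placeholder), with WordPress's Site URL on the blog server set to http://www.mysite.com/blog so it generates the right public-facing links:
        # Main site's Apache config - proxy /blog to the separate WordPress server
        ProxyPass        /blog http://wp-internal.example.com/blog
        ProxyPassReverse /blog http://wp-internal.example.com/blog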

    | jasonsixtwo
    0

  • We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block them because our clients want to be able to view these applications without needing to log in. What is the next best solution?
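    One approach that removes pages without a login and without relying on robots.txt (which blocks crawling but not indexing) is to serve an X-Robots-Tag noindex header across the whole test subdomain while leaving it crawlable. A hedged Apache sketch (assumes mod_headers; the robots.txt block has to come off so Googlebot can actually see the header):
        # In the test subdomain's vhost or .htaccess - requires mod_headers
        Header set X-Robots-Tag "noindex, nofollow"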

    | nicole.healthline
    0

  • Hi all, I hope you can spend some time to answer my first of a few questions 🙂 We are running a Magento site - the layered/faceted navigation nightmare has created thousands of duplicate URLs! Anyway, during my process of tackling the issue, I disallowed in robots.txt anything in the querystring that was not "p" (allowed for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html search and a few duplicates came up along with the original, with
    "There is no information about this page because it is blocked by robots.txt". So I had also added meta noindex,follow on all these duplicates, but I guess it wasn't being read because of robots.txt. So coming to my question: did robots.txt block access to these pages? If so, were these already in the index, and after disallowing them with robots.txt, Googlebot could not read the meta noindex? Does meta noindex,follow on pages actually help Googlebot decide to remove those pages from the index? I thought robots.txt would stop and prevent indexation? But I've read this:
    "Noindex is a funny thing, it actually doesn’t mean “You can’t index this”, it means “You can’t show this in search results”. Robots.txt disallow means “You can’t index this” but it doesn’t mean “You can’t show it in the search results”." I'm a bit confused about how to use these, both in preventing duplicate content in the first place and then in helping to address dupe content once it's already in the index. Thanks! B

    | bjs2010
    0

  • Hi, I just downloaded a Crawl Summary Report for a client's website. I am seeing THOUSANDS of duplicate page content errors. The overwhelming majority of them look something like this: ERROR: http://www.earlyinterventionsupport.com/resources/parentingtips/development/parentingtips/development/development/development/development/development/development/parentingtips/specialneeds/default.aspx This page doesn't exist and results in a 404 page. Why are these pages showing up? How do I get rid of them? Are they endangering the health of my site as a whole? Thank you, Jenna

    | JennaCMag
    0

  • Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (no sudden drop on any Panda/Penguin date lines). After investigating, it appeared we had neglected to noindex,follow all our filter pages, and our total indexed pages rose sevenfold in a matter of weeks. We have since fixed the noindex issue, and the pages indexed are now below what we had pre-switch to Magento. We've seen some positive results in the last week. Any ideas when/if our rankings will return? Thanks!

    | Jonnygeeuk
    0

  • I know I've asked a similar question in the past, but I'm still trying to figure out what to do with my website. I've got a website at thewebhostinghero.com that's been penalized by both Panda and Penguin. I cleaned up the link profile and submitted a reconsideration request, but it was denied. I finally found a handful of additional bad links and I submitted a new disavow + reconsideration request a few days ago, and I am still waiting. That said, after submitting the initial disavow request, the traffic completely disappeared, and while I expected a drop in traffic, I also expected my penalty to be lifted, which was not the case. Even though the penalty might be lifted this time, I think that making the website profitable again could be harder than creating a new website. So here's my questioning: the website's domain is thewebhostinghero.com, but I also happen to own webhostinghero.com, which I bought later for $5000 (yes, you read that right). The domain webhostinghero.com is completely clean, as it's only redirecting to thewebhostinghero.com. I would like to use webhostinghero.com as a completely new website and not redirect any traffic from thewebhostinghero.com, so as not to pass any bad link juice.
    Pros:
    - Keeping the same branding image (which cost me $$$)
    - Keeping the 17,000+ Facebook followers
    - Keeping the same Google+ and Twitter accounts
    - Keeping and monetizing a domain that cost me $5000
    - webhostinghero.com is a better domain than thewebhostinghero.com
    Cons:
    - Will create confusion between the 2 websites
    - Any danger of being flagged as duplicate or something?
    Do you see any other potential issues with this? What's your opinion/advice? P.S. Sorry for my English...

    | sbrault74
    0

  • I have a primary domain, toptable.co.uk, and a disaster recovery site for this primary domain named uk-www.gtm.opentable.com. In the event of a disaster, toptable.co.uk would get CNAMEd (DNS alias) to the .gtm site. Naturally, the .gtm disaster recovery domain is an exact match of the toptable.co.uk domain. Unfortunately, Google has crawled the uk-www.gtm.opentable site and it's showing up in search results. In most cases the gtm URLs don't get redirected to toptable; they actually appear as an entirely separate domain to the user. The strong feeling is that this duplicate content is hurting toptable.co.uk, especially as .gtm.ot is part of the .opentable.com domain, which has significant authority. So we need a way of stopping Google from crawling gtm. There seem to be two potential fixes; which is best for this case? 1) use robots.txt to block Google from crawling the .gtm site, or 2) canonicalize the gtm URLs to toptable.co.uk. In general Google seems to recommend the canonical approach, but in this special case it seems the robots.txt change could be best. Thanks in advance to the SEOmoz community!

    | OpenTable
    0

  • On my ecommerce site, we have .html extensions on all files and categories. I was wondering if it is worth the development cost to change them all to end in a trailing slash instead? Is there any SEO benefit in doing so? Thanks, B
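    For reference, the usual pattern when dropping extensions is a 301 from the .html version plus an internal rewrite so the existing files still serve. A rough mod_rewrite sketch (assumes Apache, an .htaccess in the document root, and that every /page.html should live at /page - test carefully before deploying):
        # .htaccess - 301 /page.html to /page
        RewriteEngine On
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.+)\.html\ HTTP [NC]
        RewriteRule ^ /%1 [R=301,L]
        # Internally serve the .html file for the extensionless URL
        RewriteCond %{REQUEST_FILENAME}.html -f
        RewriteRule ^(.*)$ $1.html [L]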

    | bjs2010
    0

  • We're experiencing an issue where we have keywords directing traffic to incorrect child landing pages. For a generic example using fake product types, a keyword search for XL Widgets might send traffic to a child landing page for Commercial Widgets instead. In some cases, the keyword phrase might point to a child landing page for a completely different type of product (ex: a search for XL Widgets might direct traffic to XL Gadgets instead). It's tough to figure out exactly why this might be happening, since each page is clearly optimized for its respective keyword phrase (an XL Widgets page, a Commercial Widgets page, an XL Gadgets page, etc), yet one page ends up ranking for another page's keyword, while the desired page is pushed out of the SERPs. We're also running into an issue where one keyword phrase points traffic to three different child landing pages, where either none of the ranking pages is the page we've optimized for that keyword phrase, or the desired page appears lower in the SERPs than the other two pages (ex: a search for XL Widgets shows XL Gadgets on the first SERP, Commercial Widgets on the second SERP, and then finally XL Widgets down on the third or fourth SERP). We suspect this may be happening because we have too many child landing pages targeting keyword terms that are too similar, which might be confusing the search engines. Can anyone offer some insight into why this may be happening, and what we could potentially do to help get the right pages ranking how we'd like?

    | ShawnHerrick
    0

  • We have partners that want us to build and manage a co-branded white label for them. We will have unique content on the white label; however, the white label will be located on our server. I was planning to put it on a subdomain and mask the URL, but was told that Google will see through that and not give any credit to the white label. Our partners all have high PR and we are a new company with low PR. We want the white labels to get the credit from the partner websites. Should we do it through URL masking or by changing the A record on the other website to point to our server?

    | TravelerVIP
    0

  • I'm after a fresh set of eyes and any suggestions on what I should do next to help increase rankings for my site. The site is: http://bit.ly/VR6xIm Currently the site is ranking around 9th-11th on google.co.uk for its main term, which is the name of the site. The site is around a year old; when it launched it initially went up towards positions 3-5 but has since settled around where it is now. I have a free tool webmasters can use to implement our speed test on their sites, which also includes a link back to our site to recognise that we are providing the tool for free; I periodically change the link anchor text so it is not always the same anchor text on every site. Is there anything obvious I should be doing, or that is missing, that would help with my rankings? *Just as a note, I am not after a review of the actual speed test on the site; a new one will be developed to help further increase accuracy.

    | Wardy
    0

  • My website sends customers from an http://www.mysite.com/features page to an https://www.mysite.com/register page, which is an account sign-up form, using a 302 redirect. Any page that collects customer data has an authenticated SSL certificate to protect any data on the site. Is this 302 the most appropriate way of doing this, given that the weekly crawl picks it up as bad practice? Is there a better alternative?

    | Ubique
    0

  • Hello, I have an e-commerce website with many categories, and some products are associated with several categories (I cannot do otherwise). The URLs of each product are not duplicated because I have: http://www.site.com/product-name However, my breadcrumb varies depending on the path. For example: if I go through section A and sub-section Aa, my breadcrumb will be:
    Home > Section A > subheading Aa > product 1. If I go through section B and sub-section Ca, my breadcrumb will be:
    Home > Section B > subheading Ca > product 1. My question: with only the breadcrumb differing on my product pages, is there duplication? My opinion: no, because the URL of the page is unique. Thank you for your feedback. Sorry for the English, I'm French 😉 D.

    | android_lyon
    0

  • Hi, We have a Magento website using layered navigation - it has created a lot of duplicate content, and I did ask Google in GWT (URL Parameters) to set "No URLs" for most of the querystring parameters, except "p", which is for pagination. After reading up on how to tackle this issue, I tried a combination of meta noindex, robots.txt and canonical tags, but it was still a snowball I was trying to control. In the end, I opted for using Ajax for the layered navigation - no matter what option is selected, no parameters are latched onto the URL, so no dupe/near-dupe URLs are created. So please correct me if I am wrong, but no new links flow to those extra URLs now, so presumably in due course Google will remove them from the index? Am I correct in thinking that? Plus these extra URLs have meta noindex on them too - I still have tens of thousands of pages indexed in Google. How long will it take for Google to remove them from the index? Will having meta noindex on the pages that need to be removed help? Is there any other way of removing thousands of URLs via GWT? Thanks again, B

    | bjs2010
    0

  • Hi, In my sitemap, I have the preferred entrance pages and URLs of categories and subcategories. But I would like to know more about how Googlebot and other spiders see a site - e.g., what is classed as a deep link? I am using the Screaming Frog SEO Spider, and it has a metric called "level" - this represents how deep, or how many clicks away, the content is. But I don't know if that is how Googlebot would see it - from what the Screaming Frog SEO Spider software says, each move horizontally across from the navigation is another level, which visually doesn't make sense to me. Also, in my sitemap, I list the URLs of all the products; there are no levels within the sitemap. Should I be concerned about this? Thanks, B

    | bjs2010
    0

  • Hi everyone. This question has been nagging at my mind today ever since I had a colleague say, "No one ever searches for the term 'presonus 16.4.2'." My argument is "Yes they do." My argument is based on the fact that when you type in "presonus 16", Google's auto-suggest lists several options, of which "presonus 16.4.2" is one. That being said, does Google's Keyword Tool base traffic estimates ONLY on actual keywords typed in by the user, in this case "presonus 16", or does it also compile data for searchers who opt for the suggested term "presonus 16.4.2"? To clarify, does anyone have any insight as to whether Google is compiling data on strictly the term typed in by a user, or giving precedence to a term selected by a user from the auto-suggest list, or are they being counted twice? Very curious to know everyone's take on this! Thanks!

    | danatanseo
    0

  • Looking for some advanced help here. I've been reading a lot of conflicting information on this, and I am hoping someone can clear it up. My question is regarding the length and complexity of title tags. For example, my top-level keywords are: IT Support, IT Services, IT Outsourcing, Help Desk, etc. I also have pages for many modified versions, ex: IT Support Services, Managed IT Services, etc. I have robust pages for each. Should my title tag be:
    - IT Support | CSM Corp. (simple)
    - IT Support Company | CSM Corp. (picks up a longer tail)
    - or IT Support | Secondary Keyword | CSM Corp.
    Does adding secondary keywords dilute the strength of the primary keyword? If long is preferable, can someone give me an example using "IT Support"?

    | CsmBill
    0

  • While reading the Beginner's Guide, I noticed that to improve my SEO I need to have access to the actual website (i.e., to edit HTML text and meta tags). I, however, used a third-party creative team to build my site, so I have no admin access. Are there any step-by-step instructions for things I can do to improve SEO if I don't have portal access to my website? Please let me know. Thanks..

    | SmartEnergy.com
    0

  • I recently posted a question regarding a product page that appeared to have no content. [http://www.seomoz.org/q/why-is-ose-showing-now-data-for-this-url] What puzzles me is that this page got indexed anyway. Was it indexed based on Google knowing that there was once content on the page? Was it indexed based on the trust level of our root domain? What are your thoughts? I'm asking not only because I don't know the answer, but because I know the argument will be made that if Google indexed the page then it must have been crawlable... therefore we didn't really have a crawlability problem. Why would Google index a page it can't crawl?

    | danatanseo
    0

  • We have a site with a generic top-level domain and we'd like to use a small portion of the homepage to tailor content based on the IP of the visiting user. The content is for product dealerships around different regions/states of the US, not internationally. The idea being that someone from Seattle would see dealerships for this product near their location in Seattle. The section on the homepage is relatively small and would churn out 5 links and images according to location. The rest of the homepage would be the same for everyone, which includes links to news and reviews and fuller content. We have landing pages for regional/state content deeper in the site that don't use IP to deliver content and also have unique URLs for the different regions/states. An example is a "Washington State Dealerships" landing page with links to all the dealerships there. We're wondering what kind of SEO impact there would be from having a section of the homepage deliver different content based on IP, and if there's anything we should do about it (or if we should be doing it at all!). Thank you.

    | seoninjaz
    0

  • Hey, I have a bit of a problem/issue that is freaking me out a bit. I hope you can help me. If I do a site:www.somesitename.com search in Google, I see that Google is indexing my attachment pages. I want to redirect attachment URLs to the parent post and stop Google from indexing them. I have used different redirect plugins in the hope that I could fix it myself, but the plugins don't work. I get an error: "too many redirects occurred trying to open www.somesitename.com/?attachment_id=1982". Do I need to change something in my attachment.php file? Any idea what is causing this problem? The template currently looks like this:
        <?php get_header(); ?>
        <?php
        /* Run the loop to output the attachment.
         * If you want to overload this in a child theme then include a file
         * called loop-attachment.php and that will be used instead.
         */
        get_template_part( 'loop', 'attachment' );
        ?>
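    For reference, one way to avoid fighting plugins is a small snippet in the theme's functions.php that 301s attachment pages to their parent before any template loads. A hedged sketch using core WordPress functions (the function name is made up; it falls back to the home page for orphaned attachments):
        <?php
        // 301 attachment pages to their parent post (or home if orphaned)
        function myprefix_redirect_attachment_pages() {
            if ( is_attachment() ) {
                global $post;
                $parent = ( $post && $post->post_parent ) ? get_permalink( $post->post_parent ) : home_url( '/' );
                wp_redirect( $parent, 301 );
                exit;
            }
        }
        add_action( 'template_redirect', 'myprefix_redirect_attachment_pages' );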

    | TauriU
    0

  • Hi, I've noticed that Google is not recognizing/crawling the latest changes on pages of my site - the last update when viewing the cached version in Google results is over 2 months ago. So, do I Fetch as Googlebot to force an update? Or do I remove the page's cached version via GWT's Remove URLs? Thanks, B

    | bjs2010
    0

  • Hi, We have an e-commerce store in English and Spanish - same products. URLs differ like this: ENGLISH:
    www.mydomain.com/en/manufacturer-sku-productnameinenglish.html SPANISH:
    www.mydomain.com/es/manufacturer-sku-productnameinspanish.html All content on the pages is translated - e.g., H1s, titles, keywords, descriptions - and the site content itself is in the language displayed. Is there a risk of similar or near-dupe content here in the eyes of the big G? Would it be worth implementing the different languages on subdomains or completely different domains? Thank you, B
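    For reference, properly translated pages on /en/ and /es/ are generally not treated as duplicate content, and the relationship can be made explicit with hreflang annotations rather than moving languages to separate domains. A minimal sketch for the <head> of both language versions of a product page (the URLs are the patterns from the question):
        <link rel="alternate" hreflang="en" href="http://www.mydomain.com/en/manufacturer-sku-productnameinenglish.html" />
        <link rel="alternate" hreflang="es" href="http://www.mydomain.com/es/manufacturer-sku-productnameinspanish.html" />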

    | bjs2010
    0

  • We are debating moving a strong category page (and subcategory, product pages) from our current older domain to a new domain vs just moving the whole domain. The older domain has DA 40+, and the category page has PA 40+. Anyone with experience on how much PR etc will get passed to a virgin domain if we just redirect olddomain/strongcategorypage/ to newdomain.com? If the answer is little to none, we might consider just moving the whole site since the other categories are not that strong anyway. We will use 301 approach either way. Thanks!

    | Durand
    0

  • My Home Page has about 500 words of content, but when I crawl the text it brings up about 1400 total words when counting all the anchor text links (I believe all are in the navigation or images).  All of the link are internal and relevant (it's a huge site), but I am worried that they are diluting the copy.  Is that likely the case?  What's a good ratio?  Thoughts?

    | NathanArizona
    0

  • Hi! A client has just had 14k 404s pop up in his WMT. I think this is because a page that they redirected to has since moved. My question is: can I clean these up by redirecting the page the original redirect was pointing to? If so, will it have any negative impact?

    | neooptic
    0

  • Hi Mozzers, I'm sitting here going through our site and optimizing all of our content.
    For the most part we've just written without proper keyword research, so the content lacks focus. Here is a page I would consider finished - http://www.consumerbase.com/international-mailing-lists.html I have our keywords in the:
    - URL
    - Title tag
    - Meta description
    - Bolded in the content
    - Image alt attribute
    If I optimize my other pages like this, will I be good?
    It feels a tiny bit stuffed to me, but SEOmoz's on-page tool gives me glowing numbers. Thanks!

    | Travis-W
    0

  • Hi Mozzers, I saw a considerable amount of duplicate content and duplicate page titles on our client's website. We are just implementing a fix in the CMS to make sure these are all fixed. What changes do you think I could see in terms of rankings?

    | KarlBantleman
    0


