
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi there, I've been working on a pretty dated site. The product pages have tabs that separate the product information, e.g., a tab for specifications, a tab for system essentials, and an overview tab that is just a copy of the product page. Each tab is actually a link to a completely separate page, so product/main-page is split into product/main-page/specs, product/main-page/resources, etc. Wondering if canonicals would be appropriate in this situation? The information isn't necessarily duplicate (except for the overview tabs), but with each tab as a separate page I would imagine that's diluting the value of the main page? The information all belongs to the main page, so shouldn't each tab be saying "I'm a version of the main page"?

    | anneoaks
    0
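
    For reference, this is roughly what that suggestion looks like in markup. A minimal sketch, assuming the main product page is the preferred URL (the domain is a placeholder):

        <!-- In the <head> of product/main-page/specs, product/main-page/resources, etc. -->
        <link rel="canonical" href="http://www.example.com/product/main-page" />

    One caveat: search engines treat rel=canonical as a hint and tend to ignore it when pages are not near-duplicates, so consolidating the tab content onto the main URL (or 301ing the tab URLs) is the more reliable option if the tabs add little standalone value.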

  • Hi folks, I have a bit of a dilemma that I'd appreciate some advice on. We'll just use the solid wood flooring section of our website as an example in this case. We use the rel=canonical tag on the solid wood flooring listings pages where the listings get sorted alphabetically, by price, etc.
    e.g. http://www.kensyard.co.uk/products/category/solid-wood-flooring/?orderBy=highestprice uses the canonical tag to point to http://www.kensyard.co.uk/products/category/solid-wood-flooring/ as the main page. However, we also use filters on our site which allow users to filter their search by more specific product features, e.g.
    http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/18mm/
    http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/natural-lacquered/ We don't use the canonical tag on these pages because they are great long-tail keyword-targeted pages, so I want them to rank for phrases like "18mm solid wood flooring". But in not using the canonical tag, I'm finding Google is getting confused and ranking the wrong page, as the filters mean there is a huge number of possible URLs for a given list of products. For example, Google ranks this page for the phrase "18mm solid wood flooring": http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/18mm,116mm/ This is no good. This is a combination of two filters, so the listings are very refined; if someone types the above phrase into Google and lands on this page, their first reaction will be "there are not many products here". Google should be ranking the page with only the 18mm filter applied: http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/18mm How would you recommend I go about rectifying this situation?
    Thanks, Luke

    | LukeyB30
    0
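
    One commonly used pattern for this situation, sketched with the URLs from the question: keep a self-referencing canonical on each single-filter page you want to rank, and point sort orders and deep filter combinations at the version you prefer (or keep them out of the index entirely). Treat this as one possible approach rather than the only one:

        <!-- On the single-filter page (self-referencing canonical): -->
        <link rel="canonical" href="http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/18mm/" />

        <!-- On a multi-filter combination such as /f/18mm,116mm/, either canonicalise to the single-filter URL: -->
        <link rel="canonical" href="http://www.kensyard.co.uk/products/category/solid-wood-flooring/f/18mm/" />

        <!-- ...or keep the combination out of the index while still letting its links be followed: -->
        <meta name="robots" content="noindex, follow" />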

  • Hi people, I have been working on this new website for a month now and it has still not been indexed; here is a link: http://bit.ly/HNgzKG Can any of you spot anything wrong with it? I have tried submitting it and have also submitted an XML sitemap, but still no joy.

    | Eavesy
    0
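
    Two quick checks worth running before anything else (the domain below is a placeholder, since the site is behind a shortened link): confirm the server returns a 200 without a noindex header, and confirm robots.txt isn't blocking the whole site.

        # Look at the status code and any X-Robots-Tag header on the homepage:
        curl -I http://www.example-site.com/

        # Make sure robots.txt doesn't contain a blanket "Disallow: /" for all user agents:
        curl http://www.example-site.com/robots.txt

    Also check the page source for a <meta name="robots" content="noindex"> tag left over from development.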

  • Hi All, Please can you let me know the name of this feature, and/or point me to an article / blog / directory on how best to achieve additional links under a search engine listing (I don't mean sitelinks)? e.g. I do a search for 'home insurance' on Google.co.uk and under the listing for Compare the Market it has: home insurance, building insurance and landlords insurance. Thanks for your help!

    | Joseph-Vodafone
    0

  • I have a client with a large HTML sitemap for his web shop. I am wondering, though, if it would be better to have an XML sitemap for Google. Is there any advantage to either type of sitemap?

    | auke1810
    0
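
    For anyone weighing the two: an HTML sitemap is a navigation page for users, while an XML sitemap is a machine-readable file you submit in Webmaster Tools so crawlers can discover URLs (and lastmod dates). They complement each other rather than compete. A minimal XML sitemap looks like this (URLs are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example-shop.com/some-product/</loc>
            <lastmod>2013-11-01</lastmod>
          </url>
          <url>
            <loc>http://www.example-shop.com/another-product/</loc>
          </url>
        </urlset>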

  • Hi, I have a page with a lot of links going to it.  I want to change the name of the page (thereby changing the URL).  I can do a 301 redirect, but does a 301 send the "link juice" to the new page? The page in question is www.aerlawgroup.com/dui.html, and I want to change it to www.aerlawgroup.com/dui-lawyer.html. Thank you in advance for your time.

    | mrodriguez1440
    0
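
    A 301 is the standard way to pass that equity along (a little loss is always possible). On an Apache server, a sketch of the redirect using the URLs from the question would be:

        # In the site's .htaccess, assuming Apache with mod_alias enabled:
        Redirect 301 /dui.html http://www.aerlawgroup.com/dui-lawyer.html

    Remember to update internal links to point straight at the new URL rather than through the redirect.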

  • I've read that the hreflang tag is all the rage for international solutions on a per-page basis. I haven't read much about what international agencies are using for non-Google search engines such as Bing. Are the common language meta tags the only solution? Would love to see an article that addresses this.

    | MikeSEOTruven
    0
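
    For what it's worth, a sketch of the two mechanisms side by side (URLs and locales are placeholders). Google and Yandex read hreflang annotations, while Bing has documented support for the content-language meta tag instead:

        <!-- hreflang annotations in the <head>: -->
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />

        <!-- Language/region hint that Bing documents: -->
        <meta http-equiv="content-language" content="en-us" />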

  • Hi! Bit of an odd one, but I thought I'd ask. Recently I wrote an article for Smashing Mag. It was a great success and not really an SEO exercise at all, but after several weeks my author page hasn't been indexed (http://www.smashingmagazine.com/author/sam-wright/?rel=author). I just assumed, given the quality of the site, that it wouldn't take that long. I know it's just a case of leaving it, but any thoughts on why it's not been picked up?

    | Blink-SEO
    0

  • So our question is: should we handle page redirection/rewriting in PHP or in .htaccess (with a specific problem we are running into outlined below)? We have an ecommerce store in a subfolder of our site (example.com/store/). In the next folder down we have a group of widgets (www.example.com/store/widget-group1). Recently we put a .htaccess redirect in the top-level folder (example.com/store/.htaccess), in order to rewrite some URLs and also 301 a page to another page. This seems to be negatively affecting our /widget-group1/ subfolder, however: organic traffic to example.com/store/widget-group1 took a nose dive 3 days after putting the .htaccess redirect in place on the /store/ folder, and it has not recovered 8 days later. *Nothing appears outwardly wrong with the current setup to the eye when viewing the pages or requesting as Googlebot (the only issue being the nose dive in organic traffic lol). *Both subfolders are set up in the Apache config file to allow local overrides of .htaccess as follows:
    <Directory store/widget-group1>
    Options -Indexes FollowSymLinks -MultiViews
    AllowOverride All
    Order allow,deny
    allow from all
    </Directory>
    <Directory store>
    Options -Indexes FollowSymLinks -MultiViews
    AllowOverride All
    Order allow,deny
    allow from all
    </Directory>

    | altecdesign
    0
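
    Either PHP or .htaccess can issue a clean 301; the more likely culprit is a rewrite pattern in /store/.htaccess that is broader than intended, since per-directory rules cascade down into /store/widget-group1/. A hedged sketch of a tightly scoped rule (the page names are hypothetical placeholders):

        RewriteEngine On
        RewriteBase /store/

        # Anchor the pattern and match only the exact old path, so nothing under
        # /store/widget-group1/ is touched:
        RewriteRule ^old-page/?$ /store/new-page/ [R=301,L]

    It is also worth crawling /store/widget-group1/ (or running curl -I on a few URLs) to confirm those pages still return 200 rather than an unexpected redirect.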

  • Hey all, Came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with! We're currently in the process of a website rebuild, for which I'm really excited. The new site is using Markdown to create an entirely static site. Load times are fantastic, and the code is clean. Life is good, apart from the 302s. One of the weird quirks I've realized with old-school, non-server-generated page content is that every page of the site is an index.html file in a directory. The result is that www.website.com/page-title will 302 to www.website.com/page-title/. My solution off the bat has been to just be super diligent and try to stay on top of the link profile and send lots of helpful emails to the staff reminding them about how to build links, but I know that even the best-laid plans often fail. Has anyone had a similar challenge with a static site and found a way to overcome it?

    | danny.wood
    1
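
    The fix usually lives at the web server rather than in the content. On Apache, mod_dir's trailing-slash redirect is normally already a 301, so a 302 suggests a host- or server-specific setting; one hedged sketch that forces a permanent redirect for directory URLs missing the slash:

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteCond %{REQUEST_URI} !/$
        RewriteRule ^ %{REQUEST_URI}/ [R=301,L]

    On Nginx the equivalent idea is a "rewrite ^([^.]*[^/])$ $1/ permanent;" style rule. Either way, linking internally with the trailing slash avoids the redirect entirely.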

  • I feel my site is not too bad. It needs more work and could be better, but what site couldn't? However, I'm bamboozled by some of the sites that are outranking me on key terms, e.g. "Horse Rugs": http://www.fasttackdirect.co.uk/products-1-122/Horse_Clothing/HORSE_RUG_SALE.html is ranking 8th, while http://www.centralsaddlery.co.uk/horse/horse-rugs/ is not ranking in the first 10 pages. I don't expect my site (the latter) to rank well, as it's not up there with the bigger players in our industry yet; however, I can't see why a page like the one I mentioned is ranking so well when, as far as I can see, it's not as well optimized and has very little content. This is not me ranting about it or whining about why I'm not top etc. I just can't work it out and would love somebody to explain the reasons for this. The only thing I can think of is that they have more categories with the words "Horse Rugs" in them. Other than that I'm stumped! Ideas on a postcard please!

    | mark_baird
    0

  • We had a customer that had 2 sites. They left us, and 301'd site A to site B. Things didn't go well. Now, a year later they want to use us again. Ideally, I would undo the 301.   Has anyone done this? Would I be better off starting with a new domain? If you've done it, how long before it started to rank like you expected/hoped?

    | TimColeman
    0

  • Hi guys, in order to get rid of our very old-school captcha on our contact form at troteclaser.com, we would like to use a honeypot captcha. The idea is to add a field that is hidden from human visitors but likely to be filled in by spam bots. In this way we can sort out all those spam contact requests.
    More details on "honeypot captchas":
    http://haacked.com/archive/2007/09/11/honeypot-captcha.aspx Any idea if this single cloaked field will have negative SEO impacts? Or is there another alternative to keep out those spam bots? Greets from Austria,
    Thomas

    | Troteclaser
    0
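
    Hidden honeypot form fields are generally not treated as cloaking, since they are not content aimed at ranking; the risk is low compared with hiding keyword text. A minimal sketch (field and class names are made up for illustration):

        <form action="/contact" method="post">
          <!-- real fields here -->

          <!-- Honeypot: humans never see this; many bots fill every input they find. -->
          <div class="extra-field" aria-hidden="true">
            <label for="website">Leave this field empty</label>
            <input type="text" id="website" name="website" tabindex="-1" autocomplete="off">
          </div>
        </form>

        /* In the stylesheet; positioning off-screen is a common alternative to display:none: */
        .extra-field { position: absolute; left: -9999px; }

    On the server, silently discard any submission where the "website" field is non-empty.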

  • I have some old images that are no longer used, but they have a few decent external links pointing to them. Can I 301 them to the page they used to be on? And if yes, will their link juice flow to the page?

    | GregB123
    0
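
    Image URLs can be 301ed like any other URL, and the redirect is generally expected to pass most of the equity from those image links. On Apache, a hedged sketch (paths are hypothetical):

        Redirect 301 /images/old-product-photo.jpg /products/old-product/

    Redirecting an image URL to an HTML page is fine when the image is gone; if an equivalent image still exists, redirecting to the new image file is the more natural match.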

  • Hey Peeps, I've been struggling lately with a new static site, and I'm looking for the opinion of anyone who's had to optimize a site using Nginx. I understand that Nginx is recommended for static sites; however, I want to avoid being in a situation where I can't do things like write redirect rules the way I want to. Considering that it will be hosting a static site, are there any features or functions that Nginx lacks when compared to Apache, such as the ability to write rewrite rules, etc.?

    | danny.wood
    1
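
    Nginx has no .htaccess equivalent (rules live in the server config and need a reload), but its rewrite and return directives cover the same ground as Apache's mod_rewrite for a static site. A hedged sketch of both styles (names and paths are placeholders):

        server {
            listen 80;
            server_name www.example.com;
            root /var/www/example.com;

            # Exact one-to-one 301:
            location = /old-page/ {
                return 301 /new-page/;
            }

            # Pattern-based rewrite, roughly equivalent to an Apache RewriteRule:
            rewrite ^/blog/(\d+)/(.+)$ /articles/$2 permanent;
        }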

  • Hello again, I have received an update request where they want me to remove the images from this site (as of now it's a bunch of thumbnails; current page design: http://1stimpressions.com/portfolio/car-wraps/) and turn it into a new design which utilizes a slider (such as this: http://1stimpressions.com/portfolio/). They don't want the thumbnails on the page anymore. My question is: since my site has an image sitemap that has been indexed, will removing all the images hurt my SEO greatly? If so, what would the recommended steps be to reduce any SEO damage? Thank you again for your help, always great and very helpful feedback! 🙂 cheers!

    | allstatetransmission
    0

  • Hi, is it possible to make 301 redirections from a Blogger blog to a new domain name? http://name.blogspot.fr > http://www.domain.com Thanks for your answers. D.

    | android_lyon
    0
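
    Blogger does not, as far as I know, let you configure server-side 301s to an external domain. The closest built-in option is setting a custom domain in the blog's settings, which makes the blogspot address redirect to it; otherwise the common workaround is a meta refresh plus canonical added to the template, which is not a true 301 but points both users and crawlers at the new home. A hedged sketch of that workaround:

        <!-- In the template <head>: -->
        <meta http-equiv="refresh" content="0; url=http://www.domain.com/" />
        <link rel="canonical" href="http://www.domain.com/" />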

  • Hi, I am a local retailer with a physical store in a major US City. My website is very well ranked on major keywords related to my business from an organic result perspective (between 1st and 3rd spot). However, in terms of local results (google place), my website isn't even ranked (except for one specific long tail keyword). Anybody know why? Thank you so much for your help. This is driving me crazy:-)

    | larose37
    0

  • Will blocking the Wayback Machine (archive.org) by adding the code they give have any impact on Google crawl and indexing/SEO? Anyone know? Thanks! ~Brett

    | BBuck
    0
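
    The directive archive.org has historically honored targets only their own crawler, so Googlebot and other search engine bots are unaffected as long as the rest of robots.txt stays the same:

        User-agent: ia_archiver
        Disallow: /

    Existing snapshots may also be hidden once the crawler re-reads the file; that behaviour is archive.org's policy rather than anything Google acts on.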

  • How do I tell if a link in a Flash document is follow or nofollow? Or doesn't it matter? (I just found out that my company placed an advertorial in a Flash publication and I want to make sure it doesn't wind up as a paid, followed link.) Thank you!

    | Linda-Vassily
    0

  • We have a webpage that changes content each evening at midnight -- let's call this page URL /foo. This allows a user to bookmark URL /foo and obtain new content each day. In our case, the content on URL /foo for a given day is the same content that exists on another URL on our website. Let's say the content for November 5th is URL /nov05, November 6th is /nov06 and so on. This means on November 5th, there are two pages on the website that have almost identical content -- namely /foo and /nov05. This is likely a duplication-of-content violation in the view of some search engines. Is the canonical URL tag designed to be used in this situation? The page /nov05 is the permanent page containing the content for the day on the website. This means page /nov05 should have a canonical tag that points to itself, and /foo should have a canonical tag that points to /nov05. Correct? Now here is my problem. The page at URL /foo has the fourth-highest page authority on our 2,000+ page website. URL /foo is a key part of the marketing strategy for the website. It has the second-largest number of external links, second only to our home page. I must tell you that I'm concerned about using a canonical URL tag that points away from the URL /foo to a permanent page on the website like /nov05. I can think of a lot of negative things that could happen to the rankings of the page by making a change like this, and I am not sure what we would gain. Right now /foo has a canonical URL tag that points to itself. Does anyone believe we should change this? If so, to what and why? Thanks for helping me think this through! Greg

    | GregSims
    0

  • Hi guys! How long does it normally take for Google to index the images within the sitemap? I recently submitted a new, up-to-date sitemap and most of the pages have been indexed already, but no images have. Any reason for that? Cheers

    | PremioOscar
    0
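
    There's no fixed timescale, and images are often indexed more slowly than pages. One thing worth checking is that the sitemap actually carries the image extension markup rather than just page URLs; a minimal sketch (URLs are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
          <url>
            <loc>http://www.example.com/some-page/</loc>
            <image:image>
              <image:loc>http://www.example.com/images/photo.jpg</image:loc>
              <image:title>Example photo</image:title>
            </image:image>
          </url>
        </urlset>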

  • Hello! My website is www.enchantingquotes.com. I also own the domain www.enchantingwallquotes.com, which forwards to my site. About 90% of my business comes from the keyword "wall quotes". Should I consider switching to the enchantingwallquotes.com domain and redirecting? And if I do, do I need to recreate the entire website, or is there an easier way that I am overlooking? Thank you for any advice/insight!

    | eqgirl
    0

  • Hi, my client has recently relaunched their website and they use a lot of 302 redirects because they want Google to crawl the pages. They do not plan to add any content to those pages. I advised 301s instead, but they do not want to do this. Can too many 302s harm their rankings?

    | GardenPet
    0

  • If I have a <noscript> tag on every page of my website with the same sentence over and over, saying something to the effect of "Sorry, our site uses Javascript; please enable Javascript for the full site experience", Webmaster Tools will tell me that one of the most common words on my site is "Javascript". Is this something to be concerned about from an SEO perspective? My site is obviously not about Javascript and I don't want to dilute my page's topic or authority by repeating words that are not relevant to the topic of my site. Thanks!

    | IrvCo_Interactive
    0

  • Hi All, In talking with my internal developers, UX, and design team, there has been a big push to move from a "tabbed" page structure (where each tab is its own page) to combining everything into one long page. It looks great from a user-experience standpoint, but I'm concerned that we'll decrease in rankings for the tabbed pages that will be going away, even with a 301 in place. I initially recommended #! or pushState for each "page section" of the long-form content; however, there are technical limitations with this in our CMS. The next idea I had was to still leave those pages out there and to link to them in the source code, but this approach may get shot down as well. Has anyone else had to solve this issue? If so, how did you do it?

    | AllyBank
    1

  • Hi, We're redesigning a website for a client whose SEO rankings are fairly strong and are driving a good volume of enquiries through their website. We would like to do our best to ensure the redesign has a minimal impact on the rankings, but aren't sure if there's anything in particular we should be doing to avert a severe dip in results. The content will be similar on many pages, with the major updates being from 2013 products to the 2014 suite. Layouts and images will, of course, be updated, and we're going to do our best to keep the inbound links going to the correct place - any advice here would be much appreciated, too. We'd be really grateful for any suggestions for how to avoid a ranking disaster, and any major pitfalls we should look out for. Many thanks.

    | Kal-SEO
    0

  • My Google Blogger blog is about 10 months old. In that time I have worked really hard at adding unique content, building relationships with other bloggers in the same niche, and doing some inbound marketing. 2 weeks ago I updated the template to something cleaner, with a little more "WordPress" feel to it. This means I've messed about with the code a lot in these weeks, adding social buttons etc. The problem is that from some point late last week (Thurs/Fri) my pages started disappearing from Google's index. I have checked Webmaster Tools and have no manual actions. My link profile is pretty clean as it's a new site, and I have manually checked every piece of content published for plagiarism etc. So what is going on? Did I break my blog? Or is something else amiss? Impressions are down 96% comparing Nov 1-5th to the previous 5 days. Site is here: http://bit.ly/174beVm Thanks for any help in advance.

    | Silkstream
    0

  • Hi Moz, I have a question concerning Vintykids.com.
    The site comes in four languages: German on vintykids.com/de, Dutch on vintykids.com/nl, English on vintykids.com/en and French on vintykids.com/fr. The German language gives us a problem: in Google.de (German Google) the site is completely indexed in German, but we also see results in Dutch. So when you do a search in Google.de on their brand name (Vintykids) we see results in Dutch on vintykids.com.
    We think we have set the meta tags right in the German language version on vintykids.com/de, and we have also acquired some links to vintykids.com/de from good-quality, relevant German sites. What more can we do to get vintykids.com/de ranking in Google.de on the brand name? Thank you.

    | B.Great
    0

  • My clients pull from a central article directory on our server (a medical directory), as the information is about standard medical issues. That said, Moz Analytics is showing these articles for each client as indexed and duplicate in content, descriptions, titles, etc. Would it be better to use a nofollow for these articles to avoid looking like duplicate content, or should I consider overhauling the resource section into static pages and making each article unique to each client, considering the latest updates in Google? Any help/insight would be greatly appreciated!!!!! Thanks

    | lfrazer
    0

  • My URL is: http://www.fslocal.com Recently, we discovered Google's cached snapshots of our business listings look different from what's displayed to users. The main issue? Our content isn't displayed in cached results (although the content isn't visible on the front end of cached pages, the text can be found when you view the page source of that cached result). These listings are structured so everything is coded and contained within 1 page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). But even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only 1 "page" of content will ever be displayed to the user at a time. This is controlled by JavaScript and using display:none in CSS. Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source? Does it have to do with the way we're using display:none? Are there negative SEO effects with regard to how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)? Google's Technical Guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page), and employ static HTML code instead of complicated JavaScript, would that solve the problem? Any insight would be greatly appreciated. Thanks!

    | fslocal
    0

  • Two of our dev sites (subdomains) were indexed by Google. They have since been made private once we found the problem. Should we take another step to remove the subdomains through robots.txt, or just let it ride out? From what I understand, to remove the subdomain from Google we would verify the subdomain in GWT, then give the subdomain its own robots.txt and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.

    | ntsupply
    0
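
    If the dev subdomains now sit behind authentication, the indexed URLs will drop out on their own over time; to speed it up, the usual combination is verifying the subdomain in GWT and using the URL removal tool, plus one (not both) of the following on the dev host. The hostname is a placeholder:

        # Option A - robots.txt served at http://dev.example.com/robots.txt, blocks all crawling:
        User-agent: *
        Disallow: /

        # Option B - in the dev site's Apache config or .htaccess (needs mod_headers),
        # lets crawlers fetch the pages but tells them not to index:
        Header set X-Robots-Tag "noindex, nofollow"

    Blocking crawling and serving noindex at the same time is counterproductive, since a blocked crawler never sees the noindex.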

  • Hi Folks, Complete newbie here (well, the last 12 months). I have recently added a blog to my site and have been doing quite a bit of keyword research through Google. I have found some good keywords that have up till now escaped me! Here's my question: because I'm trying for local traffic, mainly Newcastle, Durham and Sunderland, do I go with one of the following two options? Either get two very similar keywords in my article and go for both, relying on Google to bring up local listings for the end user in my area, e.g. Small garden design | Garden design from the experts (keywords bold), or Garden Design | Newcastle | Sunderland | Durham, so I have geo locations in the title. Either way I will obviously have both keywords and locations in the article. Help please, I don't want to write for many hours and find I have missed a trick! Many thanks guys n girls!

    | easigrassne
    0

  • Hey folks, Several years ago we created a couple of subdomains (i.e., NEWS.URL.COM) and the posts that we put on these subdomains were very full of keyword anchor text links. Each post sometimes had 4-5. We haven't posted any new content on this subdomain for 3 years. After getting hit with a manual "linking" penalty, we disavowed tons of links, but left the links on the subdomains alone. They really aren't providing any traffic and the content is poor, and not written by me. Do you think having these links is hindering our ranking efforts? I think I should just blow up this subdomain and get rid of it and all the keyword anchor links. Thoughts? Thanks, Ron

    | yatesandcojewelers
    0

  • I'm relatively new to the technical side of SEO and have been trying to brush up my skills by going through Google's online Webmaster Academy, which suggests that you need an If-Modified-Since HTTP header on your site. I checked, and apparently our web server doesn't support this. I've been told by a good colleague that the If-Modified-Since header is no longer relevant, as the spiders will frequently revisit a site as long as you regularly update and refresh the content (which we do). However, our site doesn't seem to have been reindexed for a while, as the cached versions are still showing the pages from over a month ago. So two questions really: is the If-Modified-Since HTTP header still relevant, and should I make sure this is included? And is there anything else I should be doing to make sure the spiders crawl our pages (apart from keeping them nice, fresh and useful)?

    | annieplaskett
    0
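
    Conditional requests are a crawl-efficiency nicety rather than a ranking factor, so the lack of support won't stop reindexing by itself; stale cache dates are more likely a crawl-frequency question. An easy way to see what the server currently does (the URL is a placeholder):

        # A 304 Not Modified response means conditional requests are supported;
        # a full 200 response means the server ignores the header:
        curl -I -H "If-Modified-Since: Mon, 04 Nov 2013 00:00:00 GMT" http://www.example.com/some-page/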

  • Hello All, Looking for input on an issue I am having. We used to have a website, www.gazaro.com. It was a price comparison engine for consumers. A shift in the focus of the business resulted in www.360pi.com, a price intelligence tool for retailers. The two websites have similar themes, so I thought it would be valuable to pass SEO juice from the old domain to the new domain. Back in August, I noticed that Gazaro was redirected to 360pi with a meta refresh. I know a 301 redirect is preferable to a meta refresh, so we switched to a 301 redirect. Since that happened, there has been a spike in 404 errors in Webmaster Tools. If you hover over the URL, it is actually www.360pi.com/deal/amazon etc. It is looking for Gazaro URLs on the 360pi domain, which don't exist. I think this is hurting our homepage ranking. Our homepage no longer ranks for "price intelligence" when it used to be in pos. 4 or 5. As it turns out, we are ranking #1 for "price intelligence", but with our product page. I'm wondering why the 404s are happening. Is something set up incorrectly? Or should I have them switch back to a meta refresh? Thoughts? Thanks for your help.

    | AmandaHorne
    0

  • We have a client in Nashville who opened his first location on Spring St., then later bought out PAC Auto to open a second location on Dickerson St. Lately, we noticed that the Dickerson location wasn't ranking. I found that the previous business owner at PAC Auto had already built up a good web presence and that, sigh, our client was using their old number. Basic NAP violation, OK, got it. But what to do next? I decided to update PAC's citations with The Car People's business name and website. Where I was unable to edit or where listings were already claimed, I just reported PAC Auto as closed. But yesterday I noticed not only was the Dickerson location still not ranking, but the Spring Street location had indeed dropped several places too! (Edit: I'm referring to local search results here, as we don't own the site.) What kind of beast have I stirred?! What kind of signals am I sending to Google that are devaluing the Spring St. location? Will things get worse before they get better? What can I do to make some progress on one without hurting the other? Is it worth trying to get the previous business owner's logins (not likely)? Talk to The Car People about getting a new number (not impossible)? Is it worth trying to get the site in order to build separate landing pages for each location? Thanks in advance!

    | cwtaylor
    0

  • Hi, my client's website resolves both with www and without www. In page/title, both versions of the website show up. The one with www has a page authority of 51 and the one without 45. In the Moz diagnostics I can see that the website shows over 200 duplicate content issues which are not found in, e.g., Webmaster Tools. When I check each page and add/remove the www, the website shows the same content for both www and non-www. It is not a redirect: in the search tab it actually shows www, and then if you use non-www it doesn't show www. Is the www issue to blame, or could it be something else? And what do I do, since both the www URL and the non-www URL have high authority? Just set up a redirect from the lower-authority URL to the higher-authority URL?

    | GardenPet
    0
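
    The usual fix is to pick whichever version has the stronger profile (the www version here, by page authority) and 301 the other to it site-wide. A hedged Apache sketch with a placeholder domain:

        RewriteEngine On
        # Send every non-www request to the www equivalent:
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    Setting the preferred domain in Webmaster Tools and keeping internal links and canonicals consistent with the chosen version completes the job.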

  • Hello all, As I have mentioned in another Q&A, one of our new clients got hit by a manual penalty. I checked their link profile and there was a lot of black hat involved. Long story short, I submitted a reconsideration request, which was not enough, as it seems 99.9% of his links are bad links. We took the decision to move the newly launched web site from www.websitename.com to www.website-name.com, with the latter being an old domain name with good authority and a clean link profile. The problem is that at the moment www.website-name.com is set to 301 redirect to www.websitename.com, and what we want to do now is take the web site off www.websitename.com and launch it (not 301, as we don't want to pass the penalty to the clean domain) on www.website-name.com. What is the best practice for this particular case, and are there any things I should pay attention to? I would appreciate your advice!

    | artdivision
    0

  • Hi everyone, I have a bit of a problem with duplicate content on a newly launched site and am looking for some advice on which pages to canonicalize. Our legacy site had product "information" pages that now 301 to new product information pages. The reason the legacy site had these pages (instead of pages where you can purchase) is that we used our vendor's "cart link", which was an iframe inside the website. So in order to get ranked for these products, we created these pages, which had links to the frame where visitors could buy. The strategy worked, and we got ranked for our products. Now with the new site, we have those same product information pages, but when you click the link to buy, it goes to a page that is now on our actual site, where you can make the purchase; this page contains the same basic information, though it looks very different. So my question: the product "information" pages are the new 301 homes and are the pages with the rank. The purchase pages are new and have no rank, but are essentially duplicate content. Should I put the canonical link element on the purchase pages and tell Google to regard the information pages, since those are ranked? It just seems weird to me to direct Google away from the place where people can purchase; however, the purchase pages aren't nearly as "pretty" as the information pages are, and wouldn't be the greatest landing pages. We have an automotive site, and on the purchase page you have to enter vehicle information. The information page is nicer, and if the visitor is interested, it's just one click to get to that page to buy. What to do here? I am fairly new to Moz, and I couldn't determine whether I am permitted to include an example link from our site of what I am referring to. Is that permitted? Thanks for any help anyone can provide.
    Kristin

    | yogitrout1
    0

  • Hello, Has anyone had luck associating their Google+ business page with their YouTube channel? Our YouTube channel is associated with our Google+ profile (and we would like it to be associated with the Google+ business page). There are numerous articles out there saying Google is working on an update to allow the channel/Google+ business page association, but I am wondering if there is news we might have missed, or if there is a way to get around it. We want to implement video on some site pages and would rather use YouTube code as opposed to customizing a solution. Do most folks think Google will have an easy solution once it arrives? Meaning, if you upload videos to your channel that is currently associated with the profile page, do you think there will be a way to convert everything over to a Google+ business page once they unveil an update? Thank you!

    | SEOSponge
    0

  • Hi Everyone, OK, so here is my question. I have a client who sells gourmet tea and gourmet spices. She has a culinary blog. There is another culinary blog that just posted that its website will be shut down in the near future. It has 100% white hat links. Would it be considered black hat to buy the domain and redirect it to my client's blog, which is also a culinary blog? I would really like to ask Matt Cutts this question. Does anyone know how to send him questions? Thanks, Carla

    | Carla_Dawson
    0

  • Hey all, I've got another HTML5 + multiple H1s question, although I feel like my situation is unique from the other discussions/questions on Moz. On our homepage at http://www.strutta.com, near the bottom of the page, we have a section for testimonials. Here, we have the names of three other organizations that have written testimonials for us. Following proper HTML5 guidelines, we have placed their company names in H1s. Could having the names of other companies in H1s potentially dilute the subject/meaning of our page? Cheers

    | danny.wood
    0
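
    HTML5's outline model does allow an H1 per sectioning element, but there's no requirement to put testimonial sources in headings at all. If the worry is diluting the page's topic, one common alternative is a single heading for the section and plain attribution for the company names, sketched here with made-up class names and copy:

        <section class="testimonials">
          <h2>What our customers say</h2>
          <blockquote>
            <p>"Great platform - our contest ran itself."</p>
          </blockquote>
          <p class="testimonial-source">Acme Corp</p>
        </section>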

  • My client has a live page with 100+ links subdivided into 10 categories that each have great potential keyword targeting opportunities.  I'd like to improve this page and my intuition is to split it into 11 pages, one page with links to all the others and a bit of content about each.  Here's an example of the potential IA: Dog Rescue Groups
       Golden Retriever Rescue - description
       Poodle Rescue - description
       Cocker Spaniel Rescue - description
       Poodle Rescue - description
       Labrador Retriever Rescue - description
    etc. --------- Golden Retriever Rescue
       Link 1 - description
       Link 2 - description
       Link 3 - description Is this a good idea and will I see a big traffic drop overall at first?  Also, these are all internal links, not external.

    | elenarox
    0

  • OK... I will try to explain as clearly as possible. This issue regards close to 5000 'Warnings' from our most recent SEOmoz Pro crawl diagnostic test. The top three warnings have about 6000 instances among them: 1. Duplicate Page Title 2. Duplicate Page Content 3. 302 (Temporary Redirect). We understand that duplicate titles and content are "no-nos" and have made it a top priority to avoid duplication on any level. Here is where the issue lies... we are using the Volusion eCommerce solution and they have a variety of value-add shopping features such as "Email A Friend" and "Email Me When Back In Stock" on each product page. If one of these options is clicked, you are then directed to the appropriate page. Now each page has a different URL, with the sole variable being the product code. But with it being part of Volusion's ingrained functionality, the META title is the same for each page. It takes from the title of our store homepage. Example below: Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons http://www.beautystoponline.com/Email_Me_When_Back_In_Stock.asp?ProductCode=AN1PRO7130 Online Beauty Supply Store | Hair Care Products | Nail Care | Flat Irons http://www.beautystoponline.com/Email_Me_When_Back_In_Stock.asp?ProductCode=BI8BIOSI34 The same goes for the duplicate content warnings. If you click on one of these features, it directs you to a page with pretty much the same content except for a different product. Basically each page has both a duplicate title and duplicate content. The SEOmoz descriptions are: Duplicate Page Content: Content that is identical (or nearly identical) to content on other pages of your site forces your pages to unnecessarily compete with each other for rankings. Duplicate Page Title: You should use unique titles for your different pages to ensure that they describe each page uniquely and don't compete with each other for keyword relevance. Because I know SEO is not an exact science, the question here is: does Google recognize that although they are duplicates, they are actually generated from a feature that makes us even more of a legitimate eCommerce site? Or, going from the SEOmoz description, if duplication is bad only because you do not want your pages to be competing with each other, should I not worry, because I couldn't care less if these pages get traffic? Or does it affect my domain authority as a whole? Then as for a solution: I am still trying to work out with Volusion how we can change the META title of the pages. It's highly unlikely, but we'll see. As for the duplicate content, there is no way to change one of these pages; it's hard-coded. So, if it is bad (even though it shouldn't be), would it be worth it to disable these features? I hope not. Wouldn't that defeat the purpose of Google trying to provide the most legitimate, value-add sites to searchers? As for the 302 (Temporary Redirect) warning... this is only appearing on all of our shopping cart pages. As with the "Email A Friend" feature, there is a page for every product. For example: http://www.beautystoponline.com/ShoppingCart.asp?ProductCode=AN1HOM8040 http://www.beautystoponline.com/ShoppingCart.asp?ProductCode=AN1HOM8050 The description SEOmoz provides is: 302 (Temporary Redirect): Using a 302 redirect will cause search engine crawlers to treat the redirect as temporary and not pass any link juice (ranking power). We highly recommend that you replace 302 redirects with 301 redirects. So the probable solution...
I do have the ability to change to a 301 redirect, but do I want to do this for my shopping cart? Does Google realize the dead end is legitimate? Or does it matter if link juice is passed through my shopping cart? And again, does it impact my site as a whole? It would be greatly appreciated if anyone could help me out with this stuff 🙂 Thank you

    | anthonyjamesent
    1
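
    Since those utility pages can't be edited and have no search value, the low-effort option is usually to keep crawlers out of them entirely rather than fight the duplicate titles. A hedged robots.txt sketch using the paths shown above (confirm the "Email A Friend" URL pattern before adding an equivalent line for it):

        User-agent: *
        Disallow: /Email_Me_When_Back_In_Stock.asp
        Disallow: /ShoppingCart.asp

    The 302s behind the cart are unlikely to be worth changing to 301s; blocking crawling of those URLs sidesteps both warnings without touching checkout behaviour.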

  • We have a possible duplicate content issue based on the fact that we have a number of websites run from the same code base across .com / .co.uk / .nl / .fr / .de and so on. We want to update our sitemaps alongside using the hreflang tags to ensure Google knows we've got different versions of essentially the same page to serve different markets. Google has written an article on tackling this: https://support.google.com/webmasters/answer/75712?hl=en but my question remains whether having a single sitemap accessible from all the international domains is the best approach here, or whether we should have individual sitemaps for each domain.

    | jon_marine
    0
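
    One practical constraint to factor in: a sitemap may only list URLs for the host it is served from (unless the domains are cross-verified in Webmaster Tools), so per-domain sitemaps are generally the simpler route, with each one carrying the full set of hreflang alternates. A sketch of the annotated format with placeholder domains:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:xhtml="http://www.w3.org/1999/xhtml">
          <url>
            <loc>http://www.example.com/widget/</loc>
            <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/widget/" />
            <xhtml:link rel="alternate" hreflang="de" href="http://www.example.de/widget/" />
            <xhtml:link rel="alternate" hreflang="nl" href="http://www.example.nl/widget/" />
          </url>
        </urlset>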

  • Currently I have a site where the targeted keywords were on the home page, with links built to the home page. It has been widely recognised, though, that Google is looking more and more for specific content on webpages that holds greater relevance to search queries. As such, I switched the targeting from the home page to other, newly created webpages, changing meta tags and creating more relevant content for the respective keywords. I thought this would improve rankings; however, upon doing this there was a sharp fall in rankings for those keywords. Is there anything that I could have done wrong, or can do better, so that the keywords move back up the rankings?

    | Gavo
    0

