
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, if my site gets mentioned on another site with the web address written out but not hyperlinked, do I still get some SEO value from this, or is it not giving any SEO benefit? Thanks, Sean

    | MotoringSEO
    1

  • I have a site that I'm working on that sells waste oil heaters, and I'm beginning to run into an issue. As one would assume, our primary keyword phrase is "waste oil heaters", for which we're doing rather well. The issue is that there are two other phrases that are directly synonymous with our primary term that users are actively searching for (i.e. the product can accurately be called three different things). Phrases are listed below with phrase match search volumes: "waste oil heater" - 6600, "waste oil burner" - 2400, "waste oil furnace" - 1900. I'm not one who likes to engage in trying to "trick" anything, so I'm fairly opposed to listing all three of these in the title tag or something similar. This is being done by our competitors, but only one outranks us at this point for the primary phrase. My initial thoughts are that we should be targeting our home page and category page for "waste oil heater(s)", and then lightly pepper our content with the use of these synonyms. Then from there we can focus on other term variations with our blog posts and try to vary up the anchor text coming into the site when we launch link building. What do you guys think? Have you guys been in a situation like this with three phrases describing the same product? I appreciate any feedback or advice. Thanks guys!

    | CaddisInteractive
    0

  • I have clients with WordPress sites and clients with just a WordPress blog on the back of a website. The clients with entire WordPress sites seem to be ranking better. Do you think the URL structure could have anything to do with it? Does having that extra /blog folder decrease any SEO effectiveness? Setting up a few new blogs now...

    | PortlandGuy
    0

  • Last year I 301'd one of my directories on my site, pointing everything to a different directory. Long story short, I am going to sell this product line again and would like to just remove the 301 to that original directory, but I am reading that 301s are also cached in most browsers for a long time. Has anyone successfully done this, and if you did, what was it that you had to do? Thanks, Mike

    | SandyEggo
    0

  • Example of setup:
    www.fancydomain.com
    blog.fancydomain.com Because of certain limitations, I'm told we can't put our blogs at the subdirectory level, so we are hosting our blogs at the subdomain level (blog.fancydomain.com). I've been asked to incorporate the blog's sitemap link on the regular domain, or even in the regular domain's sitemap. 1. Putting a link to blog.fancydomain.com/sitemap_index.xml in the www.fancydomain.com/sitemap.xml -- isn't this against sitemaps.org protocol? 2. Is there even a reason to do this? We do have a link to the blog's home page from the www.fancydomain.com navigation, and the blog is set up with its sitemap and a link to the sitemap in the footer. 3. What about just including a text link "Blog Sitemap" (linking to blog.fancydomain.com/sitemap_index.xml) in the footer of www.fancydomain.com (adjacent to the text link "Sitemap" which already exists for www.fancydomain.com's sitemap)? Just trying to make sense of this, and figure out why or if it should be done. Thanks!

    | EEE3
    0
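On question 1 above: the sitemaps.org protocol ties a sitemap (and a sitemap index) to the host it is served from, so listing blog.fancydomain.com's sitemap inside www.fancydomain.com's index is indeed outside the protocol (cross-host submission is only sanctioned via a `Sitemap:` line in robots.txt). A small stdlib sketch of that same-host check, with illustrative URLs:

```python
# Sketch: flag entries in a sitemap index whose host differs from the host
# the index is served from (the sitemaps.org constraint at issue here).
# The URLs below are illustrative, not real files.
from urllib.parse import urlparse
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def cross_host_entries(index_url, index_xml):
    """Return child sitemap URLs whose host differs from the index's host."""
    index_host = urlparse(index_url).netloc
    root = ET.fromstring(index_xml)
    locs = [el.text.strip() for el in root.iter(f"{{{NS}}}loc")]
    return [loc for loc in locs if urlparse(loc).netloc != index_host]

example_index = f"""<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="{NS}">
  <sitemap><loc>http://www.fancydomain.com/sitemap_products.xml</loc></sitemap>
  <sitemap><loc>http://blog.fancydomain.com/sitemap_index.xml</loc></sitemap>
</sitemapindex>"""

print(cross_host_entries("http://www.fancydomain.com/sitemap.xml", example_index))
# ['http://blog.fancydomain.com/sitemap_index.xml']
```

Note that a subdomain counts as a different host here, which is why the blog entry is flagged.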

  • I'm interested to know if Phantom was just a "pre-Penguin" 2.0 or if it was a completely different update. Thoughts?

    | nicole.healthline
    0

  • Hi Mozzers, I need a little help with Magento redirects. We are using the Ultimo theme [1] and our crawl report has discovered several thousand 302 redirects occurring for links such as product comparison links and add-to-basket links, which are formatted as "REPLACE THIS WITH THE INITIAL LINK" and rewritten to links such as "REPLACE THIS WITH THE REWRITTEN LINK". Is there any way to stop this from happening? Should I change these rewrites to 301 redirects? Or is this perfectly normal? [1] - http://themeforest.net/item/ultimo-fluid-responsive-magento-theme/3231798 Thanks in advance @Anthony_Mac85

    | Tone_Agency
    0

  • I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com) Now my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to: <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' /> The robots tag is set to: <meta name="robots" content="index,follow,noodp" /> I found out that there are scraper sites that are linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?

    | sbrault74
    0
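For reference, the deduplication the canonical tag is signalling can be sketched offline: strip the utm_* tracking parameters and compare what's left. This is only a stdlib illustration of the logic, not a WordPress fix (a plugin or a server-side redirect rule would normally do this):

```python
# Sketch: normalize a URL by dropping utm_* tracking parameters, the kind of
# canonicalization that rel=canonical is meant to signal to search engines.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_tracking(url):
    """Return the URL with all utm_* query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

url = ("http://www.thewebhostinghero.com/articles/"
       "tools-for-creating-wordpress-plugins.html"
       "?utm_source=feedburner&utm_medium=feed")
print(strip_tracking(url))
# http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html
```

If the Feedburner-tagged URL normalizes to the canonical one, a correctly served canonical tag should consolidate the two index entries over time.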

  • Our website www.turbocupones.com dropped drastically in ranking today. I have no idea why this happened, as we have not outsourced any SEO and have stuck to Google's guidelines. Can you please check the site and give me ideas why this happened? Our traffic dropped by 1000%. All our content is unique. I have years of experience in SEO and have never had a problem like this. The rankings dropped drastically for everything. As said, all SEO done is white hat. Thanks!

    | sebastiankoch
    0

  • Hi Moz community, One of my clients has a beast of a website built in ASP.NET (which causes me problems because I don't have much experience with it). It is a job site that aggregates job opportunities from other job sites and provides a job-matching service by email, etc. They used to have great presence on Google naturally for thousands of job searches. Since Penguin and Penguin 2.0 (I think), their traffic has fallen off a cliff. I have been doing some "off-page" experimentation, seeing if we can fix a lot of issues by re-sculpting their backlink profile (seeing as it was after Penguin), but what I have found is that some pages respond to this off-page work and some just do not at all, despite how we approach it, such as disavowing previous links, building fresh new top-quality content links with natural anchor text, etc. Which has led me to the conclusion that the wider issue is on-page and potentially site structure. Unfortunately, as it is ASP.NET, I am not so comfortable diagnosing the issues. I think also some issues will be related to dupe content etc., but I would LOVE to get some input from my learned Moz colleagues. The website is http://www.allthetopbananas.com/ - any tips on how to recover from this dramatic loss of traffic would be massively appreciated. Kind regards

    | websearchseo
    0

  • One of my clients provides some financial analysis tools, which generate automated content on a daily basis for a set of financial derivatives. Basically they try to estimate, through technical means, whether a particular share price is going up or down during the day, as well as their support and resistance levels. These tools are fairly popular with the visitors; however, I'm not sure of the 'quality' of the content from a Google perspective. They keep an archive of these tools which tallies up to nearly 100 thousand pages. What bothers me particularly is that the content in between each of these varies only slightly. Textually there are maybe up to 10-20 different phrases which describe the move for the day; however, the page structure is otherwise similar, except for the values which are thought to be reached on a daily basis. They believe that it could be useful for users to be able to access back-dated information to see what happened in the past. The main issue however is that there are currently no backlinks at all to any of these pages, and I assume Google could deem these to be 'shallow', providing little content which as time passes becomes irrelevant. And I'm not sure if this could cause a duplicate content issue; however, they already add a date in the title tags and in the content to differentiate. I am not sure how I should handle these pages; is it possible to have Google prioritize the 'daily' published one? Say if I published one today: if I searched "Derivative Analysis" I would see the one which is dated today rather than the 'list-view' or any other older analysis.

    | jonmifsud
    0

  • Hey All, We discovered an issue where new product pages on our site were not getting indexed because a "noindex" tag was inadvertently being added to the <head> section when those pages were created. We removed the noindex tag in late April, and some of the pages that had not been previously indexed are now showing up, but others are still not getting indexed, and I'd appreciate some help on why this could be. Here is an example of a page that was not in the index but is now showing after removal of noindex: http://www.cloud9living.com/san-diego/gaslamp-quarter-food-tour And here is an example of a page that is still not showing in the index: http://www.cloud9living.com/atlanta/race-a-ferrari UPDATE: The above page is now showing after I manually submitted it in WMT. I had previously submitted another page like a month ago and it was still not indexing, so I thought the manual submission was a dead end. However, it just so happens that the above URL just had its page title and H1 updated to something more specific and less duplicative, so I am currently running a test to see if that's the problem with these pages not indexing. Will update this soon. Any suggestions? Thanks!

    | GManSEO
    0

  • Hello, I was hoping to get the SEOmoz community’s advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda.  (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to googlebot so that it would easily recognize the removal of content.  The conversation implied that it is not just important to remove the content, but also to give google the ability to recrawl that content to indeed confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
    1. We cut the pages
    2. We set up permanent 301 redirects for all of them immediately.
    3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages). When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way... I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers for the following questions: 1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this?
    Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)? 2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did? Thank you in advance for your help,
    Eric

    | Eric_R
    1
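The 410 approach from the thread cited above can be sketched as a tiny WSGI layer that answers "410 Gone" for removed paths instead of 301-redirecting them, so a crawler can revisit and confirm the content is gone. The paths and app wiring here are hypothetical, not the poster's actual site:

```python
# Sketch: serve "410 Gone" for cut pages rather than 301-redirecting them,
# so a recrawl can confirm the removal. Paths below are made up.
REMOVED_PATHS = {"/old-thin-page", "/another-cut-page"}

def gone_middleware(app):
    """Wrap a WSGI app; short-circuit removed paths with a 410."""
    def wrapped(environ, start_response):
        if environ.get("PATH_INFO") in REMOVED_PATHS:
            start_response("410 Gone", [("Content-Type", "text/plain")])
            return [b"This page has been permanently removed."]
        return app(environ, start_response)
    return wrapped

def demo_app(environ, start_response):
    """Stand-in for the real site: everything else is a normal 200."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

app = gone_middleware(demo_app)

def status_for(path):
    """Call the app directly and capture the status line it sends."""
    captured = []
    app({"PATH_INFO": path}, lambda status, headers: captured.append(status))
    return captured[0]

print(status_for("/old-thin-page"))   # 410 Gone
print(status_for("/current-page"))    # 200 OK
```

On Apache or nginx the equivalent is a `Gone` / `return 410` rule per removed path; the point is the status code, not the framework.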

  • We just signed a new client whose site is really lacking in terms of content. Our plan is to add content to the site in order to achieve some solid on-page optimization. Unfortunately the site design makes adding content very difficult! Does anyone see where we may be going wrong? Is added content really the only way to go? http://empathicrecovery.com/

    | RickyShockley
    0

  • An example URL can be found here: http://symptom.healthline.com/symptomsearch?addterm=Neck%20pain&addterm=Face&addterm=Fatigue&addterm=Shortness%20Of%20Breath A couple of questions: Why is Google reporting an issue with these URLs if they are marked as noindex? What is the best way to fix the issue? Thanks in advance.

    | nicole.healthline
    0
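One sanity check worth running on URLs like the one above: confirm the served HTML really carries the robots noindex meta tag (and remember Google has to be able to crawl the URL to see the tag; a robots.txt block hides it, and Google can still report blocked URLs it discovers via links). A stdlib-only sketch of the check:

```python
# Sketch: detect whether a page's HTML contains a robots meta tag that
# includes "noindex", using only the standard library.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Scan start tags for <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<head><meta name="robots" content="noindex,follow"></head>'))
# True
```

In practice you would fetch each reported URL and run this over the response body to verify the directive is actually being served.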

  • I contracted with a local web design firm - a highly recommended firm - to redo my law practice's WordPress site. The redesign was done in early April. After the redesign I saw a large drop in rankings across all of my keywords, lost internal page rank, and had a big traffic drop. The site is www.toughtimeslawyer.com. There were a couple of issues that contributed to it, but I'm not sure how to rebuild. The internal URL structure changed completely; I wasn't aware of this until the site went live. I didn't have a sitemap for about a week, then the sitemap plugin they used was not very good and was showing errors in Webmaster Tools. Last week, I replaced it with Yoast's SEO plugin. The biggest problem is that they set up a subdomain, old.toughtimeslawyer.com, without asking me or telling me. The subdomain had all of my content on it. It was not blocked with robots.txt, and it is being cached by Google. I just discovered it today, when I was doing something in my cPanel. I assume that this is creating a duplicate content problem with Google. I'm not sure what steps to take to recover. I am especially concerned about the subdomain old.toughtimeslawyer.com and the best way to handle it with the search engines. Thanks in advance, all advice is appreciated. I've been pulling my hair out for the last few weeks over my rankings.

    | ToughTimesLawyer
    0

  • Hello fellow Mozzers, I have a lot of experience with Magento and a little bit of experience with PrestaShop, and I am quite aware of their strengths and weaknesses regarding SEO. I was wondering which e-commerce CMS is the best for SEO. I am talking about the CMS as you download it. There are hundreds of plugins for the popular systems which improve their SEO power tremendously, but I'm interested in which CMS is the best right out of the box. Let me know what you think and why you think so. Thanks in advance 🙂

    | WesleySmits
    1

  • I work for a web and graphic design company. We're not a huge shop but we do fairly well. We're starting to dig into SEO, especially for ourselves. Our biggest problem is our backlinks and competition. We need to be able to rank for keywords like "web design boston" and "graphic design boston". Yet our competition has those locked down and only because of their backlinks. Normally I would say well okay lets look at what they're doing and do it better. The problem is most of their backlinks come from their clients websites that they themselves have designed and put a link on the footer of each page. We do that too but because we're smaller we don't have anywhere near as many clients as they do. I know I can try and rank for more "niche" keywords. But I want to know in all honesty what my options are for these same keywords. What realistic methods can I use to achieve the same kind of rankings they are?

    | iconadvertising
    0

  • Hi, I have a tool that I wish to give to sites; it allows the user to get an accurate idea of their credit score without giving away any personal data and without having a credit search done on their file. Due to the way the tool works, and to make the implementation on other people's sites as simple as possible, the tool remains hosted by me, and a one-line piece of JavaScript code just needs to be added to the code of the site wishing to use the tool. This code includes a link to my site to call the information from my server to allow the tool to show and work on the other site. My questions are: Could this cause a problem with Google as far as their link quality goes? Are we forcing people to give us a backlink to use the tool (in the eyes of Google), or will Google not be able to read the JavaScript / will it ignore the link for SEO purposes? Should I make the link in the code nofollow? If I should make the link nofollow, any tips on how to make the most of the opportunity from a link building or SEO point of view? Thanks for your help

    | MotoringSEO
    0

  • Can someone please recommend a good plugin to minify CSS so that my site loads faster? It currently loads more than 20 CSS files and I want to combine them all into a single stylesheet.

    | maestrosonrisas
    0

  • I read about some recommendations to noindex the URL of the site search.
    I checked in analytics that site search URLs generated about 4% of my total organic search traffic (<2% of sales). My reasoning is that site search may generate duplicate content issues and may prevent the more relevant product or category pages from showing up instead. Would you noindex this page or not? Any thoughts?

    | lcourse
    0

  • Hi Everyone, first I thank all the people who read this post and offer to help me. 1. My client is going to start a weekly campaign starting tomorrow; it's an online flower delivery company. 2. They have already set up Google AdWords for it, but he wanted to know if there is any other way he can bring visitors to this campaign. 3. I have set up email blasting; apart from this, is there anything you can suggest, with a little explanation too?

    | dineshsearch
    0

  • Hello, I would like to know if it is considered bad by Google to rank every single spot in the top 10 with the same keyword? All websites cover the same topic as this keyword, but have unique content. Those websites don't link to one another. To be clear, I am not asking if it is a good idea because it will be harder to rank, find links, etc., or because the last positions don't bring enough traffic. I really just would like to know if it is against Google guidelines? Any thoughts or sources are greatly appreciated.

    | EndeR-
    0

  • Hi, I am on a new project for a great prospective client. After a company split, the IT dept says, "Our IT Infrastructure team doesn't provide admin access to anyone (internal or external) to any of our servers/sites for security, maintainability, and quality reasons. We would be more than happy to install and configure any software you would like and provide you read access to any output you might need." I am not aware of any such software, or what help "read only access" is in getting tasks accomplished. Tomorrow a.m. I meet with four others on the project that I am to assign SEO tasks to. Everyone wonders how to accomplish marketing, web redesign and SEO tasks efficiently. They are told to drop off new content on a backup CD. My contact has asked that I be permitted access... what can one do when IT says "no"?

    | jessential
    0

  • Howdy Mozzers, I've been a Moz fan since 2005, and been doing SEO since. This is my first major question to the community! I just started working for a new company in-house, and we've uncovered a serious problem. This is a bit of a long one, so I'm hoping you'll stick it out with me! Since the images aren't working, here's a link to the Google doc with images: https://docs.google.com/document/d/1I-iLDjBXI4d59Kl3uRMwLvpihWWKF3bQFTTNRb1R3ZM/edit?usp=sharing Background: The site has gone through a few changes in the past few years: Drupal 5 and 6 hosted at bcbusinessonline.ca, and now Drupal 7 hosted at bcbusiness.ca. The redesigned responsive site launched on January 9th, 2013. This included changing the structure of the URLs, such as categories, tags, and articles. We submitted a change of address through GWT shortly after the change. Problem: Organic site traffic is down 50% over the last three months. Below, Google Analytics and Google Webmaster Tools show the decline. (They used the same UA number for Google Analytics, so that's why the data is continuous.) Organic traffic to the site from January 2011 - dips in January are because the business crowd is on holidays. Google Webmaster Tools data exported for bcbusiness.ca starting as far back as I could get. Redirects: During the switch, the site went from bcbusinessonline.ca to bcbusiness.ca. The redirects were implemented as 302's on January 9th, 2013 to test; then on January 15th, they were all made 301's. Here is how they were set up: Original: http://www.bcbusinessonline.ca/bcb/bc-blogs/conference/2010/10/07/11-phrases-never-use-your-resume --301-- http://www.bcbusiness.ca/bcb/bc-blogs/conference/2010/10/07/11-phrases-never-use-your-resume --301-- http://www.bcbusiness.ca/careers/11-phrases-never-to-use-on-your-resume Canonical issue: On bcbusiness.ca, there are article pages (example) that are paginated. All of pages 2 to N had their canonical set to the first page of the article.
We addressed this issue by removing the canonical tag completely from the site on April 16th, 2013. Then, by walking through the Ayima Pagination Guide, we decided the immediate and least-work choice was to noindex, follow all the pages that simply list articles (example). Google algorithm changes (Penguin or Panda): According to the SEOmoz Google Algorithm Changes list, there are no releases that could have impacted our site in the February 20th ballpark. However - Sitemap: We have a sitemap submitted to Google Webmaster Tools, and currently have 4,229 pages indexed of 4,312 submitted. But there are a few pages we looked at where there is an inconsistency between what GWT is reporting and what a "site:" search reports. Why would the "Submit to index" button be showing if the page is in the index? That page is in the sitemap. Updated: 2012-11-28T22:08Z Change Frequency: Yearly Priority: 0.5 (GWT index stats from bcbusiness.ca) What we have looked at so far: The redirects are all currently 301's. GWT is reporting good DNS, server connectivity, and robots.txt fetch. We don't have noindex or nofollow on pages where we haven't intended them. Robots.txt isn't blocking GoogleBot, or any pages we want to rank. We have added nofollow to all 'Promoted Content' or paid advertising/advertorials. We had TextLinkAds on our site at one point, but I removed them once I started working here (April 1). Sitemaps were linking to the old URL, but are now updated (April).

    | Canada_wide_media
    1

  • Hello, My website has 112 internal exact-match sidebar links for my targeted keyword. I rank no. 5 for this keyword in Google and I was wondering if I should remove those links or not? I know that footer links are a no-no, but I am not sure about sidebar links. Any ideas? Regards, InigoOÜ

    | InigoOU
    0

  • Hi, Let's say you have a category which displays a list of 20 products and pagination of up to 10 pages. The root page has some content, but when you click through the paging the content is removed, leaving only the list of products. Would it be best to apply a canonical tag on the paging back to the root, or apply the prev/next tags? I understand prev/next is good for, say, a 3-part article where each page holds unique content, but how do you handle the above situation? Thanks

    | Bondara
    0
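The usual answer to the situation above (under the pagination guidance of the time) is to keep the listing pages indexable and mark the series with rel="prev"/rel="next", rather than canonicalizing pages 2..N to the root: a canonical back to page 1 tells Google the later pages are duplicates, which can hide the products they list. A sketch of generating the head tags, with an illustrative URL pattern:

```python
# Sketch: emit rel="prev"/rel="next" link tags for page `page` of a
# paginated category of `last` pages. The ?page=N pattern is illustrative.
def pagination_links(base, page, last):
    """Return the <link> tags for this page's position in the series."""
    def url(p):
        return base if p == 1 else f"{base}?page={p}"
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{url(page - 1)}">')
    if page < last:
        tags.append(f'<link rel="next" href="{url(page + 1)}">')
    return tags

for tag in pagination_links("http://example.com/category/", 2, 10):
    print(tag)
```

Page 1 gets only a "next" tag, the last page only a "prev" tag, and middle pages both, so the whole chain stays crawlable.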

  • Hey Everyone, we redirected an entire domain to a specific URL on another domain (not the homepage). We used a 301 Redirect, but I'm also wondering if I should use the Google Webmaster Tools "Change of Address" section to redirect. There is no option to redirect the old domain to the specific URL on the new domain within the "Change of Address" section. Thoughts?

    | M_D_Golden_Peak
    0

  • Can anyone think of any reasons why it would be a bad idea to use PowerPoint to create documents and then convert them to PDFs? Do you think this could cause any crawling issues for Google?

    | BlueLinkERP
    0

  • Hello fellow mozzers, I've seen a lot of discussion and confusion about whether you should use subfolders or subdomains when you have a website, a blog and a webshop.
    Of course with subfolders the PageRank will be more concentrated, since it's all on one domain. On the other hand, subdomains can be a better user experience, since you can focus on just the webshop or just the blog. I was wondering what you all think would be the best way to handle this.

    | WesleySmits
    0

  • Hi All, In order to avoid duplication errors we've decided to redirect old categories (merge some categories).
    In the past we have been very generous with the number of categories we assigned each post. One category needs to be redirected back to the blog home page (removed completely), while a couple of others should be merged. Afterwards we will re-categorize some of the old posts. What is the proper way to do so?
    We are not technical; is there a plugin that can assist? Thanks

    | BeytzNet
    0

  • Hi, We are an e-commerce company that has our own domain but also sells the same products on eBay and Amazon. What is the feeling on the same exact descriptions being used on different platforms? Do they count as duplicate content? Will our domain be punished/penalised, as our domain does not have as much authority as eBay or Amazon? We have over 5,000 products with our own hand-written product descriptions. We want our website to be the main place / have priority over the above marketplaces. What's the best suggestion/solution? Thanks,

    | Roy1973
    0

  • Hi all, I'm just wondering if anyone is able to help me with why my website lost practically all its ranking last October (2012). My website is here: http://bit.ly/nAOfNj Since early 2010, we have been ranking in the top 3 for our keyword when searched all around the country. Between end September and end October 2012, we started dropping (from 2nd to 8th, then in December 13th, January 18th place... and then March back up to 13th, now ~10th). The main problem seems to be that Google has changed how websites rank for our keyword (trampoline). In Brisbane, Australia (where we are based), we only rank in the local organic searches. We don't have a separate listing there anymore (with the meta description), even though we had a normal organic listing (and local listing) for the last 2 years! When searching from other states/suburbs further away, we dropped way off the first page. Our product is sold by resellers in 400 stores around Australia, so it's not like we're just in Brisbane. Has anyone experienced Google changing how they return results for a specific keyword like this? Did they do it a lot towards the end of last year? We have a place page for Brisbane, but for some reason I have little to no control over it (a Places/Local+ stuff-up means I can't manage the page on Local+, can't add pictures/videos etc). My boss suggested we even try deleting the maps page or our Local+ page to get out of there. We don't get anywhere near as much traffic through the local listing as a normal listing... I'm not sure if that's best though? From what I can tell, the only Google algorithm update that may have affected us at the time (October 9th) was the page layout update that penalised(?) sites that have a lot of "ads" above the fold. Our website is designed to have splash banners on the top of every page to either promote our own product, competitions or the athletes we sponsor.
Up until last week, the banners were always 500px high on larger-screen desktops and 300px high on smaller desktops, laptops, iPad etc. I have recently changed them to all be 300px high to test, but I imagine I'll have to wait a while? Is this the kind of content that Google means by "ads above the fold"? I've spent the last 4 weeks working on our SEO, from HTML validation, to rich snippets, content optimisation, a lot more internal linking, setting up some location-based content, doing a lot of keyword research, and now starting to work on cleaning up our blog and creating some real sharable content that we'll share on our Facebook. I really just wish I knew where the problem was so I could tackle it 😞 Any advice would be GREATLY appreciated!!

    | Vuly
    0

  • Hello, I'm a relatively active member of a website support forum for the Genesis Framework - studiopress.com/forums. Right now I have a plain text signature, but I am thinking about linking to my website (a WordPress development firm) with branded anchor text. I understand the risk of Penguin for non-branded, keyword-heavy anchor text -- but if I do this with simply a link to my website's homepage or Genesis development page, do you think I'd be at risk? Thanks, Zach

    | Zachary_Russell
    1

  • Hello, following the Google image search update (http://googlewebmastercentral.blogspot.com/2013/01/faster-image-search.html), could you please let me know what the status of image optimization is, and also what the best practices are? Thank you so much. I appreciate it. Vijay

    | vijayvasu
    0

  • Our company hosts a booking engine that feeds to a lot of hotel properties in our area. We also own and run a site that is competing for a lot of the hotel related keywords. We are wondering if there would be a benefit (and how much) to moving  that booking engine to be hosted on that site that is competing for the hotel related keywords. On a subdomain would be easiest. Has anyone ever dealt with something like this before? I imagine it would help from an inbound link perspective since all these hotels will be pulling information from that site and when someone books it would send them to a page on that site, but could it also be seen as an outbound link from that site which might help the property sites? Any help with this would be really appreciated.

    | Fuel
    0

  • Hey Mozzers, I'm optimizing a small ecommerce site. The site URL directory structure seems all good & logical, BUT should I try for a flatter architecture - so that the individual products are at top level after the domain name in URLs? e.g.
    www.domain.com/first-item/
    www.domain.com/second-item/
    etc. etc. My current setup (I'm using the Woocommerce plugin in Wordpress): www.domain.com/shop/ (main shop page)
    www.domain.com/shop/category-name-1/
    www.domain.com/shop/category-name-2/
    www.domain.com/shop/category-name-3/ with products appearing as:
    www.domain.com/product/first-item/
    www.domain.com/product/second-item/
    etc. I've researched some big-brand ecommerce sites and most seem to use domain.com/amazing-product/ even if the product itself is many categories or sub-categories down, i.e. Homepage > Home & Furniture > Furniture > Living Room Furniture > Coffee Tables. As I say, the information architecture makes sense from a user point of view, but I'm guessing the individual products would stand more chance of ranking if directly following the domain name? WooCommerce, although flexible, doesn't seem to do this out of the box, so please send some advice before I go on a hacking and URL-rewriting mission! Thanks 🙂

    | GregDixson
    0

  • Hi, I am working with a company that has a .net site and a .ch website that are identical. Will this duplicate content have a negative effect on SERPs? Thanks, Ali B.

    | Bmeisterali
    0

  • It all started very well, but now rankings just keep going down and down even though I try to follow the proper guidelines. Could anyone give me some advice if I PM the link?

    | y3dc
    0

  • Hey, I changed my hosting provider on 22 May for better server hardware, and many of my rankings dropped on Google! My website only launched on 17 Jan 2013. You may want to look at my anchors; you can find them in the attached image. Total backlinks: 4.9K. Is this a temporary situation caused by the IP address change (new hosting provider), or by Penguin 2.0? nw68v.jpg

    | umutege
    0

  • My website appends a parameter to the end of its URLs to pass visitor details to the back-end. The setup is similar to this site - http://sanfrancisco.giants.mlb.com/index.jsp?c_id=sf - with a 302 redirect from the normal link to the one with the additional info, and a canonical tag on the URL without the extra info (the normal one here being http://sanfrancisco.giants.mlb.com). However, when I used www.xml-sitemaps.com to create a sitemap, it did so using the URLs with the extra info on the links... What should I do to create a sitemap using the normal URLs (which are the ones I want to be promoting)?
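One workaround (a sketch, not the asker's setup): skip the third-party crawler and generate the sitemap directly from the list of canonical URLs you already maintain, so the parameterised 302-target URLs can never leak into it. The URL list below is a placeholder taken from the question's example; swap in your real canonical URLs.

```python
# Build sitemap.xml from an explicit list of canonical URLs, instead of
# crawling the site (a crawl picks up the 302-target URLs that carry the
# extra visitor-info parameters).
import xml.etree.ElementTree as ET

canonical_urls = [
    "http://sanfrancisco.giants.mlb.com/",
    # ...add the rest of your canonical URLs here
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in canonical_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url  # special chars are escaped automatically

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```

Generating the file from the same canonical list that drives your canonical tags keeps the sitemap and the tags aligned, which is the state search engines expect.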

    | theLotter
    0

  • I've recently started a website that is based on movie posters. The site has fundamentally been built for users and not SEO but I'm wondering if anyone can see any problems or just general advice that may help with our SEO efforts? The "content" on the website are the movie posters. I know Google likes text content, but I don't see what else we could add that wouldn't be purely for SEO. My site is: http://www.bit.ly/ZSPbTA

    | whispertera
    0

  • Hi, I've been working on a real estate website for the last 2 years. I had reached top positions in both local and organic search results, but for the last 2 months I've been losing rankings on some keywords in both local and organic search. I've been doing bookmarking, guest posting related to real estate, and social media promotion, but I'm getting no results. That's why I'm looking for an SEO person for my website. Thanks, and waiting for your feedback

    | KLLC
    0

  • Hi guys, I'm looking for an SEO person who can help me with my project. I've been losing rankings day by day for the last 2 months. For more detail and pricing, please PM me. Thanks

    | KLLC
    0

  • I have two sites with the same theme - buying cars. I am going to remove one of the sites from being crawled permanently (i.e. junkthecars.com) and point the domain, via 301, to another similar-theme site (sellthecars.com). The purpose is simply to pass the SEO link juice from one site to the other as we retire junkthecars.com. Is forwarding the domain OK, and is it the best way for the search engines to increase the rank of sellthecars.com (we hate to waste the link work done on junkthecars.com)? What dangers should I look for that could hurt sellthecars.com if we do the redirect at the domain level?
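A 301 at the web-server level is the standard way to forward a retired domain. A minimal .htaccess sketch, assuming junkthecars.com is served by Apache with mod_rewrite enabled (adjust for your actual server and protocol):

```apache
# Permanently (301) redirect every URL on the retired domain to the
# same path on the surviving domain, preserving link equity.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?junkthecars\.com$ [NC]
RewriteRule ^(.*)$ http://sellthecars.com/$1 [R=301,L]
```

Where the two sites' pages correspond, redirecting each old URL to its closest equivalent page (rather than everything to the homepage) is generally considered the safer way to pass the accumulated link value.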

    | bestone
    0

  • Canonical URL issue: the SEOmoz crawl of my site https://ladydecosmetic.com is showing duplicate page title and duplicate page content errors. I downloaded the error report CSV and checked. According to the report, the URL below contains duplicate page content:
    https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=40&brands=66&click=brnd The other duplicate URLs per the report are:
    https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&click=colorsu&brands=66 https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40 https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&brands=66&click=brnd On every one of these URLs (all 4) I have set a canonical URL pointing to the original, which exists (it is not a 404):
    https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=0 So why are these duplicate page content issues still being reported? Please give me an answer ASAP.

    | trixmediainc
    0

  • Fellow mozzers, Today I got an interesting question from an entrepreneur who plans to start about 100-200 webshops on a variety of subjects. His question was how he should link them together. He was scared that if he simply made a page on every website like www.domain.com/our-webshops/ listing all of the webshops, he would get penalised because it looks like a link farm. I wasn't 100% sure which advice to give him, so I told him I needed to do some research on the subject to make sure I'm right. I had a couple of suggestions myself. 1. Split the webshops into three groups: group A links to B, B links to C, and C links to A. I realize this is far from ideal, but it was one of the thoughts that came up. 2. Divide all the webshops into different categories, for example webshops aimed at different holidays, webshops aimed at mobile devices, etcetera. This way you link only the relevant webshops together instead of all of them. Still not perfect. 3. Create a page on a separate website (such as a company website) where the /our-webshops/ page lives, so you only have to place a link back from each webshop to this page. I've seen lots of webshops using this technique and I can see why they choose to do so. Still not ideal in my opinion. Those are basically my first thoughts on the subject. I would appreciate any feedback on the methods described above or, even better, a completely different strategy for handling this. For some reason I keep thinking I'm missing the most obvious and best method. 🙂

    | WesleySmits
    0

  • We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and the 301-redirected new URL as the sources of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the 301 from the old to the new URL. Question: why is Google Webmaster Tools reporting duplicate content for these pages?

    | SEOAccount32
    0

  • Hi mozzers, I would like to know: what is the difference between link rel="canonical" and meta name="canonical", and is it dangerous to have both of these elements together? One of my client's pages has both, which bothers me because I only know link rel="canonical" as the relevant way to handle duplicates. Thanks!
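For reference, only the link element form is a recognized convention; search engines do not document any "meta name=canonical" element, so it is simply ignored. A minimal example (the URL is a placeholder):

```html
<head>
  <!-- Correct: the canonical is declared with a <link> element in the <head> -->
  <link rel="canonical" href="https://www.example.com/preferred-page/" />
  <!-- A <meta name="canonical" ...> element is not a recognized standard
       and can safely be removed; it adds confusion, not signal. -->
</head>
```

Having both should not be dangerous as long as the link element points to the correct URL, but the non-standard meta version contributes nothing and is best deleted.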

    | Ideas-Money-Art
    0

  • I submitted a link disavowal file for a client a few weeks ago and before doing that I read up on how to properly use the tool. My understanding is that if you received a manual penalty then you need to submit a reconsideration request after cleaning up links. We didn't receive a penalty so I didn't submit one. I'm wondering if anyone has used the tool (not stemming from a penalty) and if you did or didn't submit a recon. request, and what the results were. I've read that if a site is hit algorithmically, then filing a recon request won't help. Should I just do it anyway? Would be great to hear from anyone who has gone through a similar situation.

    | Vanessa12
    0
