
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • After redesigning my old Drupal website and launching a new "improved" WordPress version, the new site is performing badly. Rankings are poor and conversions don't occur. I realize that my new design is bad (no call to action, poor structure, text heavy). New business inquiries have ceased. The site contains 450 pages. After spending $25,000 and a year of my life, I can see the new version is not an improvement! What would be the effect of reinstating the old version of the site and doing 301 redirects back to it? Would the old rankings be restored? I need to decide whether I should revert or focus on fixing the flaws in the improved design. Any thoughts?? Thanks,
    Alan

    | Kingalan1
    0

  • One of the sites I work with got this message: "http://www.mysite: Unnatural inbound links June 27, 2013. Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines. As a result, Google has applied a manual spam action to mysite.com/. There may be other actions on your site or parts of your site." But when I go to Manual Actions it says: "No manual webspam actions found." So which is it??? I have been doing link removal, but now I am confused about whether I need to do a reconsideration request or not.

    | netviper
    0

  • Just got my newsletter from LinkedIn with a link to an interesting article about the part of a webpage that gets the most viewership. Are there any page tools out there that will tell me the pixel measurements on a page as shown in the article?

    | AWCthreads
    0

  • Hello Everyone, While the whole world is debating EMDs and whether one should use or avoid them, many bloggers from India still pull very good traffic from EMDs alone. Recently, I was researching and found a very striking example. Keyword: "sad shayari hindi" (Google India search). The top 7 positions are occupied by a single domain with multiple URLs. I would like to request everyone to check the screenshot and comment.

    | pushkar63
    0

  • Hi all, A bit of a long question so apologies in advance, but please bear with me... My client has received an 'Unnatural Inbound Links' warning and it is now my task to try and resolve it through a process of:
    Highlighting the unnatural links
    Requesting that the links be removed (via webmaster requests)
    Possibly using the Disavow Tool
    Submitting a Reconsideration Request
    So I downloaded my client's link profile from both OSE and GWT in CSV format and compared; the number of links returned was considerably higher in GWT than in OSE...? I then set about going through the links, first filtering them into order so that I could see blocks of links from the same URL, and highlighted them in colours:
    Red - Definitely need to be removed
    Orange - Suspect, need to investigate further
    Yellow - Seem to be OK but may revisit
    Green - Happy with the link, no further action
    So to my question: is it 'black & white', a simple case of 'good link' v 'bad link', or could there be some middle ground? (Am I making this process even more confusing than it actually is?) As an example, here are some 'Orange' URLs: http://www.24searchengines.com/ (not the exact URL, as it goes to the travel section, which is my client's niche) - this to me looks spammy and I would normally 'paint it red' and look to remove it. However, when I go to the 'contact us' page (http://www.24searchengines.com/texis/open/allthru?area=contactus) and follow the link to remove from the directory, it takes me here: http://www.dmoz.org/docs/en/help/update.html DMOZ??? My client has a whole heap of these types of links: http://www.25searchengines.com/ http://www.26searchengines.com/ http://www.27searchengines.com/ http://www.28searchengines.com/ ...and many, many more!! Here is another example: http://foodys.eu/ http://foodys.eu/2007/01/04/the-smoke-ring-bbq-community/ ...plus many more... My client is in the 'cruise niche' and as there is a 'cruise' section on that site I'm not sure whether this constitutes a good, bad or indifferent link! Finally, prior to me working with this client (1 month), they moved their site from a .co.uk to a .com domain and redirected all links from the .co.uk to the .com (according to GWT, over 16k have been redirected) - a lot of these 'spammy' links were to the .co.uk and have thus been redirected. Should I even consider removing the redirection, or will that have severe consequences? Apologies for the long (long) post, I know I'm heading in the right direction but some assurance wouldn't go amiss! 🙂 Many thanks Andy
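
    For anyone at the same stage, the disavow file itself is just a plain .txt upload, one entry per line; a minimal sketch of the format, reusing domains mentioned above (the comment dates are made up):

      # removal requests sent 2013-08-01, no response received
      domain:24searchengines.com
      domain:25searchengines.com
      # a single page rather than a whole domain
      http://foodys.eu/2007/01/04/the-smoke-ring-bbq-community/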

    | TomKing
    0

  • Our site's last red-flag issue is the "eliminate render-blocking JavaScript and CSS" message. I don't know how to do that, and while I might be able to spend hours/days cutting and pasting and guessing until I made progress, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do it? Also, if there is no plugin (and it didn't look like there was when I looked), how long do you think this would take someone who knows what they're doing to complete? The site is: www.kempruge.com Thanks for any tips and/or suggestions, Ruben
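
    For context, the change PageSpeed asks for, whether done by a plugin or by hand, usually boils down to markup along these lines; a minimal sketch with a hypothetical file name:

      <!-- before: this blocks rendering while it downloads and executes -->
      <script src="/js/site.js"></script>

      <!-- after: defer execution until the HTML has been parsed -->
      <script src="/js/site.js" defer></script>

      <!-- for CSS, inline the few rules needed above the fold
           so the first paint doesn't wait on the full stylesheet -->
      <style>/* critical above-the-fold rules here */</style>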

    | KempRugeLawGroup
    0

  • Hello forum! I have a question about subdomains vs. subfolders for a new sub-brand for a company. The company is looking at creating a sub-brand delivering a different service to the parent company. It is complementary in a sense, but it would need a very different marketing strategy. It is not trying to 'hide' its parent brand at all, but instead would leverage the parent brand as added social proof. I've read that creating a subdomain essentially means starting from scratch in terms of SEO, and that a subfolder would better leverage the domain authority the root domain has accrued. However, creating a subfolder does not really gel with me, as it would not in my opinion provide a good experience for visitors. E.g. it's like running a website that sells electronics and having a subfolder marketing IT support services. Yes, there is some synergy -- but it can also lead to visitor confusion. I'd love your opinions on this! Carlo

    | carlod
    0

  • Hi All, Let's say you are a service provider, such as a garden landscaper, and over time your customers have placed reviews on Google Local Places. As part of your site's redesign, you're looking to implement "Rich Snippets Aggregated Reviews". Your organic natural search results should then include the reviews: aggregated star picture, aggregated review value and total reviews. Question #1: Is this possible?
    #2: What is the best way of achieving this in WordPress via a plugin?
    #3: Do you need to file for consideration with Google? Any other useful advice will be appreciated. Thanks Mark
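
    For reference, the aggregated stars in the SERP are driven by schema.org markup roughly like this; a minimal microdata sketch (the business name and figures are invented, and the numbers must match reviews actually shown on the page):

      <div itemscope itemtype="http://schema.org/LocalBusiness">
        <span itemprop="name">Example Garden Landscaping</span>
        <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
          Rated <span itemprop="ratingValue">4.6</span>/5
          based on <span itemprop="reviewCount">27</span> reviews
        </div>
      </div>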

    | Mark_Ch
    0

  • Hi Mozzers, We're in the process of re-developing and redesigning several of our websites, and moving them all onto the same content management system. At the moment, although the websites are all under the same brand and roughly the same designs, for various reasons they all either live on a separate domain to the main website, or are on a subdomain. Here's a list of what we have (and what we're consolidating):
    Main site - http://www.frenchentree.com/
    Property database - http://france-property.frenchentree.com/ (subdomain)
    Forum - http://www.france-forum-frenchentree.com/ (separate domain)
    Classified ads - http://www.france-classified-ads-frenchentree.com/ (separate domain)
    My question to you lovely people is: should we take this opportunity, through the redevelopment of the CMS, to put everything into subfolders of the main domain? Keep things as they are? Put each section onto a subdomain? What's best from an SEO perspective? For information - the property database was put onto a subdomain as this is what we were advised to do by the developers of the system. We're starting to question this decision though, as we very rarely see subdomains appear in SERPs for any remotely competitive search terms. Our SEO for the property database is fairly non-existent, and it only ever really appears in SERPs for brand-related keywords. For further info - the forum and classifieds were under a separate brand name previously, so keeping them on separate domains felt correct at that time. However, with the redevelopment of our sites, it seems to make more sense to either put them on subdomains or subfolders of the main site. Our SEO for the forum is pretty strong, though it has dwindled in the last year or so. Any help/advice would be very much appreciated. Thanks Matt
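
    If the subfolder route is chosen, the mechanical side is a host-level 301 pattern like this in the subdomain's Apache config or .htaccess; a sketch only, and the /property/ target path is an assumption:

      RewriteEngine On
      # map every URL on the property subdomain onto a subfolder of the main site
      RewriteCond %{HTTP_HOST} ^france-property\.frenchentree\.com$ [NC]
      RewriteRule ^(.*)$ http://www.frenchentree.com/property/$1 [R=301,L]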

    | Horizon
    0

  • My thinking is to make a list of the most linked-to and most trafficked error pages, and just redirect those, but I don't know how to get all that data because I can't even download all the error pages from Webmaster Tools, and even then, how would I get backlink data except by checking each link manually? Are there any detailed step-by-step instructions on this that I missed in my Googling? Thanks for reading!!

    | DA2013
    0

  • Hi all. This is my second pass at the problem. Thank you for your responses before, I think I'm narrowing it down! Below is my original message. Afterwards, I've added some update info. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13). This shows the issue pretty clearly: https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8 I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there. UPDATE: OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks. Any ideas would be much appreciated!

    | Blink-SEO
    0

  • Hello everyone, I have a question that is currently puzzling me, and I hope you can help me with it. On musicianspage.com (one of our websites), we show a list of online users embedded within the page which, as you may expect, changes all the time according to who's online at that moment. That list appears on every page of the site, so at any time any page on the site has different content and a different link profile (sometimes we have just a few users connected, other times we may have over 50 users connected at the same time). My question is: is such a dynamically-embedded list bad, good or neutral from an SEO standpoint? If it is bad, what do you suggest we do? Put it inside a frame? Use AJAX? Any thoughts and suggestions are very welcome! Thanks in advance to anyone reading this. All the best, Fabrizio

    | fablau
    0

  • Hey Moz'ers, I have created several blogs on different domains for the purpose of writing good content articles that contain 2-3 links per article going back to my website. They have been up for about 3-4 weeks. I am not seeing my results/links showing up in OSE; is this because it still needs more time, or is there something else I could be advised to look into? In theory these blogs will only contain 2-3 links from each domain to the site. I was also going to make sure the anchor text per link is different (keyword, brand name, random anchor like "click here"). Side note: How does this system sound as one small aspect of link building? Any red flags? Thanks for all the responses and advice.

    | MonsterWeb28
    0

  • We're using the Google snapshot method to index dynamic Ajax content.  Some of this content is from tables using pagination. The pagination is tracked with a var in the hash, something like: #!home/?view_3_page=1 We're seeing all sorts of calls from Google now with huge numbers for these URL variables that we are not generating with our snapshots.  Like this: #!home/?view_3_page=10099089 These aren't trivial since each snapshot represents a server load, so we'd like these vars to only represent what's returned by the snapshots. Is Google generating random numbers going fishing for content?  If so, is this something we can control or minimize?
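
    One way to stop those runaway values from costing a snapshot render each is to validate the variable before building the page; a rough sketch (PHP assumed, and the parameter name taken from the question):

      <?php
      // clamp the pagination var before generating an HTML snapshot
      $page = isset($_GET['view_3_page']) ? (int) $_GET['view_3_page'] : 1;
      $maxPage = 40; // hypothetical: however many pages the table really has
      if ($page < 1 || $page > $maxPage) {
          header('HTTP/1.0 404 Not Found'); // tells the crawler there's nothing here
          exit;
      }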

    | sitestrux
    0

  • So there are 2 pest control companies owned by the same person - Sovereign and Southern. (The two companies serve different markets) They have two different website URLs, but the website code is actually all the same....the code is hosted in one place....it just uses an if/else structure with dynamic php which determines whether the user sees the Sovereign site or the Southern site....know what I am saying? Here are the two sites:  www.sovereignpestcontrol.com and www.southernpestcontrol.com. This is a duplicate content SEO nightmare, right?

    | MeridianGroup
    0

  • Dear other Moz fans, We have an e-commerce store in Norway. Our main conversion to sale still happens in our physical store, but thanks to the descriptions and information we provide online.
    To warn you before you click: our store is a boutique for "erotic items". A nice one however, made by women for women and their men. We spend an enormous amount of time writing descriptions and information for (almost) every item online.
    We really want to protect our content (text information). What is the best practice to mark up "protection" of our hard-won content? Thank you for your time.
    Regards from the Flirt girls in Norway.

    | Monica_Flirt
    0

  • I'm evaluating a new client site which was built by another design firm. My question is: they are dynamically creating meta tags and I'm concerned that it is hurting their SEO. When I view the page source this is what I see: <meta name="keywords" id="keywordsGoHere" content="" /> <meta name="description" id="descriptionGoesHere" content="" /> <title id="titleGoesHere"></title> To me it looks like the tag values are not being added to the page; however, the title is showing when you view it in a browser, and if I use a spider view tool, it sees the title. I'm guessing it is being populated from a DB. So I'm a little concerned that the search engines are not really seeing the title and description. I'm not worried about the keywords tag. Can anyone shed some light on how this might work? Why might the text for the description not be showing in the page code, and will that hurt SEO? Thanks for the help!

    | BbeS
    0

  • Within one week, we've dropped from #3 on Page 1 of Google to Page 7 (similar on Bing). It looks like our traffic started to drop between 9/5 and 9/7 and has been in a steady, rapid decline ever since. 1000s of pages are indexed, but suddenly ranking poorly -- even for branded terms. History:
    --In January, we switched to a web redesign & new domain
    --In August, our hosting server was slow & kept crashing so we migrated our site to a new hosting company. We're not currently using the old hosting server. All domains, redirects, .htaccess files should now be correct and site speeds are improved.
    --In early September, our NEW hosting company had a DNS issue causing more slow speeds and downtime for about 1 wk. Originally they thought it was htaccess so they changed our htaccess file - no luck - then discovered it was DNS. DNS issue was finally resolved on September 6th -- one day before the penalty/traffic issue seemed to begin.
    -- According to GWMT, it looks like there were crawls completed around 9/4-9/5 What we've tried:
    --Webmaster Tools - Googlebot dropoff since 9/5 (see attached screenshot). Nothing flagged. No site health alerts. Fetch as Google works. No manual webspam actions found.
    -- W3C Link Checker, Screaming Frog SEO Spider, Xenu Link Sleuth, OSE (found some 4xx errors so we've updated those links)
    -- Majestic SEO - backlinks reviewed 9/3 to 9/8
    -- spoke to two different AdWords salespeople; unable to help
    -- Bing Webmaster Tools
    -- not showing organic search traffic since 9/6
    -- 15% fewer pages crawled this month
    -- top keywords are very odd -- stuff like "mt1 google apis" and "aaremel"
    -- there are 4xx crawl errors under Crawl Information. We've fixed those URLs but they still appear in Webmaster Tools
    -- some missing h1's and meta's, and dup titles, which we're working to fix
    -- spike in crawl errors 9/11-9/12 and again on 9/14-9/15 It's been one thing after another this year, but all issues are now resolved with the exception of this newly-discovered penalty. We also have sites on a separate hosting server (with a different hosting company) that rank just fine.

    | ddwilliamson
    0

  • Hey guys, Wondering if you good people could help me out on this one? A few months back (June 19) I disavowed some links for a client, having uploaded a .txt file with the offending domains listed. However, recently I've noticed some more dodgy-looking domains being indexed against my client's site, so I went about creating a new "Disavow List". When I went to upload this new list I was informed that I would be replacing the existing file. So, my question is: what do I do here? Make a new list with both the old and new domains that I plan on disavowing and replace the existing one? Or just replace the existing .txt file with the new file, because Google has recognised I've already disavowed those older links?

    | Webrevolve
    0

  • Hi guys, One of my main concerns as we start redesigning the site Trespass.co.uk is that the current pages, like this one http://www.trespass.co.uk/snow-sports/clothing/ski-jackets/womens-ski-jackets, are bordering on over-optimisation. Is this the case, given that each product listed in the URL above has "womens ski jacket" under each product? If we have 50 products on each product listing page with the product name + type of product, i.e. flora womens ski jacket, xyz mens waterproof jacket, are we over-optimising the page for the main keywords by having them under each product? Would that page be over-optimised for womens ski jackets? Thanks guys

    | Trespass
    0

  • Scenario
    The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example, we have 2 instances of the same link pointing to the same URL location.
    We have 4 unique links.
    In total we have 5 on page links. Question
    How many links would Google count as part of the link juice model?
    How would the link juice be weighted in terms of percentages?
    Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact? Any other advice or best practice would be appreciated. Thanks Mark
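
    As a rough illustration under the classic equal-split model (Google's actual weighting, e.g. whether only the first anchor for a URL counts, isn't published):

      5 on-page links, so each passes roughly 1/5 (20%) of the distributable equity
      /shoes is linked twice, so it receives 2/5 (40%)
      the home, about and contact URLs receive 1/5 (20%) each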

    | Mark_Ch
    0

  • I'm currently in the process of redesigning my site's 404 page. I know there are all sorts of best practices from a UX standpoint, but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl. Does putting links to "recent posts" or something along those lines allow the bot to continue on its way, or does the crawl stop at that point because the 404 HTTP status code is thrown in the header response?

    | brad-causes
    0

  • Should I be building backlinks to my backlinks? I have a fairly decent number of guest posts on decent-to-standard blogs and just wondered if I should build a few medium-quality backlinks to these. I understand that I should just build all links to my website, but there are a lot of opportunities on cheaper blogs (little better than article submission sites) that I would not really want my main site on. Would this help?

    | BobAnderson
    1

  • This question is for EGOL (if he's willing) and anyone else who wants to partake. EGOL is the best content writer I've ever run into, really. I'm wondering what his top 3 to 5 tips are on how to use graphical layout (font, images, graphics, organization, menu, etc.) to make content irresistible. A couple of assumptions: The content is written really well from a perspective of authority. Also, we're not including video on this one. Again, anyone is welcome to answer this. Thanks!

    | BobGW
    1

  • Is it worth putting together an image sitemap to submit to Google if you're not an e-commerce site? Also, if you're using a CDN like Amazon Web Services (cloudfront), can you even submit an image sitemap? According to Google you need to verify your CDN in webmaster tools if you're going to do so. https://support.google.com/webmasters/answer/178636?hl=en
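
    For reference, the image sitemap format is an extension of the regular XML sitemap, and the image URLs can sit on a different host such as a CDN (all URLs below are placeholders):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
        <url>
          <loc>http://www.example.com/some-article/</loc>
          <image:image>
            <image:loc>http://d1234abcd.cloudfront.net/images/photo.jpg</image:loc>
            <image:caption>Optional caption text</image:caption>
          </image:image>
        </url>
      </urlset>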

    | kking4120
    1

  • A team member is porting over documentation from a .org wiki that will be placed on the company's root domain. The problem with MadCap is that it uses frames as well as javascript navigation. Has anyone encountered this problem before? I'm unfamiliar with the software and the project is pretty far into the pipeline at this point (I'm new at the company as well). Any advice on work-arounds or alternatives would be greatly appreciated.

    | AnthonyYoung
    1

  • Hi All! We are in the process of migrating to Drupal and I know that I want to block any instance of /node/ URLs with my robots.txt file to prevent search engines from indexing them. My question is, should we set 301 redirects on the /node/ versions of the URLs to redirect to their corresponding "clean" URL, or should the robots.txt blocking and canonical link element be enough? My gut tells me to ask for the 301 redirects, but I just want to hear additional opinions. Thank you! MS
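
    For reference, the robots.txt side of the setup described is simply:

      User-agent: *
      Disallow: /node/

    One wrinkle worth weighing: URLs blocked in robots.txt can't be crawled, so search engines may never see a 301 or canonical placed on those /node/ URLs.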

    | MargaritaS
    0

  • Since subscribing to Moz, I have been focusing a lot on some of the more technical aspects of SEO. The current thing I am finding interesting is stopping link juice leaks. Here is a selection of the things I have done:
    Cloaked my affiliate links - see http://yoast.com/cloak-affiliate-links/
    Removed some HTML-coded social share links within the theme, and replaced them with a javascript plugin (http://wordpress.org/plugins/flare/)
    Used the Moz toolbar to view as Google, to see what Google is seeing.
    Removed some meta links at the bottom of blog posts (author etc.) that were duplicated.
    Now, I don't intend to go over the top with this, as links to social accounts on each page are there to encourage engagement etc., but are there any things you may have come across \ tips that people may have overlooked but perhaps should look out for? As examples of things that might be interesting to discuss: Are too many tags and categories bad? Do you index your tag and date archive pages? Does it matter?

    | Jonathan1979
    0

  • I have several complete websites with blogs set up for different geo locations and was considering whether forwarding them all to one domain would greatly benefit ranking. The blogs are all linked together and that is where most of the links come from. Would I benefit from 301 redirecting the domains?

    | WindshieldGuy-276221
    0

  • A local business has been smashing the SERPs for a while now, but since May (updates) it has been sliding and search visibility has plummeted. They came to me for help, so I ran a Dtox report and it's showing a lot of bad links (2,863 links in total). TOX1 means deindexed websites, so it was being linked to from a huge private blog network. My question is, with only 209 decent links pointing to them, are they ranking because Google hasn't picked up all the shitty links, or DESPITE them? I assume that after Google deindexes a domain, that link is wiped out of their index? Which would be the reason for the huge drop in rankings and visibility. However, they are still there or thereabouts for 40% of their keywords. What's the best course of action here, do you think? They haven't had a penalty (as far as I know). Should I proceed to disavow? Or leave them to drop away and just build quality links? I don't want to disrupt anything at the moment; they still do well in Bing. They say their rankings are slowly sliding. Any ideas would be good!

    | jasonwdexter
    1

  • Hi, On most of our pages, JavaScript is displaying our main content, so it doesn't show up in the page source and I assume isn't being crawled by Google to the best of its ability. It's also not showing up in Moz's page grader and crawl results, making analysis and testing harder. What's the easiest way, without having to completely redo our website, to have this content crawled by search engines and Moz?

    | S.S.N
    0

  • I’m working with a company that got hit by Penguin 2.0. They’re going to switch to white-hat only for a few months and review analytics before considering repairing the penalty. In the meantime, would it make sense to focus less SEO effort (on-site optimization, link building, etc.) on any pages or keywords that were penalized or hit hardest? Or are those the pages we should work on the most? Thanks for reading!

    | DA2013
    0

  • Hi All, I've found multiple threads about previous issues but I haven't found any tailored to my specific question. I know there are a large number of factors, so I wanted to see if any other individuals have run into this previously. We are currently in a centralized position in a major city. We are discussing moving the main office about 15 miles away into another city, moving us out of the main city where we have been for the past 3 years. The city where we are currently located has a lot more geo search volume compared to the new city's search terms and variants thereof. If we move, will our local rankings drop when someone searches in the city where we were previously? How long would it take for this ranking to fall? Or would we still rank because we are moving a short distance away and have a large number of citations there? I know we would need to change over all our online directories, on-page etc. Any other suggestions for a smooth transition? I know there are many factors that go into this, and any past experience, guidance and/or assistance is greatly appreciated. Cheers!

    | PRKEL
    0

  • Hello, What would be a good approach to gain backlinks for this site: www.nlpca.com The owners don't have much time to write content. I as the consultant have time but do not have the expertise the owners do. The people that run the site are authorities in the field. Thanks!

    | BobGW
    0

  • We have inherited a site that has a Joomla CMS "showroom" front-end and a Magento "store room" for checkout etc. Question - As the site's main pages are in the CMS section, should we:
    make all Magento product pages canonical to the main sections/product pages within the CMS (even though there are no duplicate content issues)
    "no index" the product pages
    index them but indicate low page value in the sitemap
    do something else? 🙂
    Thanks for any and all input!
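
    If the canonical option wins, each Magento product page would carry a link element pointing at its CMS counterpart; a one-line sketch with invented URLs:

      <link rel="canonical" href="http://www.example.com/showroom/some-product/" />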

    | TheNorthernOffice79
    0

  • Hi - I've read through the forum, and have been reading online for hours, and can't quite find an answer to what I'm searching for. Hopefully someone can chime in with some information. 🙂 For some background - I am looking closely at four websites, trying to bring them up to speed with current guidelines, and recoup some lost traffic and revenue. One of the things we are zeroing in on is the high amount of outbound links in general, as well as inter-site linking, and a nearly total lack of rel=nofollow on any links. Our current CMS doesn't allow an editor to add them, and it will require programming changes to modify any past links, which means I'm trying to ask for the right things, once, in order to streamline the process. One thing that is nagging at me is that the way we link to our images could be getting misconstrued by a more sensitive Penguin algorithm. Our article images are all hosted on one separate domain. This was done for website performance reasons. My concern is that we don't just embed the image via , which would make this concern moot. We also have an href tag on each to a 'larger view' of the image that precedes the img src in the code, for example - We are still running the numbers, but as some articles have several images, and we currently have about 85,000 articles on those four sites... well, that's a lot of href links to another domain.  I'm suggesting that one of the steps we take is to rel=nofollow the image hrefs. Our image traffic from Google search, or any image search for that matter, is negligible. On one site it represented just .008% of our visits in July. I'm getting a little pushback on that idea as having a separate image server is standard for many websites, so I thought I'd seek additional information and opinions. Thanks!
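
    For clarity, the proposed change amounts to the following (host and file names hypothetical):

      <!-- current: a followed link to the image domain -->
      <a href="http://images.example.com/photo-large.jpg">
        <img src="http://images.example.com/photo-thumb.jpg" alt="Article image" />
      </a>

      <!-- proposed: nofollow the href so it passes no link equity -->
      <a href="http://images.example.com/photo-large.jpg" rel="nofollow">
        <img src="http://images.example.com/photo-thumb.jpg" alt="Article image" />
      </a>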

    | MediaCF
    0

  • How are date ranges interpreted by Google - i.e. if you type "1993-2003", does Google know 1995 is included and should be referenced for a query? What is the best practice for an ecomm site when it comes to a landing page for multiple years? Should we list out each year ("2003, 2004, 2005...", which looks spammy), go with a full range (1993-2003), or does a two-digit range suffice (88-95)?

    | andrewv
    0

  • Hello,
    Now that the Keyword Tool is gone, how can I see [exact match] search volumes on Google?
    Thank you,
    Cornel

    | Cornel_Ilea
    0

  • SEO has really moved away from the nitty-gritty analysis of backlinking factors, link wheels, and the like, and has shifted to a more holistic marketing approach. That approach is best described around Moz as "Real Company S#it". RCS is a great way to think about what we really do, because it is so much more than just SEO or just social media. However, our clients and business owners do want to see results and want them quantified in some way. The way most of our clients understand SEO is that by ranking high on specific terms or online avenues they have a better chance of generating traffic/sales/revenue. They understand this more in the light of traditional marketing, where you pay for a TV ad and then measure how much revenue that ad generated. In the light of RCS and the need to target a large number of keywords for a given client, how do most PROs handle this situation, where you have a large number of keywords to target but with RCS? Many I've asked tend to use the traditional approach of creating a single content piece geared towards a given target keyword. However, that approach can get daunting if you have, say, 25 keywords that a small business wants to target. In that case, is it not really a matter of scaling down the client's expectations? What if the client wants all of the keywords and has the budget? Do you just ramp up your RCS content creation efforts? It seems that you can quickly overdo it and run out of RCS content to produce.

    | AaronHenry
    0

  • Hi guys, this time it's me asking for help :D. I have a client with a Magento 1.7.0.0 site: www.mybomboniere.it I audited it and found tons of issues, but the one that worries me most is the faulty canonicalization, which is causing serious duplicate content problems. I'm not new to Magento, hence the first things I did were: First, going to System > Configuration > Catalog > Search Engine Optimization and setting the Use Categories Path for Product URLs option to "No". Doing so I removed all the duplicated product pages. Second, going to System > Configuration > Catalog > Catalog > Search Engine Optimization and setting the Use Canonical Link Meta Tag For Categories and Use Canonical Link Meta Tag For Products options to "Yes". Having done this, I should see URLs with sort parameters carrying the URLs without them as canonical. The BIG PROBLEM is that even though I did that, I am still not seeing any rel="canonical" tag added to the code. I've tried to figure out the reason for this, but - sincerely - I cannot find one. Secondly, the client created so many categories and subcategories that - honestly - the best thing would be to start cutting some of them. But one thing is what is correct in theory, another what the client desires, and she does not desire cutting any subcategory.
    That means some risk being substantial duplicates of others. The correct choice would be to canonicalize the overly identical subcategories to a main one... but this is not possible using the default Magento functions. So, either use an SEO extension (but which one is the best for Magento 1.7.0.0? The Yoast plugins seem outdated), or use a solution like the second option proposed in this post: http://www.adurolabs.com/blog/technical/how-to-add-rel-canonical-in-magento. The doubt is that the post presents it for product pages, not category ones. Hence, is it correct for them also, or do you have other suggestions? Sorry for the long question, but any help will be much appreciated :). Ciao Gianluca
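
    While debugging why the built-in setting outputs nothing, a hand-rolled fallback in the theme's head template can do the same job; a sketch only (Magento 1.x assumed, the template path is an example, and the query-string stripping is deliberately crude):

      <?php
      // e.g. in app/design/frontend/<package>/<theme>/template/page/html/head.phtml
      $current   = Mage::helper('core/url')->getCurrentUrl();
      $canonical = strtok($current, '?'); // drop sort/filter parameters
      ?>
      <link rel="canonical" href="<?php echo htmlspecialchars($canonical); ?>" />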

    | gfiorelli1
    0

  • It would be helpful if you can share .htaccess guides you're currently using. Thanks in advance! 🙂
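
    To get the thread going, the pattern most guides open with is hostname canonicalization; a sketch with a placeholder domain:

      RewriteEngine On
      # 301 non-www to www so only one hostname gets indexed
      RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
      RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]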

    | esiow2013
    0

  • Hi guys, I hope you can help solve a mystery for me! My site FranceForFamilies.com has been around for 9 years and has always ranked well - at least until I launched a new WordPress version earlier this year. The purpose of the relaunch was to improve the look of the site, so I kept the content and meta titles the same but created a new design. However, from the day of the new launch the search engine rankings have plummeted, to the point where most seem to have disappeared altogether. I have found that when Moz crawls the site, it only crawls one page. I asked the Moz team about this and they said that the site is returning a 403. They also tested this using curl and received a 406 response: curl -I www.franceforfamilies.com/ HTTP/1.1 406 Not Acceptable However, when I check our Google Webmaster Tools I can't recreate the issue. I don't really know what is going on, and I don't have the technical knowledge to solve this - can you help? Thanks, Daniel
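
    A quick test for whether the server is filtering by User-Agent (a common cause of 403/406 responses from security modules and misconfigured .htaccess rules) is to repeat the request with different UA strings; rogerbot is Moz's crawler:

      curl -I -A "rogerbot" http://www.franceforfamilies.com/
      curl -I -A "Googlebot" http://www.franceforfamilies.com/
      # compare against a browser-like UA
      curl -I -A "Mozilla/5.0" http://www.franceforfamilies.com/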

    | LeDanJohnson
    0

  • I have some problems understanding how we should set the URL on the image part of the page to replace thumbnail.jpg. Example: <span itemprop="name">Jane Doe</span> <img src="janedoe.jpg" itemprop="image" /> Adapting this to my site (Dario Vieira Pereira)
    seems to be correct and it works. But my problem is when I want to do a VideoObject: I cannot make the thumbnail appear on the video. Example: Vídeo: Instalaciones clínica dental Barcelona Propdental Video de las instalaciones de la nueva clínica dental Barcelona de Propdental. Tour virtual por las instalaciones del dentista. Can anyone help me with what I am doing wrong? I cannot understand what is missing from my VideoObject schema compared with Google's examples.
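
    For comparison, a minimal VideoObject block with a thumbnail, using only documented schema.org properties (the URLs, date and duration are invented):

      <div itemscope itemtype="http://schema.org/VideoObject">
        <h2 itemprop="name">Instalaciones clínica dental Barcelona Propdental</h2>
        <meta itemprop="thumbnailUrl" content="http://www.example.com/video-thumb.jpg" />
        <meta itemprop="uploadDate" content="2013-09-01" />
        <meta itemprop="duration" content="PT1M30S" />
        <span itemprop="description">Video de las instalaciones de la nueva clínica
          dental Barcelona de Propdental.</span>
        <!-- the player embed goes here -->
      </div>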

    | maestrosonrisas
    0

  • I currently have a website set up so that http://www.example.net/ redirects to http://www.example.net, but http://www.example.net/ has more links and a higher page authority. Should I switch the redirect around? Here's the Open Site Explorer metrics for both:
    http://www.example.net/ - Domain Authority: 38/100, Page Authority: 48/100, Linking Root Domains: 112, Total Links: 235
    http://www.example.net - Domain Authority: 38/100, Page Authority: 45/100, Linking Root Domains: 18, Total Links: 39

    | kbrake
    0

  • Hi all, I'm currently in the process of revitalizing my company's blog. Currently, the blog sits on a subdomain (blog.rootdomain.com). SEO best practice dictates that I should move this (and 301 redirect the old URLs) to rootdomain.com/blog to concentrate link equity and avoid the risk of having search engines treat the subdomain as separate from the root domain. However, the PageRank Status extension for Chrome is reporting that the PR for the blog on the subdomain and the root domain are the same. Is there any benefit to migrating the subdomain to a subdirectory? Is that data accurate enough to base decisions off of?

    | brad-causes
    0

  • Hello, The search engines have indexed a sub-domain I did not want indexed; it's on old.domain.com and dev.domain.com. I was going to password-protect them, but is there a best-practice way to block them? My main domain's default robots.txt says:
    Sitemap: http://www.domain.com/sitemap.xml
    # global
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/cache/
    Disallow: /wp-content/themes/
    Disallow: /trackback/
    Disallow: /feed/
    Disallow: /comments/
    Disallow: /category/*/*
    Disallow: */trackback/
    Disallow: */feed/
    Disallow: */comments/
    Disallow: /*?*
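
    Since robots.txt is fetched per hostname, the usual approach is to serve a different file on the dev/old hosts, or to send a noindex header from their server config; a sketch of both (the Header line needs mod_headers):

      # robots.txt served only at old.domain.com and dev.domain.com
      User-agent: *
      Disallow: /

      # or, in those subdomains' vhost/.htaccess:
      Header set X-Robots-Tag "noindex, nofollow"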

    | JohnW-UK
    0

  • How can we communicate to Google the exact product pages we'd like indexed on our site? We're an apparel company that uses Shopify as our ecommerce platform. Website is sportiqe.com. Currently, Google is indexing all types of different pages on our site. Example of a product page we want indexed:
    sportiqe.com/products/PRODUCT-TITLE
    Examples of product pages being indexed:
    sportiqe.myshopify.com/products/PRODUCT-TITLE
    sportiqe.com/collections/COLLECTION-NAME/products/PRODUCT-TITLE
    See attached for an example of how two different "Boston Celtics Grateful Dead" shirts are being indexed. Any suggestions? We've used both Shopify and Google Webmaster Tools to set our preferred domain (sportiqe.com). We also added this snippet of code to our site three months ago, thinking that would do the trick... {% if template == 'product' %}{% if collection %} {% endif %}{% endif %}
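
    For reference, the canonical fix commonly shared for theme.liquid pairs that template check with a link element along these lines (a sketch; product.url yields the collection-free /products/... path):

      {% if template == 'product' %}
        <link rel="canonical" href="{{ shop.url }}{{ product.url }}" />
      {% endif %}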

    | farmiloe
    0

  • Hi there, I have the following domains: OLD DOMAIN: domain1.co.uk NEW DOMAIN: domain2.co.uk I need to create a .htaccess file that 301 redirects specific, individual pages on domain1.co.uk to domain2.co.uk I've searched for hours to try and find a solution, but I can't find anything that will do what I need. The pages on domain1.co.uk are all kinds of filenames and extensions, but they will be redirected to a Wordpress website that has a clean folder structure. Some example URL's to be redirected from the old website: http://www.domain1.co.uk/charitypage.php?charity=357 http://www.domain1.co.uk/adopt.php http://www.domain1.co.uk/register/?type=2 These will need to be redirected to the following URL types on the new domain: http://www.domain2.co.uk/charities/ http://www.domain2.co.uk/adopt/ http://www.domain2.co.uk/register/ I would also like a blanket/catch-all redirect from anything else on www.domain1.co.uk to the homepage of www.domain2.co.uk if there isn't a specific individual redirect in place. I'm literally tearing my hair out with this, so any help would be greatly appreciated! Thanks
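
    A sketch of how that can be wired up in domain1.co.uk's .htaccess; query strings can't be matched inside RewriteRule itself, hence the RewriteCond lines, and the trailing ? on each target strips the old query string:

      RewriteEngine On

      # /charitypage.php?charity=357 -> /charities/
      RewriteCond %{QUERY_STRING} (^|&)charity=357(&|$)
      RewriteRule ^charitypage\.php$ http://www.domain2.co.uk/charities/? [R=301,L]

      # /adopt.php -> /adopt/
      RewriteRule ^adopt\.php$ http://www.domain2.co.uk/adopt/? [R=301,L]

      # /register/?type=2 -> /register/
      RewriteCond %{QUERY_STRING} (^|&)type=2(&|$)
      RewriteRule ^register/?$ http://www.domain2.co.uk/register/? [R=301,L]

      # catch-all: anything without a specific rule above goes to the new homepage
      RewriteRule ^ http://www.domain2.co.uk/ [R=301,L]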

    | Townpages
    0

  • Hi, We are changing our URLs to be more SEO friendly. Is there any negative impact or pitfall of using the <base> HTML tag? Our developers are considering it as a possible solution for relative URLs inside the HTML markup in the friendly-URL context.
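
    A quick illustration of the main pitfall to test for: with <base> set, every relative reference, including fragment-only links, resolves against the base URL rather than the page's own URL (example.com is a placeholder):

      <head>
        <base href="http://www.example.com/friendly-folder/" />
        <!-- resolves to http://www.example.com/friendly-folder/css/site.css -->
        <link rel="stylesheet" href="css/site.css" />
      </head>
      <body>
        <!-- resolves against the base, not the URL the visitor is actually on -->
        <a href="details/">Details</a>
      </body>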

    | theLotter
    0

  • My agency just built a new website for a client who is a franchisee. It's not launched yet - it's currently under an IP address. I suggested to the client that he buy a keyword-rich domain name for it, which he did. Then he found out that the franchisor will not allow it to be his main domain name. They want him to use a domain name with the franchisor name in it. But they WILL allow him to put a 301 redirect on that franchisor-approved domain name, and redirect it to his keyword-rich domain name. He is interested in having my agency perform an SEO campaign for this new website. But would SEO and link marketing work for a website that has a new non-keyword domain name that 301 redirects to a new keyword-rich domain name?

    | netsites
    0
