
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I've had a successful online business for the past 8 years, and my website was consistently on the front page of Google for 5 of those years, for almost all of my long-tail keywords (that's an old term, huh?) as well as my top-choice vanity keywords. However, with the last update and Google's brand push, I've been beaten up pretty badly, falling to 3rd-, 4th- and 5th-page results. Maybe I could have saved those rankings (disavowed some links, improved speed), but I had a really happy and horrific event in my life: the birth of my first child. Happy because I've wanted a family since I was 13; horrific because she's been ill, and it's only my wife and I taking care of her. I literally haven't sat in front of my computer for the past 3 months except to do quick searches on medicines, treatments, etc.
    Now that I have some freedom to get back to work, and our major selling season is over (with a massive drop in sales), I'm considering re-branding with a new URL (our reputation suffered a lot too: late shipments, long customer-service waits). The current URL is a partial-match domain covering the two categories of products I sell - thisproductandthatproduct.com. However, I want to drop one of those categories because that side of the business is no longer a profitable market, while continuing to sell the other. I own productname.net and an old-fashioned iproductname.com. I know it's possible to rank .nets, but since I'm in online retail, .coms usually have more success. I think I can brand iproductname.com successfully, but the "i" in the domain name makes it feel kind of cliché and dated. The exact-match .net is really, REALLY tempting, because it represents a search that's done about 250k times per month. But it's a .net, and it might be more of a struggle since Google seems to be cracking down on EMDs, which is frustrating because my site is a retail source, not a content or affiliate traffic site.
    And that's not the end of my predicament: do I start from scratch, or redirect? The current URL is ten years old, with about a PR4 according to SEOBook's browser tool. It's still within the first 100 results, which may or may not mean it still has some authority. I was thinking about redirecting the category pages to the new URL and keeping the old one as an informational page for old customers, with a link to the new site, then redirecting it fully later on. Any suggestions would be really helpful.

    | rumblepup
    0
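
For the partial-redirect approach described in the question above (moving only the surviving product category to the new domain while keeping the old site up for existing customers), a minimal Apache sketch might look like the following. The paths are placeholders, not the poster's actual URLs, and it assumes mod_rewrite is available.

```
# .htaccess on the old domain (assumes mod_rewrite is enabled)
RewriteEngine On

# Permanently redirect the surviving category's pages to the new domain,
# preserving the rest of the path.
RewriteRule ^product-category/(.*)$ http://www.productname.net/product-category/$1 [R=301,L]

# Everything else stays put for now (informational page for old customers)
# and can be caught with a site-wide rule later.
```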

  • Hi friends, I have sitelinks when people search for my brand. Can I also get sitelinks for a non-branded keyword for which my site has been in the top Google position for more than a year?

    | vahidafshari45
    0

  • Hi guys, I found a service called seomaximus.com and want to know if anyone has used it and whether it's any good.

    | adulter
    0

  • Hello! Has anyone used Data Highlighter? I've had colleagues mention a jump in CTR after using the Data Highlighter on pages. I thought I'd do the same and went into my Webmaster Tools, but I've hit a brick wall. Whenever I try to highlight a product page, my country selector pops up and I'm unable to highlight the page. A colleague of mine suggested bypassing the country selector for Google based on user agent, which would let me reach the product pages. But if I bypass it for Google, wouldn't that affect Google Analytics, indexing, etc.?

    | Bio-RadAbs
    0

  • I am curious if anyone can share some advice. I am working on planning architecture for a tour company. The key piece of the content strategy will be providing details on each of the tour destinations, with associated profiles for each city within those destinations. Lots of content, which should be great for the SEO strategy. With regards to the architecture, I have a ‘destinations’ section on the website where users can access each of the key destinations served by the tour company. My question is: from a planning perspective I can organize my folder structure in a few different ways, for example:
    http://www.companyurl.com/destinations/touring-regions/cities/
    or
    http://www.companyurl.com/destinations/
    http://www.companyurl.com/touring-regionA/
    http://www.companyurl.com/touring-regionB/cities-profile/
    I am curious if anyone has an opinion on what might perform best in terms of site structure from an SEO perspective. My fear is taking all of this rich content and placing it so many tiers down in the architecture of the site. Any advice that could be offered would be appreciated. Thanks.

    | VERBInteractive
    0

  • Hi there. My client uses a CMS/e-commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO friendly" feature, in the sense that it creates hundreds of new indexed pages to send to search engines that reflect various terminology used by existing visitors of the site. In many cases these pages have proven to outperform our optimized static pages, but there are multiple issues with them: the CMS does not allow us to add any static content to these pages (titles, headers, metas, or copy on the page); the query typed in by the site visitor always becomes part of the title tag / meta description on Google; and if the customer's internal search query contains any less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots and lots of ugly terminology floating around on Google that we can't affect. I am scared to do a blanket de-indexation of all /search/ results pages, because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal is to really move up our static pages in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages - but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!

    | FPD_NYC
    0
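
On the gradual de-indexation idea mentioned in the question above, a page-level approach (rather than a blanket robots.txt block) is a robots meta tag on the internal search results template. This is only a sketch, and it presumes the CMS allows template-level changes, which the poster says may not be the case:

```
<!-- On the internal search results template only -->
<!-- "noindex, follow" asks engines to drop the page from the index while
     still letting crawlers follow the product links it contains -->
<meta name="robots" content="noindex, follow">
```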

  • Hi guys, I've been working on an e-commerce site for a while now. Let me sum it up: the new site launched in February; due to a lack of resources we only started 301s of the old URLs in March; we added rel=canonical at the end of May because of huge index numbers (the developers forgot!!); and we added noindex and robots.txt rules on at least 1,000 URLs. Index numbers have gone down from 105,000 to 55,000 for now, see screenshot (the actual number of URLs in the sitemap is 13,000). Yet when I do site:domain.com there are still old URLs in the index, even though there has been a 301 on them since March! I know this can take a while, but I wonder how I can speed this up or whether I'm doing something wrong. Hope anyone can help, because I simply don't know how the old URLs can still be in the index.

    | ssiebn7
    0
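
A quick sanity check for the situation above is to confirm that the old URLs really return a single 301 hop and are not also blocked in robots.txt (a robots.txt block stops Googlebot from recrawling a URL, so it never gets to see the 301 or the noindex). A sketch with curl, using a placeholder URL:

```
# Check the status code and Location header of an old URL
curl -I http://www.example.com/old-url

# Expected first lines of the response:
# HTTP/1.1 301 Moved Permanently
# Location: http://www.example.com/new-url

# Also confirm the old paths are not disallowed here:
curl http://www.example.com/robots.txt
```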

  • I know having rel="canonical" for each page on my website is not a bad practice... but how necessary is it for pages that don't have any external links pointing to them? I have my own opinions on this, to be fair - but I'd love to get a consensus before I start trying to customize which URLs have/don't have it included. Thank you.

    | Netrepid
    0
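
For reference on the question above, a self-referencing canonical is a single line in the head of each page; whether it is worth customizing per URL is exactly the judgment call being asked about. A sketch with a placeholder URL:

```
<!-- In the <head> of http://www.example.com/some-page/ -->
<link rel="canonical" href="http://www.example.com/some-page/">
```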

  • Hi, I have rich snippet markup showing in SERPs for some pages but not others. All pages test fine using Google's structured data testing tool. What's really annoying is that the snippets appear for some pages but not others within the same directory / page format. None of Google's troubleshooting suggestions on the issue are a problem, i.e.: Does your markup follow our usage guidelines? Is your marked-up content hidden from users? Is your markup incorrect or misleading? Is your marked-up content representative of the main content of the page? Have you supplied enough information? Have you only recently updated your content? Does your markup include incorrect nesting? Reviews: does your review use count instead of vote? There are a lot of instances where the same markup is used twice, e.g. on product X's page in one directory and on product X's page in a different directory (there's no dupe content). I wondered if that could be a reason, but there are a lot of instances where product X in directory A has the snippet when it doesn't in directory B. There doesn't seem to be an identifiable pattern as to why one page would show the snippet and not another. Any feedback appreciated. Happy to PM example pages. Andy

    | AndyMacLean
    0

  • Has anyone submitted pages that generate sitemaps on the fly as opposed to only submitting static XML files to Bing? For instance, sitemap.php vs sitemap.xml, video sitemap.php vs videositemap.xml?

    | alhallinan
    0
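
To Bing (and Google) a dynamically generated sitemap is indistinguishable from a static file, as long as the URL returns valid sitemap XML with an XML Content-Type. A minimal sitemap.php sketch; get_page_urls() is a hypothetical stand-in for whatever data source the real site uses:

```
<?php
// sitemap.php - emits a standard XML sitemap on the fly.
header('Content-Type: application/xml; charset=utf-8');

// Hypothetical helper: replace with your own database query or loop.
$urls = get_page_urls();

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($urls as $url) {
    echo "  <url>\n";
    echo '    <loc>' . htmlspecialchars($url) . "</loc>\n";
    echo "  </url>\n";
}
echo '</urlset>';
```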

  • Weird - my rankings have fallen on one of my sites, and all the main pages that were ranking are now PR unranked. We did add some domains to a disavow file, but nothing major that should have had an impact like this. The pages are all allowed to be indexed, are not blocked, and are currently indexed in Google. Can Google just come around and decide they don't like these pages anymore and PR-unrank them all?

    | irvingw
    0

  • Hi guys, this week the security certificate of our website expired, and we basically have to wait until next Tuesday for it to be re-instated. Our website is indexed with the https URLs, and we had to drop https from the site so that people would not be faced with the security-risk screen most browsers show, asking if you are sure you want to visit the site, because it's seen as untrusted. So now we are basically sitting with only the www (http) URLs. My question: what should we do to prevent Google from penalizing us, since obviously if Googlebot comes to crawl the https URLs there will be nothing there? I did, however, re-submit the site to Google to crawl, but I guess it's going to take time before Google picks up that we now only want the www URLs in the index. Can somebody please give me some advice on this? Thanks, Dave

    | daveza
    0

  • Buongiorno from a cloudy, 16°C Wetherby, UK. A client has cloned their UK site's copy for a US version. What they've now got is a USA site and a UK site with exactly the same copy; the only difference is the suffix. Am I right in saying this will cause problems when, for example, a searcher enters a phrase and both sites appear in the SERPs? Is a solution to block the USA site from appearing in the UK (is this possible?)? Yes, I know the true fix is to change the copy, but we are dealing with clients here 😉 Grazie,
    David

    | Nightwing
    0
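
One technique often used for same-language, different-country site pairs like the UK/US case above (not mentioned in the post itself) is rel="alternate" hreflang annotations, which tell Google which version to show users in each country rather than leaving it to treat the pages purely as duplicates. A sketch with placeholder domains:

```
<!-- Placed on both the UK and US versions of the equivalent page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page/">
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page/">
```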

  • Hi all, we have been a server and website monitoring company for over 13 years, and I dare say our product has evolved and matured over the years. Our marketing, not so much. Most of our best-converting traffic came from the keyword "ping test" via our ping test tool page, and for the first 10 years we were positioned 1-3 in Google.com, so it was all good. For the last two years we have been steady at positions 8-9, and since 7-30-13 we are on the second page. We launched a blog in 2009 at http://www.websitepulse.com/blog and post 2-3 times a week, and we are working on a new website now. My question is: what is your advice in our situation? Aside from providing fresh content and launching a new website, is there anything specific we could do at this stage to improve our position for "ping test"? Thanks, Lily

    | wspwsp
    0

  • Hi All, I have a client with a sizeable international manufacturing operation who we've managed to get up to a DA of 40 over time. However, things seem to have levelled out, and I'm not sure how to mix it up to get the numbers back on the rise. We create regular blog and social content, run press releases bi-weekly, optimize on-page content and stay on top of all technical issues. What else can we do?? Any suggestions are greatly appreciated, Thanks.

    | G2W
    0

  • I posted a link to Rand's recent Moz Blog in another forum. One of the users posted a link to this article as a counter point. Thoughts? [title edited by staff for clarity]

    | AWCthreads
    2

  • I am looking to have an SEO specialist audit and consult on one of my sites. This website never received a manual penalty from Google but was hit algorithmically, and I need to bring it back up strong in the SERPs. Who do you recommend from Moz's "recommended list"? Cheers 🙂

    | mbulox
    0

  • Hey guys, quick question I didn't find an answer to online. Scenario: 1. Site A links to Site B. It's a natural, regular, follow-link 2. Site A joins Site B's affiliate program, and adds an affiliate link Question: Does the first, regular follow link get devalued by the second affiliate link? Cheers!

    | ipancake
    0

  • Since fixing a major duplicate content issue in Dec 2012, our traffic saw steady and dramatic growth through the summer. It has started to settle down now and rankings have started to decline. I have a bad feeling that something is wrong but for the life of me I cannot figure it out. Our Moz rank is pretty high but some of our biggest keywords have gone from #2 to #9 in a fairly short time period which makes me very nervous. I am not even sure where to look at this point. Any suggestions are welcomed as I am kind of at a loss now but worry I am missing something basic. www.threeguysgolf.com

    | astaelin
    0

  • Hello, I'm faced with a decision which requires some feedback from experts like you. My client has a business which has inconsistent citation submissions out the ying-yang. For example's sake, their business is called "brand all types body repair", where "brand" is the parent company who owns my client's company, and "all types body repair" is my client's company. In most of their existing citations the phone number and address are consistent, but some citations contain the name "brand all types body repair" and others just "all types body repair". The name used in their website's NAP is "all types body repair"; however, the name used on the larger directories like the BBB and Yellow Pages is "brand all types body repair". Also, since they are registered with the BBB as this, "brand all types body repair" is their legal business name. Because of that, my thought is to ensure every citation uses it as the name. Three questions come out of this: (1) I'm not really changing the name, only adding a brand word in front of the existing name - will this have less of an effect than changing the name completely? (2) If I change the name on their website from "all types body repair" to "brand all types body repair", will that have a negative effect on search rankings if some of the citations are already using "brand all types body repair"? (3) What would you do about the name issue?

    | reidsteven75
    1

  • Penguin 2.0 was a great update for one of my biggest clients. A website that was using terrible black-hat techniques and ranked first for the most important keyword in my client's niche got kicked from the SERPs, and my client jumped from 4th to 1st. The jump in traffic was enormous, and on top of that 5% of the traffic converted instead of the usual 2.5%-3% on other traffic. Until July 2nd. Traffic from the keyword dropped by 80% while we were still in position 1. After a lot of digging I thought I had found what caused it: Google booted the keyword from their autocomplete. My question is whether anyone has seen a removal from autocomplete making that big of a difference in search volume.

    | Laurensvda
    1

  • The company I work for is expanding their business to new territories. I've got a lot of stabilization to do in the region/state where we're one of the most well-known companies of our kind. Currently, we have 3 distinct product lines which are distinguished by 3 separate URLs. This is affecting the user flow of our site, so we'd like to clean it up before launching our products into the various regions. The business has decided to grow into 5 new states (one state consisting of one county only) — none of which will feature all 3 products. Our home-base state is the only one that will have all 3 products this year. My initial thought was to use subdomains to separate out the regions; that way we could use a canonical tag to stabilize the root domain (which would feature home-state content, and support content for all regions) and keep us clear of potential duplicate-content penalization. Our product content will be nearly identical across the regions for the first year. I second-guessed myself by thinking that it was perhaps better to use a "[product].root/region" URL instead. And I'm currently stuck wondering whether it would be better to build out subdomains for both products and regions, using one modifier or the other as a funnel/branding page into the other. For instance, the user lands on "region.root.com" and sees exactly what products we offer in that region - basically a tailored landing page - while the bulk of the product content would actually live under "product.root.com/region/page". My head is spinning. While searching for similar questions I also bumped into a reference to another tag meant to be used in some cases similar to mine. I feel like there are a lot of risks involved in this subdomain strategy, but I also can't help but see the benefits for the user flow.

    | taylor.craig
    0
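
Since the regional product content described above will be nearly identical for the first year, one way to express the "canonical tag to stabilize the root" idea from the question is a cross-subdomain canonical from each regional copy to the primary version of that page. This is only a sketch with invented hostnames; the right target depends on which of the structures under discussion is actually chosen:

```
<!-- On a near-duplicate regional copy, e.g. http://region-b.root.com/product-a/ -->
<link rel="canonical" href="http://www.root.com/product-a/">
```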

  • Hi all! The company that I work for owns two very strong domains in the information security industry, and there is a section on each site that draws a ton of long-tail SEO traffic. On our corporate site we have a vulnerability database where people search for vulnerabilities to research and find out how to remediate them. On our other website we have an exploit database where people can look up exploits in order to see how to patch an attacker's attack path. We are going to move these into a super database under our corporate domain, and I want to ensure that we maintain the traffic or minimize the loss. The exploit database, which is currently on our other domain, yields about three quarters of the traffic to that domain. It is obviously OK if that traffic goes directly to the new subdomain. What are my options to keep our search traffic steady for this content? There are thousands and thousands of these vulnerabilities and exploits, so it would not make sense to 301 redirect all of them. What are some other options, and what would you do?

    | PatBausemer
    0
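
On the "thousands of redirects" concern in the question above: 301s do not have to be created one by one. If the old exploit URLs follow a pattern, a single rewrite rule on the old domain can map all of them to their counterparts in the new super database. A hypothetical Apache sketch (the path pattern and target hostname are invented for illustration):

```
# On the old domain (assumes mod_rewrite)
RewriteEngine On

# Map every old exploit detail page to the new subdomain, reusing the
# exploit identifier captured from the old URL.
RewriteRule ^exploits/([0-9]+)/?$ http://db.corporate-domain.com/exploits/$1 [R=301,L]
```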

  • I have a website that has around 90k pages indexed, but after doing the math I realized that I only have around 20-30k pages that are actually high quality, the rest are paginated pages from search results within my website. Every time someone searches a term on my site, that term would get its own page, which would include all of the relevant posts that are associated with that search term/tag. My site had around 20k different search terms, all being indexed.  I have paused new search terms from being indexed, but what I want to know is if the best route would be to 404 all of the useless paginated pages from the search term pages. And if so, how many should I remove at one time? There must be 40-50k paginated pages and I am curious to know what would be the best bet from an SEO standpoint. All feedback is greatly appreciated. Thanks.

    | WebServiceConsulting.com
    0
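
If the decision for the question above is to drop the low-quality search-term pages, they do not need to be removed by hand: a URL-pattern rule can return 410 Gone for all of them at once, or an X-Robots-Tag header can de-index them while keeping them reachable for users. A hypothetical Apache sketch; the /search/ path is an assumption about the URL structure, not taken from the post:

```
# Option A: tell crawlers these pages are gone for good (requires mod_alias)
# RedirectMatch gone ^/search/

# Option B: keep the pages for users but ask engines to de-index them
# (requires mod_setenvif and mod_headers)
SetEnvIf Request_URI "^/search/" NOINDEX_PAGE
Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE
```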

  • WMT shows a significant drop in structured data markup on June 7th, then a steep incline by June 21st. Now the same thing has happened on August 9th, with no signs of recovery, and we have lost 45% of our search traffic. There are many people with the same problem, and nobody seems to know what caused it. Here are a few links to some forums: #1 Google Groups, #2 Google Groups, #3 Google Groups, #4 70% drop on GWT on June 7 (Google SEO News and Discussion forum at WebmasterWorld). On our end we see a 100% drop in breadcrumbs and a 100% drop in hCards, leading to a 45% search traffic drop. Any ideas why this might have happened and how to fix it?

    | PhilippGreitsch
    0

  • Hello, I have a smartphone site (e.g. m.abc.com). To my understanding we do not need a mobile sitemap, as it's not a traditional mobile site. Should I add the mobile site's links to my regular www XML sitemap, or not bother adding them, since we already have rel=canonical in place (on m.abc.com) and rel=alternate (on the www site) pointing to the respective pages? Please suggest a solution. I really look forward to an answer, as I haven't found the "official" answer to this question anywhere.

    | AdobeVAS
    0
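
For the separate-mobile-URL setup described above, the bidirectional annotations the poster already has in place look like the sketch below; Google's separate-URLs guidance has generally been to list the desktop (canonical) URLs in the sitemap and express the mobile versions as alternates rather than as separate entries. Placeholder URLs throughout:

```
<!-- On the desktop page, e.g. http://www.abc.com/page-1 -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.abc.com/page-1">

<!-- On the mobile page, e.g. http://m.abc.com/page-1 -->
<link rel="canonical" href="http://www.abc.com/page-1">
```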

  • I purchased a domain, xxx.ru, and want to add my new website to Yandex Webmaster Tools. However, when I try adding my site I get a message saying "this site is a mirror of yyy.ru" (the old website that used to have my new domain and has since moved to this new domain instead of the one I purchased from them). After getting this message I am not able to add my website. Any advice?

    | theLotter
    0

  • I have optimised a homepage for two keywords. I did this a few weeks ago and the page has been crawled by Google; before this it was already reasonably well optimised for these terms. However, the homepage is not appearing in Google for these terms. Instead, two other random pages on the site are appearing for them - pages that have not been optimised for these keywords and have few mentions of them!? These pages have a lower DA and fewer inbound links than the homepage. The homepage is showing for other lower-competition keywords. Could anyone offer me some insight into this? The homepage content has been posted on other websites by a former SEO consultant - to a business directory, for one. Could duplicate content be causing this problem?

    | absolutely17
    0

  • Hi, I manage the SEO of a branded poker website that provides very good ongoing content around specific poker tournaments, but all this content is split into dozens of pages in different sections of the website (blog section, news section, tournament section, promotion section). It seems like today one deep piece of content on a single page has a better chance of getting mentions / social signals / links, and therefore higher authority / rankings / traffic, than the same content split into dozens of pages. But the poker website I work for, and many other websites, naturally generate good content targeting long-tail keywords around a specific topic in different sections of the site on an ongoing basis. Do we need to merge those content pages into one page once in a while? If yes, what technical implementation would you advise? (Copy and restructure all the content into one page, then 301 the old URLs to it?) Thanks, Jeremy

    | Tit
    0

  • We have an e-commerce website with about 4,500 products for sale. About 1,200 of these items were not showing up in Google PLA ads because they were $0 items, so we made those products invisible and then set 301 redirects for each of the 1,200 items. My question is this: we now want to turn the 1,200 items back on - should we delete the 301 redirects that are in place for them? Will keeping them hurt SEO performance?

    | Goriilla
    0

  • Hi, I'm looking for any research on exit popups (shown when a visitor is about to leave the site) and their impact from an SEO perspective. Regards, Mike

    | MBASydney
    0

  • What options do we have for keyword research now that Google is switching from the Google AdWords Keyword Tool to the Keyword Planner??

    | alhallinan
    0

  • Hi, I'm building a new e-commerce site and I'm conflicted about what to do on my category pages. Take a computer store, for example:
    I have a category of laptops, and inside there are filters by brand (Samsung, HP, etc.). I have two options - either have the brand choice open a dedicated page,
    i.e. Samsung-Laptops.aspx, or simply do a jQuery filter, which gives a better and faster user experience (immediate, animated and with no refresh). Which should I use (or does it depend on the keyword it might target)?
    Samsung laptops / Dell laptops / HP laptops are great keywords on their own! By the way, splitting Laptops.aspx into many physical sub-category pages might also help by providing the site with many actual pages dealing with laptops altogether.

    | BeytzNet
    0
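
A middle ground for the filter question above is progressive enhancement: the brand filters are real links to the dedicated pages (so Samsung-Laptops.aspx and friends exist for search engines to rank), and a script intercepts the clicks to do the instant jQuery-style filtering for users. A sketch of the markup side, with invented paths:

```
<!-- Crawlable brand filter links on Laptops.aspx; JavaScript can hook these
     anchors and filter the product grid in place instead of reloading. -->
<ul class="brand-filter">
  <li><a href="/Samsung-Laptops.aspx">Samsung laptops</a></li>
  <li><a href="/HP-Laptops.aspx">HP laptops</a></li>
  <li><a href="/Dell-Laptops.aspx">Dell laptops</a></li>
</ul>
```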

  • Not sure if anyone has ever had this problem. We have a client who is a UK based retailer with a large retail presence in Canada and a U.S site as well. For the past year while keeping track of their rankings, they steadily ranked #1 for their brand term on Google.CA. At the end of June we implemented a GEO IP redirect for U.S visitors to be redirected to the U.S site if they clicked on the .CA listing. Over the next two weeks the ranking for the single branded keyword went from #1 to completely off the top 50. Could this have possibly happened due to the GEO IP redirect? The .CO.UK site has always been top 3 in the organic listing and is still #1 but in Google.ca the Canadian site has dropped off completely after consistently ranking #1.

    | demacmedia
    0

  • Hi all, just a quick question... A shady domain linking to my website is indexed in Google as both example.com and www.example.com. If I want to disavow the entire domain, do I need to submit both: domain:www.example.com and domain:example.com, or just domain:example.com? Cheers!

    | Carlos-R
    0
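
For reference on the question above, the disavow file is plain text with one directive per line and # for comments. A domain: line is commonly understood to cover the www host as well, but listing both does no harm if you want to be certain. A sketch using the placeholder domain from the post:

```
# disavow.txt - one directive per line, lines starting with # are comments
# Disavow every link from the shady domain (typically also covers www.example.com)
domain:example.com

# Adding the www host explicitly as well is harmless if you want to be safe
domain:www.example.com
```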

  • I'm a little confused about the correct area to place an H1 tag. When I look at WordPress templates and published WordPress sites, they suggest placing the H1 tag within the header area. However, SEO companies and other well-positioned sites place the H1 tag at the start of the main content area. What is the correct and/or best practice for placing H1 page title tags? Thanks, Mark

    | Mark_Ch
    0
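
The header-vs-main-content distinction in the question above is mostly about what the H1 contains rather than where it sits in the markup: a common pattern is to keep the site name/logo out of the H1 on inner pages and use the H1 for that page's own topic at the top of the main content. A minimal sketch:

```
<body>
  <header>
    <!-- Site name / logo: usually a link, not the page's H1 (on inner pages) -->
    <a href="/" class="site-logo">Example Site</a>
  </header>

  <main>
    <!-- One H1 describing what this particular page is about -->
    <h1>Blue Widget Buying Guide</h1>
    <p>Page content...</p>
  </main>
</body>
```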

  • Issue: images in some articles are too large for mobile devices and cannot be shrunk responsively; fixing this should also help reduce page size and improve site speed on small-screen devices. I am thinking of switching based on the user agent (e.g. iPhone / Android devices) and serving up an optimised, reduced-size image. I envisage this working in the background, i.e. hidden from authors, so it is easy. Platform: WordPress. I would like a solution or some feedback on people's experiences with this problem. I haven't found any good plugins that can handle this, so it would probably need to be custom coded, ideally with no processing overhead unless the image is generated upon publication of the article. Thanks peeps, Keith H

    | Greywood
    0
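
Since the platform above is WordPress, one low-overhead sketch of the user-agent approach is to pick a smaller registered image size when wp_is_mobile() matches a phone/tablet user agent. This is only a sketch: it covers template-driven images such as featured images, not arbitrary images inside post content, and user-agent-dependent output needs care if full-page caching is in use.

```
<?php
// In a theme template, inside the Loop: serve a smaller registered image
// size to mobile user agents. wp_is_mobile() is a core WordPress function.
$image_size = wp_is_mobile() ? 'medium' : 'large';

if ( has_post_thumbnail() ) {
    the_post_thumbnail( $image_size );
}
```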

  • I have a client that is a small but growing fashion brand in the UK; they make sunglasses, eyewear and swimsuits. They are priced as a high-end brand, so around £200 for a pair of sunglasses. They have asked me about 'sorting out their SEO', but I am struggling to think of viable keywords we could target that don't contain their brand name. The SERPs for anything along the lines of 'buy sunglasses / buy swimsuits' etc. are dominated by very big players - big department stores etc. - so with their small budget I'm sure they would be impossible to crack. My thought is that apart from sorting out their on-page SEO (crawlability, sitemap etc.) and making sure they rank for their branded terms, putting money into trying to rank for generic terms around buying swimwear/sunglasses would not be viable. A better route for traffic generation would be a more content-marketing / social-media approach to get people sharing their content (e.g. fashion industry commentary) and leading them back to the brand from there. What do others think? Am I missing a trick on the SEO front? Thanks.

    | nicandm
    0

  • Hi guys, does anyone have an example of a site using schema.org Organization logo markup (as per: http://googlewebmastercentral.blogspot.com.au/2013/05/using-schemaorg-markup-for-organization.html), with the logo actually appearing in Google SERPs? One of the designers is pressing me for an example. I've found plenty of brands getting their logo in the SERPs' knowledge results, but they have all been using G+ verified company profiles, or other methods (Google simply selecting a best fit?) to achieve it. Thx!

    | David_ODonnell
    0
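
For reference, the markup described in the Google post linked above is just schema.org Organization with url and logo properties; whether the logo actually surfaces in a SERP is up to Google, which is part of why live examples are hard to point to. A sketch with a placeholder domain:

```
<div itemscope itemtype="http://schema.org/Organization">
  <a itemprop="url" href="http://www.example.com/">Home</a>
  <img itemprop="logo" src="http://www.example.com/images/logo.png" alt="Example Inc. logo">
</div>
```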

  • Basically, I have no inbound links going to https://www.mysite.com, but Google is indexing the homepage only as https://www.mysite.com. In June I was re-included in the Google index after receiving a penalty. Most of my site links recovered fairly well, but my homepage did not recover for its top keywords. Today I noticed that when I search for my site, it's displayed as https://. Robots.txt blocks all content on any secure page, leaving me sort of clueless about what I need to do to fix this. Not only does it pose a problem for some users who click, but I think it's causing the homepage to have an indexing problem. Any ideas? Redirect Googlebot only? Will a canonical tag fix this? Thx

    | Southbay_Carnivorous_Plants
    0
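
If the HTTP version is meant to be the indexed one in the situation above, the usual mechanism is a sitewide 301 from HTTPS to HTTP, plus unblocking the secure URLs in robots.txt so Googlebot can actually crawl them and see that redirect (or a canonical tag). A hypothetical Apache sketch, using the placeholder hostname from the post:

```
# Assumes mod_rewrite, applied to requests arriving over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```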

  • I'm trying to go through my site and add microdata with the help of Google's Structured Data Markup Helper. I have a few questions that I have not been able to find an answer for. Here is the URL I am referring to: http://www.howlatthemoon.com/locations/location-chicago My company is a bar/club, with only 4 out of 13 locations serving food. Would you mark this up as a local business or a restaurant? It asks for "URL" above the ratings. Is this supposed to be the URL that ratings are on like Yelp or something? Or is it the URL for the page? Either way, neither of those URLs are on the page so I can't select them. If it is for Yelp should I link to it? How do I add reviews? Do they have to be on the page? If I make a group of days for Day of the Week for Opening hours, such as Mon-Thu, will that work out? I have events on this page. However, when I tried to do the markup for just the event it told me to use  itemscope itemtype="http://schema.org/Event" on the body tag of the page. That is just a small part of the page, I'm not sure why I would put the event tag on the whole body? Any other tips would be much appreciated. Thanks!

    | howlusa
    0
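
On the LocalBusiness-vs-Restaurant part of the question above: schema.org also has a BarOrPub type (a subtype of LocalBusiness via FoodEstablishment), and opening hours can be grouped by day range with the openingHours property. A hand-coded microdata sketch with placeholder address and phone details, as an alternative to the Markup Helper's point-and-click flow:

```
<div itemscope itemtype="http://schema.org/BarOrPub">
  <h1 itemprop="name">Howl at the Moon Chicago</h1>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St</span>,
    <span itemprop="addressLocality">Chicago</span>,
    <span itemprop="addressRegion">IL</span>
  </div>
  <span itemprop="telephone">(312) 555-0100</span>
  <!-- Grouped day ranges are allowed, e.g. Monday-Thursday evening hours -->
  <meta itemprop="openingHours" content="Mo-Th 19:00-02:00">
  <meta itemprop="openingHours" content="Fr-Sa 17:00-02:00">
</div>
```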

  • Hi, I came across a super domain for my business, but found out that while it is a great domain with hundreds of linkbacks, it is now banned by the Google search engine, meaning Google does not index content from that domain. Since the domain's linkbacks are in my domain, does it make sense to buy that domain and 301 redirect those linkbacks to another domain, hoping the new domain gets some juice? I know it sounds crazy and may not be the most ethical thing to do, but I still wanted to check if it's possible to get some juice. Rgds, Avinash

    | Avinashmb
    0

  • My client has a pretty creative idea for his web copy.  In the body of his page there will be a big block of text that contains random industry related terms but within that he will bold and colorize certain words that create a coherent sentence.  Something to the effect of "cut through the noise with a marketing team that gets results".  Get it? So if you were to read the paragraph word-for-word it would make no sense at all.  It's basically a bunch of random words.  He's worried this will affect his SEO and appear to be keyword stuffing to Google. My question is: Is there a way to block certain text on a webpage from search engines but show them to users?  I guess it would be the opposite of cloaking?  But it's still cloaking...isn't it? In the end we'll probably just make the block of text an image instead but I was just wondering if anyone has any creative solutions. Thanks!

    | TheOceanAgency
    0

  • I am currently struggling with the decision of whether to create individual e-commerce sites for each of 3 consumer product segments, or to integrate them all under one umbrella domain. Obviously integration under 1 domain makes link building easier, but I am not sure how much Google favors, in rankings, websites focused on a single topic/product segment. The product segments are moderately competitive. They are not directly related, but there may be some overlap in customer demographics. Any thoughts?

    | lcourse
    1

  • I am merging 11 real estate community sites into 1 regional site and don't really know what type of redirect I should use for the homepages. For instance: www.homepage.com redirects to www.regionalsite.com/community-page. Should this be a 301 redirect? If yes, how can I 301 redirect a homepage to an internal page on my new site? Cheers 🙂

    | mbulox
    0
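
A 301 is generally the right type when a community site is permanently merging into the regional site, and redirecting a homepage to an internal page is just a rule that matches the empty path. A hypothetical Apache sketch using the placeholder domains from the question:

```
# .htaccess at the root of www.homepage.com (assumes mod_rewrite)
RewriteEngine On

# Send the old community homepage to its page on the regional site
RewriteRule ^$ http://www.regionalsite.com/community-page [R=301,L]

# Optionally, catch every other old URL as well (or map them individually)
# RewriteRule ^(.*)$ http://www.regionalsite.com/community-page [R=301,L]
```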

  • If we mention words such as "supplement", or a lesser word like "cosmetic", do they trigger anything on Google's side, such as a more thorough look at the website? (I can see "sex" etc. being one.) I am not selling the above, but we do sell stickers for them (not the sex ones), so I'm just erring on the side of caution about whether to have a page on the site talking about them.

    | BobAnderson
    0

  • I am working on a multilingual blog built in WordPress. From the first day I have seen the URL structure getting garbled when I add an article in another language.
    The following is an example of such a URL:
    http://muslim-academy.com/%D9%81%D8%B6%D9%84-%D9%82%D8%B1%D8%A7%D8%A1%D8%A9-%D8%A7%D9%84%D9%82%D8%B1%D8%A2%D9%86-3/ Is there some plugin to fix it, or some manual change I can make?

    | csfarnsworth
    0

  • Hello, I'm dealing with some issues. Moz analysis is telling me that I have duplicate content on some of my product pages. My situation: the pages concern very similar IT products from the same range, and only the name and the PDF are different. Do you think I should use a canonical URL? Or would it be better to rewrite about 80 descriptions (even though the descriptions would end up almost the same)? Best regards.

    | AymanH
    0

  • I'm working on a site that is built on DNN. For some reason the client has set all pages to convert to HTTPS (although this is not perfect, as some don't when landing on them). All pages indexed in Google are straight HTTP, but when you click on a Google result, many of them return a temporary 302 redirect to the corresponding HTTPS page. I want this changed to a 301, but unfortunately that is an issue for DNN. Is there another way around this in IIS that won't break DNN, as it seems to be a bit flaky? I want the homepage link juice to pass through for all links made to the non-HTTPS homepage. Removing HTTPS does not seem to be an option for them.

    | MickEdwards
    0
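
If the IIS URL Rewrite module is available and DNN's own redirect cannot be changed, a server-level permanent redirect can be defined in web.config instead of relying on the application. This is a hypothetical sketch worth testing carefully, since DNN also manipulates URLs:

```
<!-- web.config: requires the IIS URL Rewrite module -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="HTTP to HTTPS (permanent)" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="^OFF$" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```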

  • I am building a new site for a client and we're discussing their inventory section. What I would like to accomplish is have all their products load on scroll (or swipe on mobile). I have seen suggestions to load all content in the background at once, and show it as they swipe, lazy loading the product images. This will work fine for the user, but what about how GoogleBot mobile crawls the page? Will it simulate swiping? Will it load every product at once, killing page load times b/c of all of the images it must load at once? What are considered SEO best practices when loading inventory using this technique. I worry about this b/c it's possible for 2,000+ results to be returned, and I don't want GoogleBot to try and load all those results at once (with their product thumbnail images). And I know you will say to break those products up into categories, etc. But I want the "swipe for more" experience. 99.9% of our users will click a category or filter the results, but if someone wants to swipe through all 2,000 items on the main inventory landing page, they can. I would rather have this option than "Page 1 of 350". I like option #4 in this question, but not sure how Google will handle it. http://ux.stackexchange.com/questions/7268/iphone-mobile-web-pagination-vs-load-more-vs-scrolling?rq=1 I asked Matt Cutts to answer this, if you want to upvote this question. 🙂
    https://www.google.com/moderator/#11/e=adbf4&u=CAIQwYCMnI6opfkj

    | nbyloff
    0
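
For the infinite-scroll inventory described above, the usual crawler-friendly variant is the "load more" option the poster already likes: the button is a real link to a paginated URL, so crawlers follow plain pagination while JavaScript intercepts the click and lazy-loads the next batch in place. A sketch of the markup side, with hypothetical URLs:

```
<!-- In the <head> of /inventory/?page=2 : standard pagination hints -->
<link rel="prev" href="/inventory/?page=1">
<link rel="next" href="/inventory/?page=3">

<!-- At the bottom of the product grid: a real link crawlers can follow.
     JavaScript intercepts it to load the next batch without a page reload. -->
<a class="load-more" href="/inventory/?page=2">View more inventory</a>
```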


