
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • My website currently has some strong SEO, and I will be re-developing it on a WordPress platform, which will change many of the existing URLs. Will this affect the current pages that are well indexed in Google? Does using WordPress or changing the URL extension (.html to .php) make a difference? If I want to make a clean transition without affecting our existing SEO, what are some essential steps I need to take? Example: the current page is www.mydomain.com/name.html and the new URL would be www.mydomain.com/product/name.php. Thanks

    | Souk
    0
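
    A minimal .htaccess sketch (assuming Apache, which WordPress typically runs on) of the per-page 301 redirects such a migration needs; the URLs are the ones from the question:

        # Map a single old page to its new home:
        Redirect 301 /name.html http://www.mydomain.com/product/name.php

        # Or redirect every old .html page to its /product/ .php equivalent in one rule:
        RewriteEngine On
        RewriteRule ^([^/]+)\.html$ /product/$1.php [R=301,L]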

  • I'm trying to download all Google search results for my company via site:company.com. The limit I can get is 100; I tried using SEOquake but can only get to 100. What is the reason for this? I would like to see which pages are indexed: www pages and subdomain pages should only make up 7,000, but the search results show 23,000. I would like to see what the others in the 23,000 are. Any advice on how to go about this? I can individually check subdomains with site:www.company.com and site:static.company.com, but I don't know all the subdomains. Has anyone cracked this? I tried using a scraper tool but it was only able to retrieve 200.

    | Bio-RadAbs
    0

  • Hi, one of my clients is receiving the following error in the SERPs: "A description of the page is not available because of this site's robots.txt". The site is built on WordPress, and I realized that by default the settings were checked to block bots from crawling the site. So I turned that off, fixed robots.txt and submitted the sitemap again. It's been almost 10 days since then and the problem still exists. Can anyone tell me what should be done to fix it, or whether there's a way to get Google to recrawl the pages?

    | mayanksaxena
    0
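
    A sketch of the two robots.txt states involved, assuming the stock WordPress "discourage search engines" behaviour; once the block is lifted, Fetch as Google in Webmaster Tools can be used to nudge a recrawl:

        # What WordPress serves while "Discourage search engines" is enabled:
        User-agent: *
        Disallow: /

        # What the file should say once crawling is allowed again:
        User-agent: *
        Disallow: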

  • Hello all, A bit of a confusing one, but please bear with me... On our website we have a Used Cars section where each morning a feed is loaded onto our site with any changes to the stock. Some cars may have been sold and removed, some new cars may be added, and some prices may be changed; every morning this very large section of our website is updated. The question I have is: should I be including these URLs in my sitemap? The Used Cars section is a huge portion of our website content and is our most important area; the Used Cars overview is our most frequently visited page. The reason I ask is that Google might crawl and see car X, but tomorrow car X could be gone and replaced with car Y. Should I even be mentioning these pages to Google if by tomorrow some of those URLs could be gone? It's always changing and it's something we don't have control over. Thanks!

    | HB17
    0
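
    If the stock URLs do go in the sitemap, regenerating it in the same morning job that loads the feed keeps it truthful. A hypothetical entry (domain and path invented for illustration):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/used-cars/car-x</loc>
            <lastmod>2015-02-09</lastmod>
            <changefreq>daily</changefreq>
          </url>
        </urlset>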

  • Hello all, I have just been approached by a website owner whose site isn't mobile friendly in any way, and they've seen a significant fall-off in traffic since 23 Jan. The backlink profile is clean (and no link building has been undertaken), and nothing else has changed. More than half their traffic comes via mobile devices, they've lost a good 1/3 of their traffic, and drilling deeper, it's their organic traffic that's been hit. Is anybody else seeing similar? Edit, for reference: https://www.davidnaylor.co.uk/google-released-mobile-algorithm-think.html

    | McTaggart
    0

  • Hi SEO gurus, I have a question: how can Google AdWords impact SEO rankings? Is the impact positive, negative or neutral? I would appreciate a detailed answer. Thank you for your time. webdeal

    | Webdeal
    0

  • I'm doing SEO on a website, zing.co.nz, for a company that is soon to launch. At the moment there is a splash site up, which will be replaced by the real site in a few weeks upon launch. Is it worth me putting in schema markup (for the first time) so that it is recognized as an organization? Will this affect us in the SERPs? Thanks for your help 🙂

    | Startupfactory
    0
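
    A minimal JSON-LD sketch of Organization markup for the splash page; the logo path and social profile URLs are placeholders to adapt:

        <script type="application/ld+json">
        {
          "@context": "http://schema.org",
          "@type": "Organization",
          "name": "Zing",
          "url": "http://www.zing.co.nz/",
          "logo": "http://www.zing.co.nz/logo.png",
          "sameAs": [
            "https://www.facebook.com/example",
            "https://twitter.com/example"
          ]
        }
        </script>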

  • I came in this morning to a swath of email updates from Moz. Our site had jumped in ranking in the four geographic regions we target and we were seeing (more or less) the best results we've ever had. Most of the jumps were only 1-4 places, but they were for our most competitive keywords. Late this afternoon I did some spot checks on keywords to confirm that we were still holding roughly the same positions before I reported this to the wider business. We're not. What's more, we have dropped off the first page for some terms and don't seem to be ranked at all for others. The only keywords we're getting good rankings for are branded search terms. I can only assume this is due to a technical issue our developers caused over the weekend. I'm not completely across this, but at some point our sitemaps stopped working and a huge number of links on the site were broken. Could a massive surge in 404s cause this? I'm checking Google Analytics and I can't see a drop in organic traffic yet, although I don't have the full figures for today. Thanks

    | ahyde
    0

  • I recently added some ajax calls that automatically fill in small areas of my site on page load; the user doesn't have to click anything. Therefore, when Google and Bing crawl the site, the ajax is executed too. However, my understanding is that this does not mean Google and Bing are also crawling the ajax content. I would actually prefer that the content be neither executed nor crawled by them. In Bing's case I would prefer that the content not even be executed, because indications are that the program exits the ajax page for Bing: Bing isn't retaining the session variables that page uses, which makes me concerned that when that happens Bing may not even be able to crawl the main content. So ajax execution seems potentially risky for normal crawling in this case. I would like to simply have my program skip the ajax execution for Google and Bing by recognizing them in the user agent, using an "if robot, skip ajax" approach. I assume I could put the ajax program in the robots.txt file, but that wouldn't keep Bing from executing it (and having the exit problem mentioned above); it would be simpler to just have them skip the ajax execution altogether. Is that OK, or is there a chance the search engines will penalize my site if they find out (somehow) that I have different logic for them than for actual users? In the past this surely was not a concern, but I understand that Google is increasingly trying to behave like a browser, so it may increasingly have a problem with this approach. Thoughts?

    | friendoffood
    0
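
    A sketch of the "if robot, skip ajax" idea in PHP (the function and script names are hypothetical; note that serving crawlers different behaviour is close to cloaking, so treat this with care):

        <?php
        // Rough user-agent test for the major crawlers mentioned in the question.
        function is_known_bot() {
            $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
            return (bool) preg_match('/googlebot|bingbot|msnbot/i', $ua);
        }

        // Only emit the ajax trigger for real visitors.
        if (!is_known_bot()) {
            echo '<script src="/js/autofill.js"></script>'; // hypothetical script
        }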

  • I am working on an ecommerce site and I am going to add different views to the category pages. The views will all have different URLs, so I would like to add the rel="canonical" tag to them. Should I still add these pages to the sitemap?

    | EcommerceSite
    0
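
    A sketch of the view pages' head, using hypothetical ?sort= URLs; the usual practice is to list only the canonical URL in the sitemap, not the variants that point at it:

        <!-- On /category?sort=price and /category?sort=name alike: -->
        <link rel="canonical" href="http://www.example.com/category/" />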

  • Hi, We are working on a new product information section for our network of websites (sites A, B, C and D), where product landing pages let visitors download information in PDF format and are active for downloads during a period of two months (the form is active for commercial reasons) with a unique URL (the case today). Here is a possible scenario for these product landing pages in the near future: the product is promoted on website A for 2 months (January to February), so the canonical URL = A/page; once the promotion expires, the download form disappears. The customer then decides to promote the same product on site A as well as on site B from April to May, so the canonical URL will still be A/page, and the canonical URL of B/page will point to A/page. The customer relaunches the promotion, this time on site C from July to August, so the canonical URLs of A/page and B/page will now point to C/page, as the latter will be the only product campaign active with a download form. At the end of the year the customer runs another campaign for the same product, this time on website D, so we will change the canonical URLs of A/page, B/page and C/page to D/page, again the only active campaign with a download form. The obvious question here: will changing canonical URLs dynamically like this hurt the SEO of the section, the pages, one particular website, or the whole network? Would it be better and safer to just keep the first canonical URL (A/page in this example) forever? Thanks so much for your input on this.

    | JulienLetellier
    0

  • Some people in the company I work for have suggested that we buy a keyword-rich domain that matches a new product line we're planning to release. I've advised that this in itself is not a good idea, as we'd need to produce high quality content for that site rather than just having it exist for ranking purposes. We already have a section on our main site focused on this product line, so I don't think having the keyword-match domain would really add anything unless we worked out what we'd use the site for. That said, I was wondering whether it might be worth buying the exact match domain anyway, in order to prevent a competitor from using it?

    | RG_SEO
    0

  • I check the "Links to my site" section in Google WMT on a regular basis. In the past couple of months I've been seeing more and more weird links from pretty spammy domains, and even a few from odd Iranian domains. Needless to say, I have never bought a link or been involved in any link schemes or the like. Like probably everyone on the Internet, I'm in a competitive vertical, and my competitors probably aren't so scrupulous. The question is: do I actively need to disavow suspicious links? Should I contact the domains and ask them to remove the links? I have usually just ignored these links and not wasted time doing anything with them (since weird automated links are always around), but the proliferation in the last couple of months has started to worry me.

    | Don34
    1
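
    For reference, Google's disavow file is plain text, one domain or URL per line, uploaded through the Disavow Links tool; the entries below are invented examples:

        # Spammy directories spotted in "Links to my site"
        domain:spammy-directory.example
        domain:weird-links.example.ir
        # A single page rather than a whole domain:
        http://blog.example.com/page-with-paid-link.html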

  • I realise that it is often better to put content in a subfolder rather than a subdomain, but I have another question that I cannot seem to find the answer to: is there any ranking benefit to having a site on its own .co.uk or .com domain rather than on a subdomain? I'm guessing that the subdomain might benefit from other content on the domain it's hosted on, but are subdomains weighted down in any way in the search results?

    | RG_SEO
    0

  • I just want to be 100% sure about using the canonical tag. I want to use it on pages that rank together with the front page in Google; I only want the front page to rank, alone, and to have the link juice from the other two pages directed to the front page. I hope you agree that this is the correct way to do so. Which one is correct: <link rel="canonical" href="http://www.testtest.com/"> or <link rel="canonical" href="http://www.testtest.com/" />?

    | seopeter29
    0

  • There are many, many broken links on this website: http://www.txacspecialist.com/air-conditioning-equipment-service-austin/american-standard/ What is the normal strategy to use for that? It's an AC site, so all the links to AC vendors who have changed their product pages are broken. For instance, the Carrier 20XL doesn't exist anymore; now they sell the Carrier 45ABP. We link to the Carrier 20XL, and now that page and AC model no longer exist. What can I do to solve the broken link issue?

    | bondhoward
    0

  • Hello Mozzers, I'd love to receive some advice for a client of mine, and any insights you may have regarding the pros and cons of changing your main domain to mask links. Within a competitive niche there are about four different sites that routinely rank 1-4. Our site crushes the other three on just about all metrics, except that we have a high volume of nofollow links, and our site remains at #4. Our site is much older, so we have significantly more links than these smaller sites, including pre-Penguin-penalty spammy links (like blog comments that make up 50+ nofollow links from one comment per domain). Obviously we are attempting to remove any toxic links and disavow; however, the blog comment nofollow links skew our anchor text ratio pretty intensely, and we are worried that we aren't going to make a dent in removing this type of link. Disavowing them alone hasn't worked, so if we are unable to remove the bulk of these poor quality links (nofollow, off-topic anchor text, etc.) we are considering 301 redirecting the current domain to a new one. We've seen success with this in a couple of scenarios, but wanted to get other insights as to whether masking links with a 301 could send fresh signals and positively affect rankings. I also wanted to mention that two of the three competitors that outrank us have EMDs for the primary keywords. I appreciate your time, insights, and advice on this matter.

    | Leadhub
    0
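
    The mechanical part of that move is a site-wide 301 on the old domain; whether it sheds the bad signals is doubtful, since redirects pass most link signals along. A sketch assuming Apache, with placeholder domains:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
        RewriteRule ^(.*)$ http://www.new-domain.example/$1 [R=301,L]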

  • Hello! The company that I work for has recently acquired two other companies. I was wondering what the best strategy would be as it relates to redirects / authority. Please help! Thanks

    | Colin.Accela
    0

  • We decided to redesign our site to make it responsive, as Google is ranking sites based on mobile friendliness. Along with this we have changed our URL structure, meta tags, page content, site navigation and internal linking. How stupid is it to launch this site right in the middle of record traffic? Our traffic is climbing by 10,000 more visitors every day with the current site; visitors have increased 34% over the last 30 days compared to the previous 30 days.

    | CFSSEO
    0

  • I'd be grateful if I could check my thinking... I've agreed to give some quick advice to a non-profit organisation which is in the process of moving its website from an ac.uk subdomain to a .co.uk domain. They believe that their SEO can be improved considerably by making this migration; from my experience, I don't see how this could be the case. Does the unique domain in itself offer enough ranking benefit to justify this approach? The subdomain is on a very high authority domain with many pre-existing links, which makes me even more nervous about this approach. Does anyone have any opinions on this that they could share, please? I'm guessing that it is possible to migrate safely and that there might be branding advantages, but that from an actual SEO point of view there is not much benefit? It looks like most of their current traffic is branded traffic.

    | RG_SEO
    0

  • Hi folks, I'm in the process of going over our corporate website with a view to improving on-page optimisation, layout, design and user experience, and I would like your feedback on what you think I should improve or change with respect to SEO. Some of my ideas include: restructuring the home page to better show our services; possibly adding a slider to the home page (I know engagement rates with these are generally low); restructuring the course pages completely (https://purplegriffon.com/courses/itil-training/itil-foundation-training/itil-foundation); restructuring the events pages completely (https://purplegriffon.com/event/2028/itil-foundation); improving and streamlining the booking process; ajaxifying the booking process; and improving responsive elements. I'm also interested in conducting user testing before I go ahead and make any changes. What are your thoughts? What would you change? Thanks. Gaz

    | PurpleGriffon
    0

  • Hi, 🙂 My company has a new distributor contract, and we are starting to sell products on our own webshop. Biotechnology is the industry in question, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to re-write them: with permission from our contractors, we will import their product descriptions into our webshop. But I am concerned about being penalised by Google for duplicate content. If we re-write the descriptions we should be fine, I guess, but how can we be sure? Is there any good tool for comparing only text (because I don't want to publish the pages just to compare URLs)? What else should we be aware of besides checking the product descriptions for duplicate content? Duplicate content is a big issue for all of us; I hope the answers here will be helpful for many of us. Keep up the hard work, and thank you very much for your answers. Cheers, Dusan

    | Chemometec
    0

  • I have a client and I'm working on a project with their development team. We're creating dynamic landing pages populated with their lead data, similar to the style of LinkedIn. For example, someone searches for "demand generation manager" and one of our pages with (some) contact information shows up in the search results, e.g. www.leadgenius.com/demandgenerationjobs (or something like that; we have yet to flesh this out). We're also looking into different backlinking strategies to support optimizing the above-mentioned pages, in addition to their new site launching this month. What is the best way to optimize the dynamic pages as well as the main site, in tandem or independently?

    | Intergen
    0

  • Hi all! I'm beginning to travel down the road of becoming an SEO expert! I've attended the latest few webinars on Moz and have started watching the Whiteboard Fridays. However, I'm wondering, for the current SEO experts: how did you get to where you are today? I.e., what books did you read? Did you pay for classes or just learn everything from Moz? Where is a good place to get an SEO expert certification, and is it necessary? How long did it take you to become an expert? (Stuff like that.) I suppose I'm looking to make a list for myself, organizing what I should learn first, and then create a timeline moving forward. Thanks for your help, Mozzers! - Briana B.

    | JCWhelan1
    3

  • We are currently working with a client that relaunched their website two months ago to serve secure (HTTPS) pages across their entire site architecture. The problem is that their non-secure (HTTP) pages are still accessible and being indexed in Google. Here are our concerns: 1. Are co-existing non-secure and secure pages (HTTP and HTTPS) considered duplicate content?
    2. If these pages are duplicate content, should we use 301 redirects or rel canonicals?
    3. If we go with rel canonicals, is it OK for a non-secure page to have a rel canonical pointing to the secure version? Thanks for the advice.

    | VanguardCommunications
    0
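
    Since the HTTP and HTTPS copies serve identical content, the common fix is a blanket 301 to the secure version (a canonical from HTTP to HTTPS is also valid, just weaker). An Apache sketch, assuming mod_rewrite is available:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]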

  • Hi! This is a follow-on question from my other post: http://moz.com/community/q/site-dropped-after-recovery. As mentioned there, I've had a manual penalty revoked for http://www.newyoubootcamp.com/. This came after the forum was hacked and some poor quality SEO was done. We've managed to clean up a large number of links, but ones such as http://about1.typepad.com/blog/2014/04/tweetdeck-to-launch-as-html5-web-app-now-accepting-beta-testers.html (the anchor is "microsoft") are still being found and indexed. My question is: although the forum now returns a 410, can these junk links still be causing harm? A huge number have been disavowed, and many others were taken down after a manual outreach campaign, but still others keep appearing. The site is performing poorly in search despite having a much better domain authority than its competitors, driven largely by great links from national newspapers, as well as solid user metrics such as a bounce rate of 30% and few on-site issues. This makes me think it must be the link profile. Any advice would be much appreciated. S

    | Blink-SEO
    0

  • Hi, I understand that 70-80% of links on the web carry the nofollow attribute. What I need to know is whether link building using guest posting, combined with nofollow links through social media, is still effective. What would you suggest in terms of link building? I have read all the articles on Moz and elsewhere, but I need a personal touch on this matter. Thanks, Andrei

    | kiraftw
    0

  • Hello all, On my ecommerce landing pages, I currently have links to my products as H3 tags. I also display useful guides on the page, with links to articles we have written (they currently go to my news section). I am wondering if I should make those article links additional H3 tags as well, for added SEO benefit, or do I have too many tags as it is? A link to the landing page I am talking about: http://goo.gl/h838RW. A screenshot of my H1-H6 tags: http://imgur.com/hLtX0n7. I enclose a screenshot of my guides and also of my H1-H6 tags. Any advice would be greatly appreciated. Thanks, Peter

    | PeteC12
    0

  • Hi everybody! I've been working for http://www.newyoubootcamp.com for some time now. They came to me after they had dropped heavily for their main term, "boot camp". This turned out to be due to a manual penalty, which was in part down to their forum being hacked, as well as some bad link building. Here's an example of the dodgy forum links: http://about1.typepad.com/blog/2014/04/tweetdeck-to-launch-as-html5-web-app-now-accepting-beta-testers.html (the anchor is "microsoft"). They all return a 410 now. We also cleaned up the other bad links as best we could, and got through the manual penalty. The site then returned to #5 for "boot camps", below its pre-crash peak of #2, but OK. Over the past few weeks, though, it has started to slide. I'm certain it is not down to a lack of quality links; this site has great PR and links from national newspapers and magazines. There have been a few on-site issues too, but nothing outrageous. I'm getting a bit stumped, and any fresh eyes would be much appreciated!

    | Blink-SEO
    0

  • My team sells sailboats and pontoon boats all over the country. While they are both boats, the target markets are two different types of people... I want to make a landing page for each state, so that if someone types in "pontoon boats for sale in Michigan" or "pontoon boats for sale in Tennessee", my website will come up. But I also want to come up if someone is searching for sailboats for sale in Michigan or Tennessee (or any other state, for that matter). So my question is: should I make one page per state that targets both pontoon boats and sailboats (a total of 50 landing pages), or two pages per state, one targeting pontoon boats and the other sailboats (a total of 100 landing pages)? My team has seen success targeting each state individually for a single keyword, but we have not had a situation like this come up before.

    | VanMaster
    0

  • Currently our corporate website and store website are under two domains: internationalcompany.com (DA 51; corporate website) and companystore.com (DA 34; US store website). We were hoping to piggyback on the corporate website's domain authority by moving our store to internationalcompany.com/store, and when we learned that couldn't happen, we opted for us.internationalcompany.com/store. The reason we are leaning towards us.internationalcompany.com is that we will likely be taking over the US branch of the corporate website, so we thought it better that the store be a sub-address of that. My main concerns: from what I have gathered, it seems that I can't do a change of address to a subdomain within Webmaster Tools; I'd have to have access to internationalcompany.com, which won't happen soon. So, is a 301 just as good in this case? As a subdomain, we won't actually reap the benefits of the domain authority of the parent domain, will we? Are we just as well off considering a new domain and asking that regional tags be established on the current internationalcompany.com so that its content does not interfere with our SEO efforts? This is a broad explanation of a complicated issue; please ask any question that may help clarify.

    | bearpaw
    0

  • Hello all, We are looking at our internal links, and most of them say "More" or "View All". The "More" anchor text links are usually positioned in the body content, as we only display a portion of the content and the user clicks "More" to see all of it. Should we be changing the "More" text to something more keyword/phrase friendly, i.e. "more information about carpet cleaning" or "more information on tool hire", or would that be deemed spammy? Thanks, Peter

    | PeteC12
    0

  • Hello all, I have two chained 301 redirects on some of my landing pages and am wondering if this will cause me serious issues. I first did 301 redirects across the whole website, as we redid our URL structure a couple of months ago. We also had location-specific landing pages on our categories, but due to thin/duplicate content we got rid of these by doing 301s back to the main category pages. We do have physical branches at these locations, but given that we didn't get much traffic for those specific categories at those locations, and the fact that we cannot write thousands of pages of unique content, we did 301s. Is this going to cause me issues? I would have thought that 301s drop out of the SERPs, so if this is an issue it would only be a temporary one? Or should I have 404'd the location category pages instead? Any advice greatly appreciated. Thanks, Peter

    | PeteC12
    0

  • Hi, I had to edit some of my URLs, but Google is still showing the old URLs in search results for certain keywords, which of course return 404s. Crawling with Screaming Frog reports 301s and 'page not found' and still shows the old URLs. Why is that? And do I need to re-index the pages with the new URLs? Is 'Fetch as Google' enough to do that, or is there any other advice? Thanks a lot; I hope this topic will help someone else too. Dusan

    | Chemometec
    0

  • Hello Moz community, Hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
    http://www.naturalworldsafaris.com/
    https://admin.naturalworldsafaris.com/ The page is the same, so we can't add a noindex or nofollow.
    I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a disallow for the subdomain, but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?

    | KateWaite
    0
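
    robots.txt is fetched per host, so a Disallow added to the www file will not cover the admin subdomain; the rules (or a header) have to be served from the subdomain itself. A sketch of both options, assuming the subdomain's Apache vhost config can be edited even where the CMS can't write files:

        # Served at https://admin.naturalworldsafaris.com/robots.txt only:
        User-agent: *
        Disallow: /

        # Or, in the subdomain's vhost (assumes mod_headers), which also
        # removes pages that are already in Google's index:
        Header set X-Robots-Tag "noindex, nofollow"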

  • Hi, For the domain http://www.homeadda-sobhaaspirationalhomes.in/ and the keyword "Sobha Aspirational Homes", we had ranked 5 in Google India continuously for the past 2 months. There has been a sudden drop in rank and we have been pushed down to rank 114. I do not understand why this happened. Regards, Mithun

    | mithungowda
    0

  • Hello fellow Mozzers, We have two websites for two similar brands at my place of employment; the two brands currently serve slightly different products, but could sit quite happily under one branded site. As part of a potential group merger into one sole brand, we will have to create one joined-up website which will then feature all our products. The newly merged site will also have more scope to allow us to expand our product range, whereas currently one brand is somewhat specific to a particular market due to its name. So as part of the merge, I have to consider the potential implications for our search traffic, as this is an integral part of our business. Brand A: older, more authoritative, great content, good organic positions (top 10 for pretty much all terms we favour). Brand B: younger, but with more marketing scope due to its name; still a good site with lots of content. Unfortunately Brand B has the longer potential lifespan, but is currently the less authoritative of the two sites we run: it has lower DA and PR according to my Moz Analytics, a lower number of quality links and less content. In order to give the Brand B website the boost that is needed, and in effect replace Brand A in the SERPs (which has great organic positions), I need to make sure all bases are covered in an action plan. So far this is what I have: transfer all existing Brand A web pages to the Brand B website; rel canonical all Brand A pages to point to the new Brand B pages; 301 redirect all pages on Brand A to Brand B during the transfer; once 301 redirects are in place, request that external sites repoint their links to the Brand B website; update XML sitemaps; update any content that mentions Brand A to now be Brand B; resubmit sitemaps to Webmaster Tools; update all social profiles; update all local search profiles and listings; and update all review sites with the new brand name (merging any that list both brands). On a supplementary note, for customer information we are looking to keep the older Brand A home page up for a short time to help people understand the transition, rather than doing a complete redirect, which could confuse and alienate our demographic. We will also look to send a mass email to roughly 400K people informing them of the move and how it affects them. I have no doubt there are some glaringly obvious additions; any further advice would be much appreciated. Hope you are all well. Tim

    | TimHolmes
    1

  • Hi, I am having some issues with country targeting of our sites. To give a brief background of our setup and web domains: we use Magento and have 7 connected ecommerce sites on that installation. 1. www.tidy-books.co.uk (UK) - main site. 2. www.tidy-books.com (US) - variations in copy but basically a duplicate of the UK site. 3. www.tidy-books.it (Italy) - fully translated by a native speaker, with its own country-based social media and content regularly updated/created. 4. www.tidy-books.fr (France) - fully translated by a native speaker, with its own country-based social media and content regularly updated/created. 5. www.tidy-books.de (Germany) - fully translated by a native speaker, with its own country-based social media and content regularly updated/created. 6. www.tidy-books.com.au (Australia) - duplicate of the UK site. 7. www.tidy-books.eu (rest of Europe) - duplicate of the UK site. I've added the country and language hreflang tags to all sites, we use cross-domain canonical URLs, and I've targeted the correct country where appropriate under international targeting in Google Webmaster Tools. Despite this, we are getting a number of issues which are driving me crazy as I try to work out why. The major one, for example: if you search google.it with an Italian IP for our brand name, Tidy Books, the .com site is shown first, then .co.uk, then all the other sites, with the correct site, www.tidy-books.it, following on page 3. The Italian site is the most extreme example, but the French and German sites also appear below the .com site. Surely this shouldn't be the case? The same problem happens with the .co.uk and .com sites: when searching google.co.uk for our keywords, the .com often comes up before the .co.uk, so it seems our sites are competing against each other, which again can't be right or good. The next problem lies in the errors we are getting in Google Webmaster Tools on all sites: "no return tags" in the international targeting section. Any advice or help would be very much appreciated. I've added some screenshots to help illustrate and am happy to provide extra details. Thanks

    | tidybooks
    1
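
    The "no return tags" error usually means page A references B in its hreflang set but B does not reference A back: every variant has to carry the complete set, itself included. A sketch for the home pages (the language-region codes are a reasonable guess, not confirmed by the question); note also that a cross-domain canonical on a variant tells Google to ignore that variant's hreflang, so the canonicals between the duplicated sites may be fighting the targeting:

        <link rel="alternate" hreflang="en-GB" href="http://www.tidy-books.co.uk/" />
        <link rel="alternate" hreflang="en-US" href="http://www.tidy-books.com/" />
        <link rel="alternate" hreflang="it-IT" href="http://www.tidy-books.it/" />
        <link rel="alternate" hreflang="fr-FR" href="http://www.tidy-books.fr/" />
        <link rel="alternate" hreflang="de-DE" href="http://www.tidy-books.de/" />
        <link rel="alternate" hreflang="en-AU" href="http://www.tidy-books.com.au/" />
        <link rel="alternate" hreflang="en" href="http://www.tidy-books.eu/" />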

  • Can we add any number of links from authority websites? For example: http://packforcity.com/save-money-on-a-trip-to-disneyland/ http://packforcity.com/how-to-save-money-when-traveling-to-disneyland-part-2/ http://packforcity.com/new-york-city-top-faqs-for-travels/ http://packforcity.com/what-to-wear-in-kyoto-in-february/

    | bondhoward
    0

  • I have been looking at GWT HTML Improvements on our new site and I am scratching my head over how to stop some elements of the website showing up as duplicates for meta descriptions and titles. For example, in the blog area the same description ("This blog is full of information and resources for you to implement; get more traffic, more leads an...") is repeated across /blog/, /blog/page/2/, /blog/page/3/, /blog/page/4/, /blog/page/6/ and /blog/page/9/. The pages have rel canonicals on them (using Yoast WordPress SEO) and I can't see a way of stopping the duplicate content. Can anyone suggest how to combat this, or is there nothing to worry about?

    | Cocoonfxmedia
    0
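
    For paginated archives, the usual companion to the canonical is rel="prev"/"next" markup, so Google treats the series as one unit and the duplicate-title warnings on /page/N/ become low-risk. A sketch of the head of a hypothetical middle page:

        <!-- In the <head> of /blog/page/3/ -->
        <link rel="prev" href="http://www.example.com/blog/page/2/" />
        <link rel="next" href="http://www.example.com/blog/page/4/" />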

  • Does Google count YouTube links in the video description?

    | bondhoward
    0

  • I remember reading some years ago that domains and pages that have smushed keywords, such as cheapbaseballs.com/redbaseball.html, could be identified by Google as "cheap baseballs" and "red base ball". Is this still correct?

    | CFSSEO
    0

  • Hi Mozzers, The very old theme on my client's site doesn't support the new WP menus, so the menu structure is built the old way, with parent and child pages. For example, the page at www.site.com/rent/ is top level, and the sub-pages are like www.site.com/rent/cars/ or www.site.com/rent/trucks/. There are also third-generation pages like www.site.com/rent/cars/fiat500/. Since I am changing the theme, I am going to use the new WP menu system, and I'd like to know if this parent/child URL structure may hurt link equity propagation. My question is: should I detach the child pages from the parents and, accordingly, change the URLs of the child pages from
    www.site.com/rent/cars/ to www.site.com/rent-cars/ (and of course 301 redirect them)? Or is it OK to leave the pages' URLs in the old parenting way? What is the best practice? Thank you very much! DoMiSoL Rossini

    | DoMiSoL
    0

  • Howdy Moz, We've recently bought a new domain and we're looking to change over to it. We're also wanting to change our permalink structure. Right now, it's a WordPress site that uses the post date in the URL. As an example: http://blog.mydomain.com/2015/01/09/my-blog-post/ We'd like to use mod_rewrite to change this, using regular expressions, to: http://newdomain.com/blog/my-blog-post/ Would this be an appropriate solution? RedirectMatch 301 /.*/.*/.*/(.*) /blog/$1

    | IanOBrien
    0
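
    The rule in the question appears to have lost its * quantifiers in transit, and a relative target would keep visitors on the old host rather than moving them to the new domain. A tighter sketch for the old blog's .htaccess (assuming Apache, with the domains from the question):

        # Match /YYYY/MM/DD/post-slug/ and send a single 301 hop to the new domain:
        RedirectMatch 301 ^/(\d{4})/(\d{2})/(\d{2})/(.*)$ http://newdomain.com/blog/$4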

  • My client has an old ecommerce website that is ranking high in Google. The website is not responsive for mobile devices. The client wants to create a responsive-design mobile version of the website and put it on a different URL, with a link on the current page pointing to the external mobile website. Is this approach OK or not? The reason the client does not want to change the design of the current website is that he does not have the budget to do so, and there are a lot of pages that would need to be moved to the new design. Any advice would be appreciated.

    | andypatalak
    0
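
    If the separate mobile URL route is taken, Google's documented pattern is a bidirectional annotation rather than just a visible link, so the two URLs are understood as one page (the URLs below are placeholders):

        <!-- On the desktop page: -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://m.example.com/page" />

        <!-- On the corresponding mobile page: -->
        <link rel="canonical" href="http://www.example.com/page" />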

  • My URL is: https://plentific.com/ Hi guys, About us: we are running an AngularJS SPA for property search.
    Being an SPA, an entirely JavaScript application, has proven to be an SEO nightmare, as you can imagine.
    We are currently implementing the AJAX crawling approach, serving an "escaped_fragment" version using PhantomJS.
    Unfortunately, pre-rendering of the pages takes some time, and even worse, on separate occasions the pre-rendering fails and the page appears to be empty. The problem: when I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index.
    Not lowered in the rankings, but totally dropped.
    Even the Google cache returns a 404. The questions: 1) Could this be because of serving an "escaped_fragment" version to the bots (bearing in mind it is identical to the user-visible one)? 2) Could using an API to get our results lead to us being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop? 3) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way? Thank you very much! Pavel Velinov
    SEO at Plentific.com

    | emre.kazan
    1
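
    For reference, pages with clean URLs (no #!) opt into the AJAX crawling scheme with a meta tag, which tells the crawler to fetch the ?_escaped_fragment_= snapshot. If the PhantomJS render sometimes returns an empty page, Google may be indexing and then discarding exactly those empty snapshots, so monitoring that the snapshot endpoint returns a 200 with full content is worth automating:

        <!-- On every pretty-URL page served by the SPA: -->
        <meta name="fragment" content="!">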

  • Hi there, A client is setting up a second website selling the same products, from a separate domain, with the same descriptions etc. The site will have a separate URL but will be administered from the same CMS. The only difference is that the new site carries only one brand instead of the several on the main site, e.g. the main site sells all plumbing brands, the second site just one brand. Your thoughts and advice on best practice would be much appreciated. Andy (Marz Ventures)

    | MarzVentures
    0

  • Hi guys. We've found Google is pulling the wrong information for our title tag and meta description. Instead of pulling the actual title tag, Google is pulling the menu name you click on to get to the page: "Bike Barcelona" instead of "Barcelona Bike Tours | ....". Also, instead of pulling the meta description we wrote, Google is using text from the page's copy. Any tips?

    | BarcelonaExperience
    0

  • Hi, I actually asked this a year and a half ago (with a slight variation) but didn't get any real response, and things do change over time. On my ecommerce website I have the main category pages with client-side filtering and sorting. As a result, the number of page views is lower than might be expected. Do you think having more page views is still a ranking factor? And if so, is it more important than user experience? Thanks

    | BeytzNet
    1

  • Hi, I have a strange problem. My website www.dealwithautism.com is just 2 months old and has 40+ high quality articles that are already beginning to see some organic traffic from Google, without any off-page SEO (link building, etc.). By quality articles I mean:
    1. Each article is 1500+ words of unique and highly relevant content with solid on-page SEO (images may be reused from Google Images); Moz page grader = A for most pages. 2. Pretty well structured (with a good number of internal links). 3. The entire site (all pages) is delivered over HTTPS, enforced with 301 redirects. 4. No malware or spammy backlinks. 5. NAP details and social signals available. 6. Already ranking top 10 in Google SERPs for long tail keywords. 7. According to Google Webmaster Tools, no crawl errors except for a few (fewer than 10) 404s. 8. Fully responsive, with all pages tagged as "Mobile Friendly" by Google. However, since day 1, Bing has not indexed a single page on my website (the XML sitemap has been in place since day 1) even though they are crawling the site. I recently raised an email ticket, and this was their response: "Upon checking, it appears that your site did not meet the standards set by Bing to get indexed the last time it was crawled. However, we will be looking further into this issue along with the Product Group to review the content of your website for re-evaluation. We currently do not have an ETA for the update but please be assured that we will get back to you as soon as they become available." Now, based on my previous experience, this could take months. Here are a few sample pages from the website: https://www.dealwithautism.com/oppositional-defiant-disorder-treatment-and-odd-case-study/ https://www.dealwithautism.com/tourette-syndrome-symptoms-treatment-for-tourettes/ https://www.dealwithautism.com/autism-test-for-toddlers/ I believe the quality of these pages is quite good for a small new website.
    So what does Bing mean by the website "not meeting standards"? Am I missing a piece of the puzzle? I would have thought that Google was more quality-focused than Bing, but my SEO performance in Google is currently exceeding my expectations. Can you experts please help me out here?

    | DealWithAutism
    0
