
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Guys, First question here: after splitting our content across 2 subdomains (~6 months ago) we've noticed Google showing several of our pages on page 1. Would it be better to somehow consolidate to just one page (in the hopes that together it would push the rank higher), or is it better left to Google to work out on its own? I've attached an example of this happening with one of our targeted keywords.

    | mattjamesaus
    0

  • Hello, We have developed our e-commerce site in Magento and we are launching our own blog. Currently we are using an aheadWorks blog extension, but I was wondering if it is better for SEO to use a WordPress extension. What do you think? Thank you!!

    | DoitWiser
    0

  • Hi Guys, I am wondering if anybody can point me to a recent, trusted report or study on international domain name structure. I am looking to read up on the SEO considerations and recommendations for the different domain structures, in particular using sub-directories, i.e. domain.com/uk, domain.com/fr. Kind regards,
    Cian

    | WeAreContinuum
    1

  • Hi all, I have been researching the best way to approach back link building, and I would like to ask a few questions before I start. Which of these tools would you recommend for back link diagnostics: www.linkrisk.com or www.linkdetox.com? What would be the best procedure for beginning to create healthy back links? Would looking at my competitors' back links help me? What would be the recommended number of back links created per week? Also, how many blog entries should we aim to create per week? The website I'm working on is manvanlondon.co.uk. If you guys have any further suggestions please let me know. Many thanks for your time.

    | monicapopa
    0

  • We have a URL structure question: because we have websites in multiple countries and in multiple languages, we need to add additional elements to our URL structure. Of the two following options, which would be better for SEO? Option 1: www.abccompany.com/abc-ca-en/home.htm Option 2: www.abccompany.com/home.abc.ca.en.htm

    | northwoods-260342
    0

  • I have a website that I am trying to move up the SERPs. However, the site isn't appearing in search results at all - even when I search for the business name. The site is http://www.jl-engineering.com/ and the keyword targeted is DPF Cleaning. Could anyone explain why the site isn't showing at all - and how to fix this? Thanks

    | SWD.Advertising
    0

  • I have a previously very strong ranking page that is now omitted from the SERPs, but only for one specific keyword phrase. I think I found the reason, which I'll explain, and I hope I can hear some confirmation of my theory and a way to correct it. Let's use the following made-up domain and keywords: Political blog SiteA.com had a few news articles about "Blue Widgets" (like 10 out of 10,000 pages). They became exceedingly popular, so on SiteA.com we created a reference-type page about "Blue Widgets", and in the news articles we already had about Blue Widgets we added keyword-rich anchor text (Blue Widgets) links pointing to this new About Blue Widgets page (long before we wised up about keyword-rich anchor text and Google!). After seeing how much traffic was coming to the About Blue Widgets page, we created a whole new site, SiteB.com, which was about Widgets (not just Blue Widgets), with a page for each color of widget and other pages about widgets. SiteB.com has an important and popular page, SiteB.com/blue-widgets, which is about Blue Widgets. We then 301 redirected SiteA.com's About Blue Widgets page to SiteB.com/blue-widgets. This page on SiteB.com ranked very high (like #2, #3) for years. Two weeks ago SiteB.com/blue-widgets fell out of the SERPs, but only for the phrase "Blue Widgets". The page still gets lots of traffic from other queries, and even the "Blue Widgets" query will bring up other pages on SiteB.com. So, the only thing hit is the specific query "Blue Widgets" for the specific page SiteB.com/blue-widgets. It seems obvious to me that Google took issue with the combination of a) a site it probably no longer likes (SiteA.com has gone downhill since we sold it), b) the keyword-rich anchor text on SiteA.com pages pointing to the SiteA.com page optimized for that keyword, and c) that page then being 301 redirected to a SiteB.com Blue Widgets page optimized for that same anchor text. I only discovered the SiteA.com redirects last week, which I had completely forgotten about, and had them removed right away. My questions are: 1) if this indeed was the issue, now that the redirects from SiteA.com to SiteB.com are gone, will my ranking eventually go back to normal? and 2) is there anything I can do to get Google to notice the change and have it go back to how it was?

    | bizzer
    0

  • Hi All, I recently relaunched a new design on my tool hire eCommerce website, and I now display my products in grid form on my category landing pages, as opposed to the list view we previously had on the old design. My bounce rates are a lot higher than they used to be and my gut instinct is telling me maybe this is wrong. I want to do some A/B testing using a list view. My question is: previously in our list views we just showed the images and pricing and had on-page content at the bottom of the page. The user would click on the product image and would then be taken to the product page, which has the product description, T&Cs, etc. If I was to do this in my A/B testing but change it so we also displayed the product descriptions on the category landing pages, is there a special way to do this? In effect we would have duplicate content, as the product descriptions are also on the product page. Does anyone have any thoughts on whether it's a no-no from an SEO point of view? Here's a short URL link to one of my category pages - http://goo.gl/QJv5gw Historically we used to rank well for the category landing pages and not for the product pages. Our rankings are down and bounce rates are higher, so I am trying to sort both. We have good content on pages etc. Any advice greatly appreciated as always, thanks Pete

    | PeteC12
    0

  • Hi, We all know that Google doesn't like slow loading pages, fair enough! However, for one of my websites, user interactivity is key to its success. Each of my pages is fairly large (in the range of 1.8 to 2.5 MB) because it has a lot of pictures, CSS and at times some JavaScript elements. However, I have tried to ensure that the code is optimized - for example HTML minified and compressed, caching enabled, images optimized and served through a CDN, etc. In spite of the high page size, my GTmetrix PageSpeed score is 93+ for most pages. However, the number of requests served is 100+ and page loading time is 4.5s+ according to GTmetrix and Pingdom. My question is: should this matter from an SEO perspective? Is Google likely to penalize me for high loading time even though I am serving highly optimized pages? I really don't want to cut down on the interactivity of my website unless I have to from an SEO perspective. Please advise. Here is my homepage, just to give you an idea of what I am talking about: www.dealwithautism.com

    | ashishb01
    0

  • Hi Everyone, I'm currently looking to optimise an inner page of a website as opposed to the homepage itself. I was wondering if I should stick to some kind of link distribution? For instance, say my website is about widgets and the URL is http://www.widgets.com, and I want to optimise for a much easier "blue widgets" term on an inner page with the URL http://www.widgets.com/blue-widgets. Does Google discriminate against a website with a higher number of links pointing to an inner page than to the homepage? If so, what would you recommend as a safe distribution between the two? Your thoughts would be greatly appreciated, Peter.

    | RoyalBlueCoffee
    0

  • Anyone else seeing this?  Or is it a test rather than full rollout?

    | Net66SEO
    0

  • Hi all, I have a question about internal linking and canonical tags. I'm working on an ecommerce website which has migrated platform (Shopify to Magento) and the website design has been updated to a whole new look. With the switch to Magento, the developers have managed to change the internal linking structure to product pages. The old set-up was that category pages (on URLs domain.com/collections/brand-name) for each brand would link to products via the following URL format: domain.com/products/product-name. This product URL was the preferential version that duplicate product pages generated by Shopify would have their canonical tags pointing to. This set-up was working fine. What's happened now is that the category pages have been changed to link to products via dynamically generated URLs based on the user journey. So products are now linked to via the following URLs: domain.com/collection/brand-name/product-name. These new product pages have canonical tags pointing back to the original preferential URLs (domain.com/products/product-name) - see the sketch below. But this means that the preferential URLs for products are now NOT linked to anywhere on the website apart from within canonical tags and within the website's sitemap. I'm correct in thinking that this definitely isn't a good thing, right? I've actually noticed Google starting to index the non-preferential versions of the product pages in addition to the preferential versions, so it looks like Google is perhaps ignoring the canonical tags, as there are so many internal links pointing to non-preferential pages and no on-site links to the actual preferential pages. I've recommended to the developers that they change this back to how it was, where the preferential product pages (domain.com/products/product-name) were linked to from collection pages. I would just like clarification from the Moz community that this is the right call to make. Since the migration to the new website & platform we've seen a decrease in search traffic, despite all redirects being set up, so I feel that technical issues like this can't be doing the website any favours at all. If anyone could help out and let me know if what I suggested is correct then that would be excellent. Thank you!

    | Guy_OTS
    0
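
    For reference, a minimal sketch of the canonical relationship described above, using the question's own placeholder URLs; this would sit in the head of the dynamically generated page:

        <!-- In the <head> of domain.com/collection/brand-name/product-name -->
        <link rel="canonical" href="http://domain.com/products/product-name" />

    Worth remembering that rel="canonical" is a hint rather than a directive, so when nearly all internal links point at the non-preferential URLs, Google can choose to ignore it - which matches the behaviour described.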

  • We have several country-specific sites set up as folders of our main domain. We use hreflang tags to get the relevant site served in each country (the standard markup for this is sketched below), with mixed success: in about 60% of searches the US site appears. Beginning in late October our rankings for the US have slowly, but reasonably steadily, dropped. Each week we'll see 10 or so keywords rise and 15-30 keywords drop, generally by 1-3 places. This is much more movement than we were seeing in the months prior to this. Is this a result of Penguin, or just coincidence? The US subfolder is the only one which has seen drops overall; the rest have actually improved slightly during this time period. I would expect any impact due to Penguin to affect the whole domain? I've been checking through our backlinks and we do have a handful of bad links, along with 100 or so which look a little odd and not completely relevant. I have contacted sites and had some of these removed, and created a disavow list with the worst of the rest. I haven't asked for the site to be reconsidered yet. We haven't had any message in Webmaster Tools re: bad links or similar. Cheers.

    | ahyde
    0
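
    For context, a minimal hreflang set for country subfolders (hypothetical URLs, since the question doesn't name the domain). Each variant's head lists every variant, including itself:

        <link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
        <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />

    If the return tags aren't reciprocal on every page, Google ignores the annotations, which is a common cause of the wrong country version appearing.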

  • Hi guys, Our brand site (http://urban3d.net) has seen a steady decline due to algorithm updates over the past two years. Our previous SEO company engaged in some black-hat link building which has hurt us very badly. We have recently re-launched the site with better design and better content, and completed a disavow of hundreds of bad links. The site is technically indexed, but is still nowhere in the SERPs after months of work to recover it by our internal marketing team. The last SEO company also told us to build EMD sites for our core services, which we did: http://3dvisualisation.co.uk/ http://propertybrochure.com/ http://kitchencgi.com/ My question is - could these EMD sites now be hurting us even further and stopping our main brand site from ranking? Our plan is to rescue our brand site, with a view to retiring these outlier sites. However, with no progress on the brand site, we can't afford to remove these sites (which are ranking). It seems a bit chicken and egg. Any advice would be very much appreciated. Aidan, Urban 3D

    | aidancass
    0

  • I've noticed an issue with our site, which uses parallax on multiple pages. Here is an example: if you search for About SQL Sentry, you get the correct title tag and description. If you search for SQL Sentry Careers (which is on the About Us page), it appears that Google has made up a title tag and description for it. Is there any way to force a separate title tag for a part of a parallax page?

    | Sika22
    0

  • I recently noticed that Domain Authority dropped by 5 or 6 points for us, for our competitors, and for many other pages that I check regularly. Was the algorithm changed, or is it just me?

    | FCRMediaLietuva
    0

  • Hi Guys, I've been trying like mad to get Rich Snippets (star ratings, price, availability) to show for product page results for www.evo.com. What's very strange is that when I test my product page URLs in the Structured Data Testing Tool, the previews look great and the extracted data is what I'd expect. I am, however, getting "missing price" errors for every item in the Structured Data report in Webmaster Tools - which seems contradictory to what the testing tool shows. The error description says this can prevent Rich Snippets from showing. If anyone here could take a look at the schema.org/Offer markup on one of our product pages and see if they can spot anything wrong with our price markup, I would greatly appreciate it! The plot thickens... What's even stranger is that when I submit product page URLs (that have reviews) to Google's index using Fetch as Googlebot, the Rich Snippets appear - but then disappear sometime in the following day(s). The only thing I can think of is that somehow my Merchant Center feed (which contains "product_review_count" and "product_review_average"), which runs nightly, is 'breaking' the Rich Snippets that are generated after the page has been crawled. Any advice is greatly appreciated!
    Will

    | evoNick
    0

  • We recently implemented schema.org/Product on our site (www.evo.com). In the Google Webmaster Tools Structured Data report we're getting lots of errors: http://screencast.com/t/Z3QJBctjUvP which I believe is preventing our rich snippets (price, availability, ratings) from showing in search results. When I click into the "Product" data type on the Structured Data report I see that there are 2 errors: missing price, and missing best or worst rating: http://screencast.com/t/SuHVYFLFO5D We are adding the itemprop="bestRating" code, which should take care of the 'missing best or worst rating' error. The missing price error is what I want to ask about. There are a couple of strange things here (using this URL as an example: http://www.evo.com/skis/line-sir-francis-bacon.aspx - which has been indexed since the code was added): 1) The Webmaster Tools report is finding the schema.org/Offer data type and is recognizing its InStock and OutOfStock properties: http://screencast.com/t/xtHouzeL37q BUT price is not being detected. 2) When I enter the URL into the Structured Data Testing Tool it does detect price: https://www.google.com/webmasters/tools/richsnippets?url=http://www.evo.com/skis/line-sir-francis-bacon.aspx 3) When I fetch the page as Googlebot, itemprop="price" is present: http://screencast.com/t/Hnqda95N My hunch is that the reason our Rich Snippets are not showing is the "price" error. The "?" by the error in WMT says: "This property is missing in the html markup or was not properly highlighted in the Data Highlighter. This can prevent the rich snippet from appearing." Does anyone have an idea why we're getting the "price" error - or anything else that could prevent our Rich Snippets from displaying? (A sketch of the expected markup is below.) Thanks so much!

    | evoNick
    0
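
    For comparison, a minimal schema.org/Product microdata sketch with price marked up (hypothetical values - not evo.com's actual markup):

        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Line Sir Francis Bacon Skis</span>
          <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
            $<span itemprop="price">799.95</span>
            <meta itemprop="priceCurrency" content="USD" />
            <link itemprop="availability" href="http://schema.org/InStock" />
          </div>
        </div>

    One pattern worth ruling out: markup that varies between crawls - e.g. a price injected by JavaScript or omitted for sale/out-of-stock states - since the WMT report reflects crawled snapshots while the testing tool fetches the page fresh.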

  • Hi, I have a site which shows on page 25 in Google's SERPs for the main brand keyword, which is also the URL (it's a .com). As far as I can see it has no penalties, and it has unique content. The keyword itself has no competition and the site should be #1 in Google for it. Our site domain is 11 years old.

    | MoneySite
    0

  • What to do with posts that should be pages? One of our clients is a nationwide company with different local pages, targeting city + business. This week we found out that every page shows a publishing date in the search results - so the pages are not pages but posts? The publishing dates are from some time ago and we think they hurt the rankings. We want to turn those posts into pages, but does this bring any risk? For example: plumbing.com/chicago staying on the same URL, with just the underlying code changed? The website is built on something called "send".

    | remkoallertz
    0

  • I've seen multiple questions about this, but there are a few different answers on ways to approach it, so I figured I'd ask about our situation specifically. Any advice would be appreciated. We formed a new company with a new name/domain while at the same time buying an existing company in our industry. The domain and site of the company we acquired is ranking for some valuable keywords and still getting a significant amount of traffic (about half of what our new site is getting). A big downside has been that when they moved that site to a different server, something happened that made the site un-editable, so it's full of bad pricing and information. Because of that, we've had a maintenance page up for a little while: the site was generating calls to our sales team (GOOD) but customers were seeing incredibly incorrect information (BAD). Rather than correcting those issues or figuring out why the site is un-editable, we just want to find a way to leverage that traffic and have those visitors end up at our new site. Would we 301 redirect the entire domain to our new one (a sketch of this is below)? If we did that, would the old domain still keep the majority of its PageRank?

    | HuskyCargo
    1
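
    If the old site runs on Apache, the whole-domain 301 is a short .htaccess sketch (placeholder domains; IIS and other servers have equivalents):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-company\.com$ [NC]
        RewriteRule ^(.*)$ http://www.new-company.com/$1 [R=301,L]

    The $1 keeps the path, so page-to-page mapping is preserved wherever the paths line up; redirecting everything to the new homepage instead is generally thought to pass less value.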

  • Hello Guys, I have 2 blogs with content. One is getting a lot of visitors and the other gets a lot less. I'm thinking of transferring all the content from the "weak" blog to the "strong" blog. Both websites are on WordPress. My question is pretty simple: how can I transfer this content without losing traffic, and how can I avoid duplicate content? What are the best SEO practices? Thanks!

    | Kalitenko2014
    0

  • My Google search rankings are improving rapidly at the moment, but a lot of my rankings are for images (I presume that means the images are appearing near the top in Google Images). How do I capitalise on that? It's not really much help to me that my images are popular unless it results in traffic to the pages where those images are used. I am running WordPress, so I have the option to have images embed as "no link", "link to attachment page", "link to original image", etc. Is there any advantage to using one of these over the others? I'd really like to set it up so that when a Google Images user clicks "View Image" it loads the attachment page or the host content page rather than the raw image. Bad SEO? I'm not sure whether the fact that I'm using the Jetpack Photon CDN for image hosting makes this more complicated or not. Tony

    | Gavin.Atkinson
    0

  • Hi Moz Community, We have the following robots.txt rule that should prevent URLs with tracking parameters from being indexed: Disallow: /*? However, we have noticed Google has started indexing pages that are using tracking parameters. Example below. http://www.oakfurnitureland.co.uk/furniture/original-rustic-solid-oak-4-drawer-storage-coffee-table/1149.html http://www.oakfurnitureland.co.uk/furniture/original-rustic-solid-oak-4-drawer-storage-coffee-table/1149.html?ec=affee77a60fe4867 These pages are identified as duplicate content yet have the correct canonical tags: https://www.google.co.uk/search?num=100&site=&source=hp&q=site%3Ahttp%3A%2F%2Fwww.oakfurnitureland.co.uk%2Ffurniture%2Foriginal-rustic-solid-oak-4-drawer-storage-coffee-table%2F1149.html&oq=site%3Ahttp%3A%2F%2Fwww.oakfurnitureland.co.uk%2Ffurniture%2Foriginal-rustic-solid-oak-4-drawer-storage-coffee-table%2F1149.html&gs_l=hp.3..0i10j0l9.4201.5461.0.5879.8.8.0.0.0.0.82.376.7.7.0....0...1c.1.58.hp..3.5.268.0.JTW91YEkjh4 With various affiliate feeds available for our site, we effectively have duplicate versions of every page due to the tracking query string, which Google seems willing to index, ignoring both the robots rule and the canonical tags (the rule, and a likely interaction with the canonicals, is sketched below). Can anyone shed any light on the situation?

    | JBGlobalSEO
    0
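
    For reference, the rule in question, as it would appear in robots.txt:

        User-agent: *
        # Blocks crawling (not indexing) of any URL containing a query string:
        Disallow: /*?

    One likely interaction worth noting: a crawl block and a canonical tag work against each other. If Googlebot is disallowed from fetching the ?ec=... URLs it never sees the canonical tags on them, yet the URLs can still be indexed from links alone. A commonly suggested alternative is to lift the disallow and let the canonicals consolidate the variants, or to declare the parameter in Webmaster Tools' URL Parameters settings.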

  • Brief history: I am MD of a medium-sized health organisation in the UK. We have one of the leading websites in the world for our industry. We were hit by a Google algorithm update last year (Penguin or Panda, I can't remember, but I don't think that's relevant here) and our daily visits went down from around 10,000 to around 5,000 in two separate hits over a couple of months. Then there was a steady decrease to about 3,000-4,000 visits a day until we totally updated the design of the site and did some good work on the content. We have always been white-hat and the site has around 3,000 pages, with unique content added daily. So things have really been on the up for the past couple of months: we have been receiving around 6,000 visits a day in recent weeks (a slow incline over the past few months) - until Sunday. Sunday morning around 10am, all of our organic listings pretty much disappeared, including for our brand name. Monday morning a few came back, including our brand name and our main, most competitive keyword, for which we had been showing up on the third page, and we returned to this page. Then Tuesday morning another few of our most competitive keywords showed up, back where they were before. This includes images, which had disappeared from Google Images. Our PPC and business listings were not really affected at all. My developer submitted a sitemap through Webmaster Tools on Monday morning and I'm not sure if this is the reason pages started to show up again. In Webmaster Tools the indexed pages are about a quarter of all of the ones on the site - all pages were indexed before. I just don't know what has happened! It doesn't make any sense, as 1. Google doesn't seem to have rolled out any algorithm updates on that day, 2. we do not have any messages in Webmaster Tools, and 3. a number of our main keywords have re-appeared - why would that happen if we had been hit by a Google update?! Our organic hits, which previously made up about 80% of all our hits, have gone down by 80% and this is drastically affecting business. If this continues it is likely we will have to downsize the business, and I'm not sure what to do. When I saw that the 'indexed pages' count in Webmaster Tools started to increase (around 600 on Monday, around 900 yesterday and around 1,300 this morning), I thought that we were on our way up and maybe this problem would just resolve itself and our listings would re-appear, but now our indexed pages have reduced slightly since this morning, back down to around 1,100, so the increase has stalled. Can anybody help?! Do you have any idea what could be causing this? Apparently there have been no changes made to robots.txt, and my developer says that no changes were made that could have affected our listings. ANY ADVICE WOULD BE GREATLY APPRECIATED.

    | JH1
    1

  • Hi all, We have a website originally built using static HTML with .htm extensions that ranks well in Google, hence we want to keep those pages/URLs. We are on a dedicated server (Windows IIS). However, our developer has custom-made a new DYNAMIC section for the site which shows newly added products dynamically and allows them to be booked online via a shopping cart. We are having problems displaying both on the same domain, even if we put the dynamic section within its own subfolder and keep the static .htm files in the root. Is it possible to have both function on IIS (even if they may have to function a little separately)? Does anyone have previous experience of this kind of issue or a way of making both work? What setup do we need on the dedicated server?

    | emerald
    0

  • Does anybody have experience using hashbang URLs? We tried using them to solve an indexation problem, and I'm not fully sure we are using the right solution now (the developers did it with Google's FAQ and Guide to AJAX crawling as their information sources). One of our clients has a problem: in their e-shop categories, search engines aren't able to index all products. In this example category there is a "Näita kõiki (38)" tab that shows all category products to users, but as I understand it, search engines weren't able to index it as /et#/activeTab=tab02 because of the #. Now #! (hashbang) is used, so the URL is /et#!/activeTab=tab02. Is this the correct solution? Also, the example category URL now differs for better indexation: /et#!/ versus ../et. And when the tabs "TOP ja uued" and "Näita kõik" are activated/clicked, the URLs become /et#/activeTab=tab01 and /et#/activeTab=tab02. I tried to fetch it in Google Webmaster Tools but it seems it didn't work (how the crawler translates these URLs is sketched below). I would appreciate it if anybody could check this solution.

    | raido
    0
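
    For reference, under Google's AJAX crawling scheme the crawler translates each #! URL into an _escaped_fragment_ request that the server must answer with a fully rendered HTML snapshot. Using the question's own path (the fragment value is percent-encoded in practice):

        Pretty URL seen by users:    /et#!/activeTab=tab02
        URL fetched by Googlebot:    /et?_escaped_fragment_=/activeTab=tab02

    Plain # fragments (without the !) are never sent to the server at all, which is why the original /et#/activeTab=tab02 URLs were uncrawlable. A quick sanity check is to request the _escaped_fragment_ URL directly and confirm the full product list comes back in the HTML.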

  • Hi All, I've just checked my rankings, and everything on my eCommerce site has pretty much tanked really badly since my new URL structure and site redesign were put in place 2 weeks ago. My URL structure was originally long and had underscores, but we have now made it clean, shorter, and hyphenated. We also have location-specific pages, and we have incorporated these into the new URL structure. Basically it now pretty much follows the breadcrumb trail on our website. We were originally a general online hire site, but now we have become niche and are only concentrating on one type of product, so we got rid of all the other categories/products and pages we no longer deal with. The rankings issue was only brought to light in the most recent Moz ranking report, so it's looking like Google hates our new store. Someone mentioned the other day that Google may have been doing a Panda/Penguin refresh last weekend, but I am surprised to have dropped 20 to 50 places for most of my keywords. We have set up the 301 redirects; we have also made the site a lot smaller and set up a few thousand 404s to get rid of a lot of redundant pages. We have cut down massively on the thin/duplicate content and have lots of good new content on there. We did new sitemaps, set up schema.org markup, increased the text-to-code ratio, set up our H1-H5 tags on all our pages, and made the site mobile responsive. Basically, we are trying to do everything right. Is there anything glaringly obvious I should be checking? I attach a short URL link if anyone wants to have a quick glance - http://goo.gl/7mmEx i.e. could it be a problem with the new URLs, or anything else that I should be looking at? I.e. how can I check to make sure the link juice is being passed on to the new URLs (one quick check is sketched below)? Or is all this expected when doing such changes? Any advice greatly appreciated. Pete

    | PeteC12
    0
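
    One quick way to confirm that each old URL passes cleanly to its new home is to inspect the response headers directly (hypothetical paths shown):

        curl -I http://www.example.com/old_category/old_product_page

        HTTP/1.1 301 Moved Permanently
        Location: http://www.example.com/new-category/new-product-page

    A single 301 straight to the final URL is the goal; redirect chains or 302s are generally thought to delay or dilute the transfer of ranking signals.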

  • I am planning to do something I have never done before, and I am wondering if it's really a good idea or not. I have four websites, all for the same company, each one with a different domain and different content: one has been the main official site for 16 years: 200 uniques per month, indexed for 134 keywords, Domain Authority 17, 13 linking root domains; one was used as the main site from 2003 to 2006 and is focused on a specific business they have since discontinued; still online, no updates since 2006, 500 uniques per month, indexed for 92 keywords, Domain Authority 13, 8 linking root domains; another was built in 2010 and maintained for less than a year, focused on a business they never really started; still online, no updates since 2010, 3000 uniques per month, indexed for 557 keywords, Domain Authority 25, 84 linking root domains; a fourth was also built in 2010 and focused on a business never really started; still online, no updates since 2010, 100 uniques per month, indexed for 4 keywords, Domain Authority 6, 3 linking root domains. Each website has traffic and links, all links being natural; they never tried to gain links in any way, they never did on-page optimization, they never even thought about SEO. The sites are not even interlinked. So, my idea is to merge all of them, putting websites 2, 3 and 4 as subfolders of the main site and replicating the old content there - because those sites have traffic; incredibly, one of the abandoned sites gets 3000 uniques per month while the main site gets just 200! My doubts are: does it make sense to merge everything from an SEO perspective? Apart from doing the 301s correctly (sketched below), what else should I be careful to do or not do? Website number 4 is really outdated, its content and structure are not easy to merge with the rest, and its traffic is really small - is it worth spending the time to merge it? Finally, I also have a problem: the customer didn't want to merge them; they have agreed to, but they don't want visitors of the main site to be able to navigate to the old ones. So once moved and redirected, I would have to put them in the sitemap of the main site but avoid linking to them on the actual "main" site. As far as I know, Google's crawler doesn't like to find pages in sitemaps which are not reachable through a linking path on the website - is that correct? Is that going to make all the merging work useless? Should I convince the client to at least put small links in the footer, or on a page linked from the footer?

    | max.favilli
    0
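
    Assuming the retired domains stay on Apache hosting, mapping an old site into its new subfolder can be sketched in one .htaccess line per domain (placeholder names):

        # On the old domain, e.g. site2.com - sends every path into the main site's subfolder:
        RedirectMatch 301 ^/(.*)$ http://www.mainsite.com/site2/$1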

  • I need help with some SEO strategy that needs to be implemented on my website http://goo.gl/AiOgu1. My website is a leading live chat product; daily it receives around 2000 unique visitors. Initially the website was impacted by a manual link penalty. I cleaned up a lot of backlinks and the penalty was revoked somewhere around June '14. Most of the secondary and long-tail keywords started ranking in Google, but unfortunately it does not rank well for primary keywords like live chat, live chat software, helpdesk, etc. I have since made a lot of on-site changes and even revamped the content, but so far I don't see any improvement. I am unable to understand where I have got stuck.
    Can anyone help me out?

    | sandeep.clickdesk
    0

  • Assumed: the material around good migration/redesign practices recommends, logically enough, changing as few things as possible in any given step, thus giving search engines as little trouble as possible identifying and reindexing changes. So if someone is making significant changes to content, including URL changes, plus a rebranding that requires a domain migration, they are generally better off doing one, then the other. 1) Beyond immediate testing and checking that correct crawl health is re-established after the first change, any thoughts on rules of thumb for when to do the second change? Do you do it as soon as you see your rankings/traffic turn the corner and confirm an upward trend after the drop, or wait till you have it all back (or at least hit a plateau)? In the absence of data or best practice I'm thinking of just letting 1/3rd to 2/3rds come back. 2) Is a change to HTTPS small enough/similar enough from the search engine's perspective that it makes more sense to do it at the same time as the rebrand-driven domain change? Does this create any special risks or considerations beyond those that arise from the individual components of the change?

    | JFA
    0

  • We have a website called imones.lt and we have a mobile version of it at m.imones.lt. We originally put noindex on m.imones.lt. Is that a good decision or not? We believe that if Google indexes both, it creates duplicate content, and we definitely don't want that. But when someone lands on any imones.lt page from Google using a smartphone, they are redirected to m.imones.lt/whatever (the usual annotation set-up for this is sketched below). Thank you for your opinion.

    | FCRMediaLietuva
    0
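
    For reference, Google's documented alternative to noindexing a separate mobile site is a bidirectional annotation pair, sketched here with the question's own domains (the /page path is assumed):

        <!-- On the desktop page, www.imones.lt/page: -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://m.imones.lt/page" />

        <!-- On the mobile page, m.imones.lt/page: -->
        <link rel="canonical" href="http://www.imones.lt/page" />

    With this pairing Google treats the two URLs as one document rather than duplicates, so the noindex on m.imones.lt becomes unnecessary.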

  • I have a site that has around 5,000 pages now. Are there any recommended online free/paid tools to generate a sitemap for me (the target format is sketched below)?

    | rhysmaster
    0
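
    Whatever tool you choose, the output is simple enough to sanity-check by eye; a minimal valid sitemap looks like this (placeholder URL):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/page-one/</loc>
            <lastmod>2014-11-18</lastmod>
          </url>
        </urlset>

    A single sitemap file is capped at 50,000 URLs, so a 5,000-page site fits comfortably in one file.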

  • I was wondering if everyone could recommend the XML sitemap generators they use. I've been using XML-Sitemap, and it's been a little hit and miss for me: on some sites it works great, on others it has serious problems indexing pages. I've also used Google's, but unfortunately it's not very flexible. Any recommendation would be much appreciated.

    | alrockn
    0

  • Hi All, We implement most things recommended for our website, and most recently we did schema.org markup. However, one area we haven't addressed is fixing our W3C validation errors. My developer thinks they are not important as such, and that it's more about ticking boxes, but does anyone have experience whereby fixing all of these actually had an SEO/ranking benefit? Most of our URLs are indexed and Google recrawls regularly, so I am not sure of its importance. Also, we have a mobile responsive version, so I wasn't sure if it is more important because of this. From what I've read, I can't see any benefit to fixing it all, but I just wanted some other opinions. Thanks Pete

    | PeteC12
    0

  • Hi Community, There have probably been a few answers to this, and I have more or less made up my mind about it, but I would like to pose the question, or ask that you post a link to the correct article for this, please. I have a travel site with multiple accommodations (for example). Obviously there are many filters to help find exactly what you want: you can sort by region, city, rating, price, type of accommodation (hotel, guest house, etc.). This all leads to one inevitable conclusion: many of the results would be the same. My question is, how would you handle this? Via a rel=canonical to the main categories (such as region or town), thus making them the successors, or by nofollowing all the sub-category pages, thereby not allowing any search to reach deeper in? Thanks for the time and effort.

    | ProsperoDigital
    0

  • Hey Everyone, I had a potential client contact me about doing SEO for their site, and I see that they have an AJAX site where all the content is rendered dynamically via AJAX. I've been doing SEO for years, but have never had a client with an AJAX site. I did a little research and see how you can set up alternative pages (or snapshots, as Google calls them) with the actual content so the pages are crawlable and will get indexed (the opt-in tag is sketched below), but I'm wondering if that is as effective as optimizing static HTML pages, or if Google treats AJAX page alternatives as less trustworthy/valuable. Also, does having the site in AJAX affect link building and social sharing? With the link structure, it seems there could be some issues with pointing links at internal pages and passing link juice to them. Thanks! Kurt

    | Kurt_Steinbrueck
    1
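
    For reference, the opt-in for Google's AJAX crawling scheme on pages without #! URLs is a single tag, after which Googlebot requests the page with ?_escaped_fragment_= appended and expects the HTML snapshot in response:

        <!-- On the AJAX page: invites Googlebot to fetch ?_escaped_fragment_= -->
        <meta name="fragment" content="!">

    The snapshot needs to match what users actually see; serving materially different content there risks being treated as cloaking.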

  • I built out a few SlideShare presentations, and they are getting a few views. What's weird is that when I search for them it takes me pages and pages of sifting to actually find them - even when I use quotation marks around my account name and the individual presentation title. Any ideas on how I can better optimize my presentations?

    | jfeitlinger
    0

  • Hey Mozzers, I have been struggling with this issue, and I am hoping someone can help. I have a number of bad/spammy links to my site. We have never engaged in "bad SEO", but an old subdomain received a number of spammy blog comments, and everything seemed to escalate from there. We removed the subdomain that received all of the bad links from our DNS settings (about a year ago), but these links still show up in Ahrefs and MajesticSEO. I don't think we have been penalized for these links, but I would just like to clean them up because, well, it's the right thing to do. How does one do this when these sites seem so untouchable? They are from China, Russia, or Denmark, or were abandoned in 2009, etc., and when I look for someone to contact, I can't seem to find anyone to even email. Suggestions? (The disavow fallback is sketched below.)

    | evan89
    0
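
    When outreach dead-ends like this, the documented fallback is Google's disavow file - plain text, one entry per line, uploaded via Webmaster Tools (placeholder domains):

        # Comments start with a hash.
        # Disavow an entire referring domain:
        domain:spammy-directory.example.com
        # Or a single URL:
        http://abandoned-blog.example.net/2009/05/comments.html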

  • Please let me know if this makes sense. I have very limited knowledge of technical SEO, but I am almost positive that my web developer did something wrong. I have a WordPress blog and he did add canonical code to some of the pages - however, he points each canonical at the very same URL it sits on! Does this mean that the canonical code is set up incorrectly and is actually harming my SEO performance? Also, if I have one webpage with just the first paragraph of a blog post I wrote, and a completely separate page for the blog post itself, could this be considered duplicate content? Thanks!!

    | DR70095
    0

  • Hello, I have noticed that some travel sites rank for almost all the keywords, but when I click through to the page, it has no relevant content and often no content at all. I remember Google once updated its algorithm to do away with such sites, but I still found some. The question is: if they don't have relevant content, or don't have content at all, how do they even rank? Secondly, how come they have pages for every keyword combination? How is this achieved? Regards

    | IM_Learner
    0

  • Hi All, We changed our URL structure on our website both to reduce the depth of our category URL structure (reduce the number of '/' layers) and to replace the underscores we originally had with hyphens. We did this during a new site design, and we relaunched a week ago. We did the 301 redirects from old to new, new sitemaps, etc., and the latest Moz ranking report is showing most keywords dropping 5 to 10 positions, i.e. from 3rd to 10th, etc. Is this something to be expected that should then recover, or should alarm bells be ringing? I would not have expected such a negative shift across all my rankings. Any thoughts on this would be greatly appreciated. Thanks Pete.

    | PeteC12
    0

  • Hello, Thank you if you can help us. Our web URL www.prismpharmamachinery.com was ranking very well some time ago, but it has now dropped 5-7 pages down in Google. Can any SEO expert help with this? Regards, Pooja

    | Poojath
    0

  • Hi, I have a strange query. Bing has been seriously crawling our site over the last four weeks, and while at the moment this isn't affecting our servers, coming into the busy time of the year I am just wondering if anyone else is seeing this. They are basically crawling the entire site (including nofollow pages), even though, according to Bing Webmaster Tools, they know our sitemap. Is anybody else seeing any unusual activity from the Bing bot? Thanks Andy

    | Andy-Halliday
    0

  • Greetings Moz Community: I purchased a SEMrush subscription recently and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "noindex" (the tag is sketched below), these issues do not need to be corrected. My instinct would be to avoid any risk with potential duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "noindex", they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would report these as errors if in fact they are not errors. So my question is: do we need to do anything with the error pages if they are already set to "noindex"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan

    | Kingalan1
    0
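
    For reference, the directive the developer is relying on, as it would appear in each tag archive's head:

        <meta name="robots" content="noindex, follow" />

    The "follow" half keeps PageRank flowing through the page even though it is excluded from the index. Also worth knowing: SEMrush reports duplication based on its own crawl of the site, not on Google's index, so it can flag these pages regardless of index directives.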

  • Hi mozzers, We are about to launch a new site, and right now I am worried that this new site may create thousands of duplicate content issues which will harm all the SEO work that has been done in the last few years. Here is the situation: you land on the example.com/Los-angeles page (geo-located), but if you modify the URI to example.com/chico then a pop-up appears and asks you for the location you want to be in (pop-up attached). When choosing Chico, the URI switches to example.com/chico?franchise=chico instead of /chico only. This site has over 40 different microsites, so my question is: are all these ?franchise=city arguments going to be indexed and create thousands of dups? Or are we safe because this geo-localization happens thanks to JavaScript? Thanks!

    | Ideas-Money-Art
    0

  • Hi all, I'm looking for some expert advice on the use of canonicals to resolve duplicate content for an e-Commerce site. I've used a generic example to explain the problem (I do not really run a candy shop). SCENARIO: I run a candy shop website that sells candy dispensers and the candy that goes in them. I sell about 5,000 different models of candy dispensers and 10,000 different types of candy. Much of the candy fits in more than one candy dispenser, and some candy dispensers fit exactly the same types of candy as others. To make things easy for customers who need to fill up their candy dispensers, I provide a "candy finder" tool on my website which takes them through three steps: 1. Pick your candy dispenser brand (e.g. Haribo) 2. Pick your candy dispenser type (e.g. soft candy or hard candy) 3. Pick your candy dispenser model (e.g. S4000-A) RESULT: The customer is then presented with a list of candy products that they can buy, on a URL like this: candy-shop.com/haribo/soft-candy/S4000-A All of these steps are presented as HTML pages with followable/indexable links. PROBLEM: There is a duplicate content issue with the results pages. This is because a lot of the candy dispensers fit exactly the same candy (e.g. S4000-A, S4000-B and S4000-C). This means that the content on these pages is basically the same, because the same candy products are listed. I'll call these the "duplicate dispensers". E.g. candy-shop.com/haribo/soft-candy/S4000-A candy-shop.com/haribo/soft-candy/S4000-B candy-shop.com/haribo/soft-candy/S4000-C The page titles/headings change based on the dispenser model, but that's not enough for the pages to be deemed unique by Moz. I want to drive organic traffic from searches for the dispenser-model candy keywords, but with duplicate content like this I'm guessing it is holding any of these dispenser pages back from ranking. SOLUTIONS: 1. Write unique content for each of the duplicate dispenser pages: manufacturers add or discontinue about 500 dispenser models each quarter and I don't have the resources to keep on top of this content. I would also question the real value of this content to a user when it's pretty obvious what the products on the page are. 2. Pick one duplicate dispenser to act as the rel=canonical target and point all its duplicates at it: this doesn't work, as dispensers get discontinued, so I run the risk of randomly losing my canonicals, or of them changing as models become unavailable. 3. Create a single page with all of the duplicate dispensers on it, and canonical all of the individual duplicate pages to that page. E.g. Canonical: candy-shop.com/haribo/soft-candy/S4000-Series Duplicates (which all point to the canonical): candy-shop.com/haribo/soft-candy/S4000-Series?model=A candy-shop.com/haribo/soft-candy/S4000-Series?model=B candy-shop.com/haribo/soft-candy/S4000-Series?model=C PROPOSED SOLUTION: Option 3 (sketched in markup below). Anyone agree/disagree, or have any other thoughts on how to solve this problem? Thanks for reading.

    | webmethod
    0
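
    A sketch of how option 3 would look in markup, using the question's example URLs - each model variant declares the series page:

        <!-- In the <head> of candy-shop.com/haribo/soft-candy/S4000-Series?model=A -->
        <link rel="canonical"
              href="http://candy-shop.com/haribo/soft-candy/S4000-Series" />

    Because the series page outlives any individual model, the canonical target stays stable as dispensers are added or discontinued, which is the weakness option 2 suffers from.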

  • Hi All, We upgraded our framework, relaunched our site with new URL structures, etc., and resubmitted our sitemap to Google last week. However, it's now come to light that the rel=next and rel=prev tags we had in place on many of our pages are missing. We are putting them back in now (the markup is sketched below), but my worry is: as they were missing when we submitted the sitemap, will I have duplicate content issues, or will it resolve itself as Google recrawls the site over time? Any advice would be greatly appreciated. Thanks Pete

    | PeteC12
    0
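
    For reference, the pagination markup being reinstated - each page in a series declares its neighbours in the head (hypothetical URLs):

        <!-- On page 2 of a paginated category: -->
        <link rel="prev" href="http://www.example.com/category?page=1" />
        <link rel="next" href="http://www.example.com/category?page=3" />

    (The first page carries only rel=next and the last only rel=prev.) Since these are consolidation hints rather than index directives, a temporary gap would generally be expected to resolve as Google recrawls the series.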

  • We're in a games market and we have a link on every page to our download. The link is an .aspx, but there is no download page as such - clicking the link triggers an executable download that is just under one meg. We've been looking at the top results in our very competitive market, and the top 8 don't seem to have a download. Coincidence, or a real ranking factor?

    | dancape
    0

  • After reading Cyrus' article http://moz.com/blog/seo-tips-https-ssl, I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says that if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic... however, if it were that simple, there wouldn't need to be like 10 things you're supposed to do (according to Cyrus' article) to ensure your rankings after the switch (the key redirect step is sketched below). Can someone clarify this for me? Thanks, Ruben

    | KempRugeLawGroup
    0
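
    The hosting side may well be that quick; the checklist exists because every URL changes from http:// to https://, which is effectively a site move in Google's eyes. At minimum that means a site-wide 301 - an Apache sketch, assuming mod_rewrite is available - plus updating canonicals, internal links and sitemaps to the https versions:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]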
