
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi all, We recently moved our blog to a subdomain, where it is hosted on WordPress. The move was very recent and we're actively working on the SEO, but any pointers on getting the subdomain to rank higher than the old blog posts would be terrific. Thanks!

    | DigitalMoz
    0

  • Hi, My online shop is based on WordPress with the WooCommerce plugin. Now, I have met an SEO guy who told me that this is bad in the eyes of Google, because Google apparently sees my website as a blog and not as an e-commerce site. Wow, this statement really confused me, since I am working so hard on content and good rankings. Any opinions on this would be appreciated. Best, Robin

    | soralsokal
    0

  • One of our clients lost rankings in the local map results. Last month we changed the phone number on the G+ page so the number is the same as on the website, but it's still a call tracking number. We also changed the URL to example.nl/plumber-newyork so it links directly to the local page, and we made the local G+ page the author of the local page on the website. Can these changes have something to do with the ranking loss in Google Maps results?

    | remkoallertz
    0

  • Hi Moz! We have old location pages that we can't redirect to the new ones because they have AJAX. To preserve PageRank, we are putting canonical tags on the old location pages. Will Googlebot still read these canonical tags if the pages have a JavaScript redirect? Thanks for reading!

    | DA2013
    0

  • Can anyone help me to show website structured data in Google when someone searches for my website in Google? I have already added my website to Google and Google Webmaster Tools. Thanks in advance.

    | talkinnetventure
    0

  • Hi All, We're currently redesigning a website for a new-home developer and we're trying to figure out the best way to deal with tabbed content in the URL structure. The design of the site at the moment will have a page for a development, and within that you can select your house type; on the house type page there will be tabs displayed for the user to see things like the plot map, availability and pricing, specifications, etc. The way our development team is looking at handling this is for the URL to use a hash fragment or a query string at the end of it, so we can still land users on these specific tabs, for PPC for example. My question is really: has anyone had any experience with this? Any recommendations on how best to structure the URLs for SEO? Thanks

    | J_Sinclair
    0

  • Hi guys, Just after some clarification - I have recently been told that by placing content in <aside></aside> tags, spiders will ignore the content. Is this the case? I always thought that content placed in these tags was there to identify related content. To put the query into some context, we have the same content on multiple pages of a site, which is relevant to the main body copy - but could throw up duplicate content issues... Thanks in advance.

    | SEOBirmingham81
    1
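On the question above: the <aside> element marks content that is tangentially related to the main content; it does not stop spiders from crawling or indexing what is inside it. A minimal sketch of the intended usage (the page and copy are hypothetical):

```html
<article>
  <h1>Blue Widgets</h1>
  <p>Main body copy about blue widgets...</p>

  <!-- Still crawled and indexed; the tag only signals "related content" -->
  <aside>
    <h2>Related information</h2>
    <p>Shared copy that also appears on several other pages of the site.</p>
  </aside>
</article>
```

If the repeated copy is a duplicate-content worry, a canonical tag or a noindex is the usual tool, not <aside>.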

  • Hi Moz! I'm working on a site that has thousands of pages of content that are not relevant to the business anymore since it took a different direction. Some of these pages still get a lot of traffic. What should I do with them? 404? Keep them? Redirect? Are these pages hurting rankings for the target terms? Thanks for reading!

    | DA2013
    0

  • Hey everybody! So, for those who have followed some of my posts, I have myself in a bit of a quagmire that I am not going to get into. Some solutions have come to light and others are still pending, and I will update my past questions with solutions! On the safer side of things, I have a new situation. As I am going through our pages, we have three different pages for admissions: "Admissions", "Admission Guidelines" and "Admission Information". The "Admissions" page has no link or feed to the other admissions pages, and actually has no content on it at all. The "Admission Guidelines" page feeds to the "Admission Information" page, which, although extremely redundant, is a different project for a different day. I am planning on putting a 301 on the "Admissions" page and sending it to the "Admission Guidelines" page. When I do so, should I delete the old page? Does it matter? Is there a pro or con either way? Thanks guys!

    | HashtagHustler
    0

  • Hello, I have an e-commerce website selling about 20,000 different products. For the most used of those products, I created unique, high-quality content. The content has been written by a professional player who describes how and why those products are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to noindex the products that only have the same copy-paste descriptions all other websites have. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!

    | EndeR-
    0
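The noindex approach described above would typically be a robots meta tag on the copy-paste product pages only; a sketch, assuming the goal is to keep those pages crawlable and their outbound links still counted:

```html
<!-- On products with boilerplate descriptions only -->
<meta name="robots" content="noindex, follow" />
```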

  • We recently moved our website over to the Magento e-commerce platform. Magento has functionality to make certain items not visible individually, so you can, for example, take 6 products and turn them into 1 product where a customer can choose their options. You then hide all the individual products, leaving only that one product visible on the site and reducing duplicate content issues. We did this. It works great, and the individual products don't show up in our site map, which is what we'd like. However, Google Webmaster Tools has all of these individual product URLs in its Not Found crawl errors. For example: White t-shirt URL: /white-t-shirt Red t-shirt URL: /red-t-shirt Blue t-shirt URL: /blue-t-shirt All of those are not visible on the site and the URLs do not appear in our site map, but they are all showing up in Google Webmaster Tools. Configurable t-shirt URL: /t-shirt This product is the only one visible on the site, does appear in the site map, and shows up in Google Webmaster Tools as a valid URL. Do you know how Google found the individual products if they aren't in the site map and aren't visible on the website? And how important do you think it is that we fix all of these hundreds of Not Found errors to point to the single visible product on the site? I would think it is fairly important, but I don't want to spend a week of manpower on it if the returns would be minimal. Thanks so much for any input!

    | Marketing.SCG
    0

  • I'm working with a site that has millions of pages. The link flow through index pages is atrocious, such that for the letter A (for example) the index page A/1.html has a page authority of 25, and the page authorities drop until A/70.html (the last index page listing pages that start with A) has a page authority of just 1. However, the pages linked to from the low page authority index pages (that is, the pages whose second letter is at the end of the alphabet) get just as much traffic as the pages linked to from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that the pages from the low authority index pages are getting just as much traffic as those linked from the high authority index pages. Why is this? Should I "fix" the bad link flow problem if traffic patterns indicate there's no problem? Is this hurting me in some other way? Thanks

    | GilReich
    0

  • My client has a "free" site he set up years ago - www.montclairbariatricsurgery.com (we'll call this the old site) - that consistently outranks his current "optimized" (new) website - http://www.njbariatricsurgery.com/ The client doesn't want to get rid of his old site, which is now a competitor, because it ranks so much better. But he's invested so much in the new site with no results. A bit of background: we recently discovered the content on the new site was a direct copy of content on the old site, so we had all copy on the new site rewritten. This was back in April. The domain of the new site was changed on July 8th from www.Bariatrx.com to what you see now - www.njbariatricsurgery.com. Any insight you can provide would be greatly appreciated!!!

    | WhatUpHud
    0

  • We have a new client (a mobile app) who created a program to generate thousands of pages of "unique, user-generated content" for their website. An example: a person in the app's forum asks a question, and people respond. The client's program then compiles the question and responses into a unique, auto-generated page for the website. (I don't think the app is utilizing deep linking - though I was going to recommend it - so the app content is not indexed by search engines yet.) The pages are already created - they are just not live on the site yet. I'm very skeptical. But the client says it's similar to what Stack Overflow does (or something like that). Basic example: say that a question for which the client wants to rank is, "What Are the Symptoms of Cancer?" I'd think that a quality, human-created, referenced, well-written, authoritative page would obviously rank more highly than a UGC page based on a forum discussion on that topic. But of course, doing that for hundreds of questions is costly and hard to scale - both of which are concerns of the client (a startup with little money). Has anyone had any experience with this? It's the first time I've tackled such an issue. Thanks in advance for any thoughts!

    | SamuelScott
    0

  • I am working on a client's site at the moment, and I noticed that both the HTTP and HTTPS versions of certain pages are indexed by Google, and both show in the SERPs when you search for the content of these pages. I just wanted to get various opinions on whether the HTTPS pages should have a meta noindex tag applied through an htaccess rule, or whether they should be left as is.

    | Jamie.Stevens
    0
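An alternative to a noindex here is a canonical tag pointing both protocol versions at the preferred one, which consolidates the duplicate signals instead of discarding the HTTPS copy; a sketch with a placeholder domain:

```html
<!-- Served on both the http:// and https:// versions of the page -->
<link rel="canonical" href="http://www.example.com/page/" />
```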

  • We have a site with over 3MM pages indexed, and an XML sitemap with over 12MM images (312k indexed at peak). Last week our traffic dropped off a cliff. The only major change we made to the site in that time period was adding a DNS record for all of our images that moved them from a SoftLayer Object Storage domain to a subdomain of our site. The old URLs still work, but we changed all the links across our site to the new subdomain. The big mistake we made was that we didn't update our XML sitemap to the new URLs until almost a week after the switch (we totally forgot that they were served from a process with a different config file). We believe this was the cause of the issue because: The pages that dropped in traffic were the ones where the images moved, while other pages stayed more or less the same. We have some sections of our property where the images are, and have always been, hosted by Amazon, and their rankings didn't crater; same with pages that do not have images in the XML sitemap (like list pages). There wasn't a change in the geographic breakdown of our traffic, which we looked at because the timing was around the same time as Pigeon. There were no warnings or messages in Webmaster Tools to indicate a manual action around something unrelated. The number of images indexed in our sitemap according to Webmaster Tools dropped from 312k to 10k over the past week. The gap between the change and the drop was 5 days; it takes Google more than 10 days to crawl our entire site, so the timing seems plausible. Of course, it could be something totally unrelated and just coincidence, but we can't come up with any other plausible theory that makes sense given the timing and pages affected. The XML sitemap was updated last Thursday, and we resubmitted it to Google, but still no real change. Anyone had a similar experience? Any way to expedite the climb back to normal traffic levels?

    | wantering
    0

  • The main domain is: http://www.eumom.ie/ And these would be some of the core pages: http://www.eumom.ie/pregnancy/ http://www.eumom.ie/getting-pregnant/ Any help from the Moz community is much appreciated!

    | IcanAgency
    0

  • I'm looking to make some optimizations to a website I'm working on but wanted more input before I get started. Currently, when blogs are posted to the website, the URL for each post looks like this: www.mywebsite.com/blogpost I've heard that, for whatever reason, the best practice is to make sure that blog posts get posted to a blog sub-directory, like so: www.mywebsite.com/blog/blogpost If I were to make this change, I'm assuming I would have to 301 redirect all of the existing blogs to their new locations. Is this change worth doing, and are there any other considerations I should be taking into account? Also, I'm aware that there are certain schools of thought that category and tag pages should be noindexed to avoid duplicate content issues. Can anyone shed some light on this from first-hand experience? Thanks in advance!

    | goldbergweismancairo
    0
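If the move to a /blog/ subdirectory goes ahead, the 301s could be handled in .htaccess with mod_rewrite; a sketch, assuming (as in the question) that old post slugs live at the site root, with hypothetical slugs - one-off mappings are safer than a catch-all pattern, which could sweep up real root-level pages like /about:

```apache
# .htaccess sketch: map old root-level post slugs into /blog/
RewriteEngine On
RewriteRule ^my-first-post$ /blog/my-first-post [R=301,L]
RewriteRule ^another-post$ /blog/another-post [R=301,L]
```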

  • I work for a small web agency and I noticed that many of the sites that we build have been using the same privacy policy. Obviously it can be a bit of a nightmare to write a unique privacy policy for each client, so is Google likely to class this as duplicate content and hand out a penalty? They must realise that privacy policies are likely to be the same or very similar, as most legal writing tends to be! I can block the content in robots.txt or meta noindex it if necessary, but I just wanted to get some feedback to see if this is needed!

    | Jamie.Stevens
    1

  • On our website, we show events from different cities. We have made different URLs for each city, like www.townscript.com/mumbai and www.townscript.com/delhi. But the pages for all the cities look similar; only the events change across the different city pages. Even our home URL, www.townscript.com, shows the visitor the city he visited last time on our website (initially we show everyone Mumbai; the visitor then needs to choose his city). For every page visit, we save the last visited page of a particular IP address, and the next time he visits www.townscript.com, we show him only the city he visited last time. Now, we feel the content of the home page and the city pages is similar. Should we show these pages to Google as one page, i.e. townscript.com? Can we do that with rel="canonical"? Please help me! I think all of these pages are competing with each other.

    | sanchitmalik
    0
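One caution on the rel="canonical" idea above: pointing every city page at townscript.com would ask Google to drop the city pages from the index entirely, so they could no longer rank for city-specific searches. If each city page should keep ranking, the usual pattern is a self-referencing canonical per city; a sketch:

```html
<!-- On www.townscript.com/mumbai -->
<link rel="canonical" href="http://www.townscript.com/mumbai" />
```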

  • I have 1 page that ranks well, with unique written content located high up on the page (http://www.honoluluhi5.com/new-condos-in-honolulu/). I struggle to rank for 200+ other pages where the unique content requires scrolling (ex: http://www.honoluluhi5.com/oahu/honolulu-homes/). I am thinking to do as follows: change the layout of all my pages to have unique content higher on the page. When users are on my site (not coming from search engines) and use my search filters, they will land on pages where the unique content is lower on the page (so keep this layout: http://www.honoluluhi5.com/oahu/honolulu-homes/). I will then add these pages to my robots.txt file so they do not show in Google's index. Reason: unique content lower on the page offers the best user experience. With unique content higher on the page, I expect bounce rate to increase about 10% (based on the 1 page I have with unique content higher), but I think it is worthwhile, as I am sure search engines will start ranking my pages higher.

    | khi5
    0

  • Hi, I am looking into placing schema on my site, which is built on Magento. There are a few different extensions that will automate the rich snippets around our content. Some of them are: http://www.magentocommerce.com/magento-connect/seo-rich-snippets-google-bing-yahoo-schema-org.html http://www.magentocommerce.com/magento-connect/google-rich-snippets-for-magento.html http://www.magentocommerce.com/magento-connect/msemantic-semantic-seo-for-rich-snippets-in-google-and-yahoo.html Does anyone have experience automating rich snippets, especially in Magento? Did it work? Are there any negative ramifications of doing it this way? Thank you,

    | EcomLkwd
    0
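For context, the markup such extensions typically generate is schema.org Product microdata along these lines - a sketch with placeholder values, not the output of any particular extension:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars,
    <span itemprop="reviewCount">12</span> reviews
  </div>
</div>
```

Whatever an extension actually outputs can be verified in Google's structured data testing tool before going live.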

  • We recently redesigned a client's website, updated SEO (page titles, descriptions, internal linking), and have experienced a massive drop both in organic and direct search traffic. We had an issue with the sitemap that I resolved last week, but traffic doesn't show any signs of recovering. What should I expect and what can I do to get back to where we were? The site is www.rugcare.com, if anyone wants to take a look.

    | ScottImageWorks
    0

  • Hi all, I was wondering if any of you can advise whether it's an issue to use two separate custom 404 pages. The 404 pages would be different for different parts of the site. For instance, if you're on /community/ and you enter a non-existing page, such as www.sample.com/community/example/, it would give you a different 404 page than someone who hits a non-existing page at www.sample.com/definition/example/. Does anybody have experience with this, and would this be fine?

    | RonFav
    0
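Serving a different 404 page per section is straightforward with per-directory Apache configuration; a sketch, assuming Apache and the example paths from the question:

```apache
# In /community/.htaccess
ErrorDocument 404 /community/404.html

# In /definition/.htaccess
ErrorDocument 404 /definition/404.html
```

Both custom pages should still be returned with a real 404 status code, not a 200 or a redirect.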

  • We used to rank well on Google. From what I understand, the site was nailed with malware and was heavily penalized. Now I am trying to get things straightened up. I am going through and fixing on-site issues, but I get the sense that the issues with Google run deeper. Can someone help me? I have tried requesting a manual review through Google Webmaster Tools, but it isn't an option for the site.

    | webgurucreative
    0

  • Greetings Moz community: I operate a real estate web site in New York (www.nyc-officespace-leader.com) that I suspect has been hit by Panda 4.0. I believe a problem is thin content on product pages, which in my case are 350 listing pages. However, I am also looking at how title and description tags are formatted for these 350 pages, to ensure this is not a factor in the ranking drop. The titles are written like this: <title>Flatiron loft for rent | West 21st Street | 1441SF $6604/month</title> Is this sufficiently diverse? Will constantly repeating various street names, square footages and prices work against me? Will Google in a sense consider this thin or repetitive content? It does provide the visitor with key information. The meta description tags are written along these lines: <meta name="description" content="One of the most desirable full floor sublets in Midtown South. Recent build out, pristine condition, panoramic views, tech chic, spectacular. Top location." /> From an SEO perspective, are these critical tags written the way they should be? Thanks everyone!! Alan

    | Kingalan1
    0

  • Hello SEO experts! We are encountering a difficult situation at our marketing firm with a client who wants to optimize her site for keywords + counties, as she doesn't want to be restricted to one specific city. We have suggested alternate solutions like location pages, utilization of H2s, etc.; however, she wants to know the effectiveness of using a specific city (e.g. Winona, MN) vs. a county (e.g. Winona County, MN) for SEO purposes. The research I have conducted thus far hasn't gotten me very far; basically I'm seeing that it all comes back to what people search for (cleaning services in Winona, MN vs. cleaning services in Winona County, MN). Does anyone have any insight into this issue?

    | MLTGroup
    0

  • Hi, I understand that a site which loads quickly is better for the user, but how does site speed affect rankings? I mean, does Google log the speed at which pages load, so the faster a page loads, the better the signal? Say I have a page which loads in 1.5 sec - would Google 'rate' the site better if it loaded in, say, 0.8 sec? Thanks.

    | followuk
    0

  • We have a client that is creating a new promotional website consisting of videos, brands and product reviews (Site B). After a visitor watches a video on Site B, they will be given a "click to purchase" option that leads them to the original website (Site A). Our client is paranoid that since all the outgoing links on the new Site B go to the original Site A, there might be an algorithmic penalty (for one website or both). I find this very unlikely, and even recommended nofollow coding for peace of mind. However, are there any resources/links out there that can back up my argument that they will be all right? Thanks

    | VanguardCommunications
    0
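The nofollow recommendation above is just a link attribute; a sketch, with example.com standing in for Site A's domain:

```html
<a href="http://www.example.com/product-page" rel="nofollow">Click to purchase</a>
```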

  • Hey everyone, On our site we have multiple pages that have similar content. As an example, we have a section on Cars (in general) and then specific pages for Used Cars, European Cars, Remodeled Cars, etc. Much of the content is similar on these pages, and the only difference is some content and the additional term in the URL (for example, car.com/remodeled-cars and /european-cars). In the past few months, we've noticed a dip in our organic ranking and started doing research. We also noticed that Google, in SERPs, shows the general page (cars.com/cars) and not the specific page (/european-cars), even if the specific page has more content. Can having multiple pages with similar content hurt SEO? If so, what is the best way to remedy this? We can consolidate some of the pages and make the difference between them a little clearer, but does it make that much of a difference for rankings? Thanks in advance!

    | JonathonOhayon
    0

  • My company is a theater news and reviews site. We're building a Google News sitemap, and Google suggests some recommended keywords we can use with their <news:keywords> tag: https://support.google.com/news/publisher/answer/116037 Our writers also tag their stories with relevant keywords. What should we populate the <news:keywords> tag with? We were thinking we'd automatically populate it with author-added tags, in addition to one or more of the recommended ones suggested by Google, such as Theater, Arts, and Culture (all of our articles are related to these topics). Finally, many of our articles are about, say, celebrities. An author may tag an article with 'Bryan Cranston,' and when this is the case we're considering also tagging it with the 'Celebrities' tag. Are all or any of these worthwhile?

    | TheaterMania
    0
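For reference, the keywords live in the <news:keywords> element of each Google News sitemap entry as a comma-separated list; a sketch of one entry with placeholder URL, publication name and values:

```xml
<url>
  <loc>http://www.example.com/reviews/bryan-cranston-on-broadway.html</loc>
  <news:news>
    <news:publication>
      <news:name>Example Theater News</news:name>
      <news:language>en</news:language>
    </news:publication>
    <news:publication_date>2014-07-29</news:publication_date>
    <news:title>Review: Bryan Cranston on Broadway</news:title>
    <news:keywords>Theater, Arts, Culture, Celebrities, Bryan Cranston</news:keywords>
  </news:news>
</url>
```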

  • Hi, Does anyone have experience with breadcrumb nodes for e-commerce? http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.overstock.com%2FOffice-Supplies%2FOffice-Star-Professional-Air-Grid-Deluxe-Task-Chair%2F2605023%2Fproduct.html What happens if your product appears in more than one category? Should you let Google spider the various breadcrumb routes to the category? Which one would take preference in results? Right now, for ease of management, we have not enabled category URL paths for products - so the product appears right after the domain, for example www.mydomain.com/en/myproduct.html. If we do enable category URL paths, any comments or opinions? Thanks!

    | bjs2010
    0
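For anyone weighing the options above: breadcrumb markup is per-trail, so a product reachable from two categories carries the markup for whichever trail its page displays. A sketch using schema.org BreadcrumbList (the names and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "item": { "@id": "http://www.mydomain.com/en/office-supplies/",
                "name": "Office Supplies" } },
    { "@type": "ListItem", "position": 2,
      "item": { "@id": "http://www.mydomain.com/en/office-supplies/chairs/",
                "name": "Chairs" } }
  ]
}
</script>
```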

  • So I'm about to migrate one domain to another. Let's say I'm migrating boo.com to foo.com. Boo.com has good organic traffic and has some really well-ranked pages. For this reason, I want to send that traffic somewhere other than the foo.com homepage - perhaps a catered landing page. My question is: can I redirect some of the specific pages on boo.com to a landing page on foo.com, and then redirect the delta (the remaining pages) to foo.com's homepage? Or am I risking not fully transferring the full credit of one domain to another if I take that approach, and should I therefore just redirect one domain to the other in its entirety? Thanks, Rich

    | RPD
    0

  • Is there a way to do this? I have a new wiki set up at http://hiddentriforce.com/zelda-wiki/index.php/Main_Page I want to change the default underscores to hyphens. This is what I have been trying: Options +FollowSymLinks
    RewriteEngine On
    RewriteBase /
    RewriteRule !\.(html|php)$ - [S=4]
    RewriteRule ^([^_]*)_([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4-$5 [E=uscor:Yes]
    RewriteRule ^([^_]*)_([^_]*)_([^_]*)_(.*)$ $1-$2-$3-$4 [E=uscor:Yes]
    RewriteRule ^([^_]*)_([^_]*)_(.*)$ $1-$2-$3 [E=uscor:Yes]
    RewriteRule ^([^_]*)_(.*)$ $1-$2 [E=uscor:Yes]
    RewriteCond %{ENV:uscor} ^Yes$
    RewriteRule (.*) http://hiddentriforce.com/zelda-wiki/index.php/$1 [R=301,L]

    | Atomicx
    0

  • I've read that Google frowns upon large numbers of internal links. We're building a site that helps users browse a list of shows via dozens of genres. If the genres are exposed, say, as a pulldown menu as opposed to a list of static links, and selecting a pulldown option filters the list of shows, would those genres count against our internal link count?

    | TheaterMania
    0

  • Greetings Moz community: Webmaster Tools, under "Index Status", shows 850 URLs indexed for our website (www.nyc-officespace-leader.com). The number of URLs indexed jumped by around 175 around June 10th, shortly after we launched a new version of our website. No new URLs were added in the site upgrade. Under Webmaster Tools, "Crawl > Sitemaps" shows 637 pages submitted and 599 indexed. Prior to June 6th there was not a significant difference between the numbers shown under "Index Status" and "Crawl > Sitemaps"; now there is a differential of 175. The 850 URLs in "Index Status" is equal to the number of URLs in the Moz domain crawl report I ran yesterday. Since this differential developed, ranking has declined sharply. Perhaps I was hit by the new version of Panda, but Google indexing junk pages (if that is in fact happening) could have something to do with it. Is this differential between the number of URLs shown in "Index Status" and "Crawl > Sitemaps" normal? I am attaching images of the two screens from Webmaster Tools as well as the Moz crawl to illustrate what has occurred. My developer seems stumped by this. He has submitted a removal request for the 175 URLs to Google, but they remain in the index. Any suggestions? Thanks,
    Alan

    | Kingalan1
    0

  • When I use a canonical tag on pages that are variations of the same page, it basically means that I don't want Google to index those pages. But at the same time, spiders will still go ahead and crawl them. Isn't this a waste of my crawl budget? Wouldn't it be better to just disallow the pages in robots.txt and let Google focus on crawling the pages that I do want indexed? In other words, why should I ever use rel=canonical as opposed to simply disallowing in robots.txt?

    | YairSpolter
    0
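One distinction worth adding to the question above: a canonical requires the page to be crawled (so Google can see the tag), and in exchange it consolidates the duplicate's link signals onto the target; a robots.txt disallow saves the crawl but strands any links pointing at the blocked URL, and the blocked URL can still appear in the index without a description. A sketch of the canonical option, with a placeholder URL:

```html
<!-- On each variation page, pointing at the preferred version -->
<link rel="canonical" href="http://www.example.com/main-page/" />
```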

  • Hi all, So yesterday we set out to remove URLs that had gotten into the Google index when they were not supposed to be there, due to faceted navigation... We searched for the URLs by using this in Google search:
    site:www.sekretza.com inurl:price=
    site:www.sekretza.com inurl:artists= This brings up a list of "duplicate" pages, and they have the usual: "A description for this result is not available because of this site's robots.txt - learn more." So we removed them all, and Google removed them all, every single one. This morning I did a check, and I found that more are creeping in. If I take one of the suspected dupes to the robots.txt tester, Google tells me it's blocked - and yet it's appearing in their index?? I'm confused as to how a path that is blocked is able to get into the index. I'm thinking of lifting the robots block so that Google can see that these pages also have a meta NOINDEX,FOLLOW tag on them - but surely that will waste my crawl budget on unnecessary pages? Any ideas? Thanks.

    | bjs2010
    0

  • I have heard conflicting answers to this. I always figured that it was okay to selectively copy and paste on-page content into the meta description tag... especially if the on-page content is well written. How can it be duplicate content if it's pulling from the exact same page? Does anybody have any feedback from a credible source about this? Thanks.

    | VanguardCommunications
    1

  • I have a site that does pretty well on a .co domain, but I would like to switch over to .com (we own the .com already). If we were to transfer to the .com and 301 redirect all the .co pages to their .com versions, would we suffer at all? What would you guys recommend?

    | StickyWebz
    0

  • Good afternoon! Does anybody know what sort of impact I can expect to see from switching hosting? Not only that, but how long does it take to come back from that sort of thing? Our website has been steadily dropping since I took it over about a month ago. I have been slowly, tediously trying to prune the bad stuff, and one of our issues is with our host. Any thoughts would be great! Thanks.

    | HashtagHustler
    0

  • Hello Moz helpsters! I write to you since I feel I'm missing something BIG, and I'm sure that with all your experience you can easily spot what's wrong. Here is my situation: we rank pretty badly on several keywords and our competitor does well. BUT:
    - our pages are graded higher
    - our pages load faster
    - our pages have more links
    - our pages have more popularity I feel there is something BIG I'm missing, but I can't get to it. Can you have a look and see if you find any BIG issue we could have missed? Here are two pages for a good comparison. Both are ranking on "medecin generaliste paris" (translation: "general practitioner paris"):
    - our page, ranking #90: https://www.keldoc.com/medecin-generaliste/paris
    - our competitor, ranking #11: https://www.doctolib.fr/medecin-generaliste/paris Any help on this is veeeeryyy welcome 🙂

    | marc_steen
    0

  • Hi, We have just launched a new e-commerce site and 301'd a lot of the products and categories to the new site. I have also added a link or two from other domains. We launched it on the 10th of June, but using Site Explorer, there is still no domain authority showing - just a 1. Why is that? Any ideas?

    | bjs2010
    0

  • Pages for my company's play process load slowly because the process is heavy. Below the play process there is a block of text, put there mostly for SEO purposes. R&D are proposing to load the SEO area only after the play process has loaded.
    This seems like a very bad solution, because loading the SEO area asynchronously will make the content unreadable to Google. Am I missing something?

    | theLotter
    0

  • Six months after starting a marketing campaign and spending a lot of money on SEO audits, link removals, wireframes, copywriting and coding for my web site (www.nyc-officespace-leader.com), traffic dropped significantly after I launched a new version of the site in early June. Traffic is down about 27%, most of the traffic from competitive terms is gone, and the number of leads (phone calls, form completions) is off by about 70%. On June 6th an upgraded version of the site with mostly cosmetic changes (narrower header without social media buttons, streamlined conversion forms, new right rail) was launched. No URLs were changed, and the text remained mostly the same. But somehow my developers botched up either canonical tags or robots.txt, and 175 URLs with very little or no content were indexed by Google. At that point my ranking and traffic dropped. A few days ago a request to remove those pages was made via Google Webmaster Tools, and now the number of pages indexed is down to 675 rather than the incorrect 850 from before. But ranking, traffic and lead generation have not yet recovered. After spending almost $25,000 over nine months, this is rather frustrating. I might add the site has very few links from incoming domains, and those links are not high quality. An SEO audit was performed in February, and in April a link removal campaign occurred, with about 30 domains agreeing to remove links and a disavow file being submitted for another 70-80 domains that would not agree to remove links. My SEO firm believes that we should focus on improving visitor engagement rather than on more esoteric SEO like trying to build incoming links. They think that improving usability will improve conversions and would generate results faster than traditional SEO. Also, they think that improving click-through rates and reducing bounce rates will improve ranking by signaling to Google that the site is providing value to visitors. Does this sound like a reasonable approach?
On the one hand, I don't see how my site, with a low Moz domain authority, could possibly compete against sites with a high number of quality incoming links, and maybe building a better link profile would yield faster results. On the other hand, it seems logical that Google would reward a site that creates a better user experience. Any thoughts from the Moz community? Does it sound like the recent loss of traffic is due to the indexing of the 175 pages? If so, when should my traffic and ranking return? Incidentally, these are the steps taken since last November to improve SEO: SEO traffic & ranking drop analysis and recommendations (included an in-depth SEO technical audit); unnatural link removal program; content optimization (audit & strategy with a 20-page keyword matrix); CORE (also provided wireframes for /visitor-details pages at no charge); SEO copywriting for 10 pages; new wireframes implemented on the site on June 6th; jump in indexed pages by 175 on June 10th; Google Webmaster Tools removal request made for those low-quality pages on June 23rd. Thanks, Alan

    | Kingalan1
    1

  • We have recently launched a store, and we had created 301 redirects from the old product pages to the new ones. But we found that the new website's performance was quite slow, and that it would improve a lot if we added a number at the end of the product page URL. But that implies changing the 301 redirects and adding new ones for the pages that have already been updated by the search engine. I mean: www.store.com/oscommerce/catalog/productpage.html was 301 redirected to www.store.com/sectionname/productpage 48 hours ago. But now (I know, we should have checked this before...) we need to change the URL to www.store.com/sectionname/productpage/5487, as this improves performance a lot. But we are afraid of doing two 301s in just 48 hours. Any advice?
    Should we find another way of solving the performance issue? Thanks
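    The usual way to avoid a two-hop redirect chain is to update the first rule so the old URL points straight at the final URL, and give the interim URL its own single 301. A minimal sketch in Apache .htaccess terms, using the URLs from the post (the site may of course run a different server, and the rules would need adapting for a full catalog):

```apache
# Point the original URL directly at the final destination,
# so crawlers never see old -> interim -> final chained hops.
RedirectMatch 301 ^/oscommerce/catalog/productpage\.html$ /sectionname/productpage/5487

# The interim URL (already indexed) gets a single 301 of its own.
# Anchored match, so /sectionname/productpage/5487 itself is untouched.
RedirectMatch 301 ^/sectionname/productpage$ /sectionname/productpage/5487
```

    Note the anchored patterns: a plain `Redirect` directive is prefix-matching and would loop on the final URL.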

    | teconsite
    0

  • I operate a New York City commercial real estate website (www.nyc-officespace-leader.com). Ranking and traffic have dropped steeply since early June. Around May 20th a new Panda update was launched by Google, and I wonder if that could partially explain the drop. My site contains the following:
    - 300 listing pages. These are product pages and often contain less than 100 words. Many have not been changed in two years.
    - 150 building pages. These contain less than 220 words. Many have not been changed in two years.
    - 40 blog pages. We have been adding 1 or 2 per month.
    - 50 or 60 neighborhood and type-of-space pages. These contain 200-600 words.
    Could our drop in traffic be due to Panda? I might add that an upgraded version of the site with new forms and a modified right rail and header was launched on June 6th. Also, we submitted a disavow file with Google on April 20th for about 100 toxic domains, one third of the 300 domains that link to us. In order to take remedial action we need to understand what has happened. Any ideas? Thanks, Alan

    | Kingalan1
    0

  • We have member pages on our site that are initially empty, until the member does some activity. Currently, since all of these pages are soft 404s, we return a 404 for all these pages and all internal links to them are js links (not links as far as bots are concerned). As soon as the page has content, we switch it to 200 and make the links into regular hrefs. After doing some research, I started thinking that this is not the best way to handle this situation. A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be real links. I'd love to hear input and feedback from fellow Mozzers. What are your thoughts?
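    The noindex/follow idea described above comes down to one tag in the head of each empty member page, served with a normal 200 status. A sketch (the tag would be removed, or switched to allow indexing, once the member has activity):

```html
<!-- Empty member page: keep it out of the index, but let bots
     crawl it and follow its links so link equity still flows. -->
<meta name="robots" content="noindex, follow">
```

    The same directive can be sent as an `X-Robots-Tag: noindex, follow` HTTP header if editing the page template is awkward.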

    | YairSpolter
    0

  • We have a profile page on our site for members who join. The profile page has child pages that are simply more specific drill-downs of what you get on the main profile page. For example: /roger displays all of Roger's posts, questions, and favorites, and then there are /roger/posts, /roger/questions, and /roger/favorites. Since the child pages contain subsets of the content on the main profile page, we canonical them back to the main profile page. Here's my question: the main profile page has navigation links to take you to the child pages. On /roger, there are links to /roger/posts, /roger/questions, and /roger/favorites. Currently, we nofollow these links. Is this the right way to do it? It seems to me that it's a mistake, since the bots will still crawl those pages but will not transfer PR. What should we do instead? 1. Make the links js links so the child pages won't be crawled at all? 2. Make the links followed so that PR will flow (see Matt Cutts' advice here)? Apprehension about doing this: won't it dilute crawl budget (as opposed to #1)? 3. Something else? In case the question wasn't confusing enough... here's another piece: we also have a child page of the profile that is simply a list of members (/roger/friends). Since this page does not have any real content, we are currently noindex/nofollow-ing it, and the link to this page is also nofollowed. I'm thinking that there's a better solution for this as well. Would love your input!
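    Option 2 above is a small markup change. A sketch, assuming example.com as a stand-in domain: the canonical on each child page already consolidates signals to the profile, so the nav links can be plain followed links.

```html
<!-- Head of /roger/posts: the subset page canonicalizes to the profile -->
<link rel="canonical" href="https://example.com/roger">

<!-- Navigation on /roger: ordinary followed links (no rel="nofollow"),
     letting PageRank flow to the child pages -->
<a href="/roger/posts">Posts</a>
<a href="/roger/questions">Questions</a>
<a href="/roger/favorites">Favorites</a>
```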

    | YairSpolter
    0

  • Hello, We have a site that sells a certain product on www.example.com. This site contains thousands of pages, including a whole section of well-written content that we invested a lot of money in creating. The site ranks on many KWs, both brand and non-brand related. SERPs include the homepage and many of the articles mentioned. We receive traffic and clients to this site from around the world, BUT our main geo-target is the UK. Due to a lack of resources and some legal needs, we now have to create a new site, www.example.co.uk, and all UK traffic will only be able to purchase the product from this site, not from the .com site anymore. We have no resources to create new content for the new .co.uk site, and that is the reason we want to duplicate the site on both domains and use a canonical tag to mark the .co.uk site as the primary site. Does anyone have experience with such activity? Will this work across the whole site? We need a fast solution here, as we do not have much time to wait because of the legal issue I mentioned. What is the best solution you can offer so that we do not lose important SERPs? On the one hand, since our main market is the UK, we assume the main site to promote will be www.example.co.uk, but as said earlier, we still have users from other parts of the world as well. Is there any risk that we are missing here? Thanks James
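    The setup described, with the .co.uk site as primary, would be a cross-domain canonical applied page by page. A sketch using the domains from the post (the path is a hypothetical example; each .com page would point at its own .co.uk duplicate, not all at the homepage):

```html
<!-- In the head of https://www.example.com/some-product-page,
     naming its .co.uk duplicate as the primary version -->
<link rel="canonical" href="https://www.example.co.uk/some-product-page">
```

    Worth noting: search engines treat a cross-domain canonical as a hint rather than a directive, so consolidation is not guaranteed on every page.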

    | Tit
    0
