
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi there, I’ve been working on a website for about six months now and the Google PageRank still remains at 0. Fresh content has been created across the majority of the site, a blog implemented, titles and metas written, schema.org markup added, and we've built some good links. There are a lot of 404 errors, but much of this is down to stocking issues - products being sold or taken down and new products being put up. Do you think this is the major reason the PageRank is not moving? 404s are a regular occurrence on a lot of e-commerce sites. Also, the server went offline on two occasions (obviously Google frowns upon this), but in general the server is grand. When we started working on the website it wasn't in the best of shape, DA: 11; now it's DA: 17. I know that's still not great, but it's moving in the right direction. Just wondering your thoughts on the PR?

    | niamhomahony
    0

  • Would you take the time to fix external links on pages of your site that are noindex, follow and that no one ever visits? The only reason to do it would be to present a tidier site to Google, but would it really care if those pages are noindex/follow? The thing that makes it a non-trivial amount of work is that there are hundreds of these on a large site. Do you think Google cares, if they're noindex/follow? I know the safe answer is always to fix everything, but really it has to be weighed against the likely benefit and other projects, with a limited amount of time to work with. Best... Mike

    | 94501
    0

  • Bit of a specific one, but if there's anyone that can help it's you guys. If a .co.uk website is looking to rank in Baidu with a .co.uk/cn subdirectory, would we require an Internet Content Publishing License? Taking a step back, there might be two parts to my question: how likely is it a .co.uk could rank in Baidu with simplified Chinese language content? And in order to have a chance of ranking, would we require an Internet Content Publishing License?

    | ecommercebc
    0

  • There has been a recent suggestion (from Rand) that hosting your blog in a folder rather than on a subdomain is much better from an SEO point of view. Unfortunately, our blog is hosted on a subdomain with a different technology stack to the main e-commerce site, and we are finding it quite tricky to migrate to a folder given the different technologies. Is the following a suitable solution? - 301 redirect from mysite.com/blog/cool-blog-post to blog.mysite.com/cool-blog-post - And then put <link rel="canonical" href="http://mysite.com/blog/cool-blog-post" /> on blog.mysite.com/cool-blog-post. Would be great to have your thoughts on this, guys - I can't figure out if it will work or be an SEO fail.

    | HireSpace
    0
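
For concreteness, the two pieces of the setup proposed in the question above would look roughly like this (mysite.com and the post path are the question's placeholders). Note that the two signals point in opposite directions: the 301 sends users and crawlers to the subdomain, while the canonical claims the folder URL is the primary one.

```apache
# .htaccess on mysite.com: 301 the folder URL to the subdomain
RedirectPermanent /blog/cool-blog-post http://blog.mysite.com/cool-blog-post
```

```html
<!-- <head> of blog.mysite.com/cool-blog-post: canonical back to the folder URL -->
<link rel="canonical" href="http://mysite.com/blog/cool-blog-post" />
```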

  • We are in charge of a website with 7 languages for 16 countries. There are only slight content differences by country (google.de | google.co.uk). The website is set up with the correct language & country annotations, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE. All unwanted annotations are blocked by robots.txt. The hreflang alternates are also set. The objective is to make the website visible in local search engines, so we have submitted an overview sitemap connected to a sitemap per country. The sitemaps have been submitted for quite a while now, but Google has indexed only 10% of the content. We are looking for suggestions to speed up the indexing process.

    | imsi
    0
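
For reference, sitemap-based hreflang annotations like the ones described in the question above are normally declared with `xhtml:link` entries, one `<url>` block per page listing every language/country alternate (example.com and the paths here are illustrative stand-ins for the real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/de/DE/</loc>
    <xhtml:link rel="alternate" hreflang="de-DE" href="http://www.example.com/de/DE/"/>
    <xhtml:link rel="alternate" hreflang="de-CH" href="http://www.example.com/de/CH/"/>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en/GB/"/>
    <xhtml:link rel="alternate" hreflang="en-IE" href="http://www.example.com/en/IE/"/>
  </url>
</urlset>
```

Each alternate URL must be crawlable for the annotations to take effect, so any robots.txt blocks would be worth cross-checking against the URLs listed here.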

  • There's definitely not going to be a "right" answer to this question, but I think it can lead to a great discussion. We are building a website for a client who has two locations, and we are going to use a URL structure similar to this: www.Brand.com (this would be a landing page where users select a location), www.Brand.com/Atlanta, www.Brand.com/Boston. However, we still want to focus on local SEO, so our deeper URL structure will be: www.Brand.com/Atlanta/Auto-Accident-Lawyer, www.Brand.com/Atlanta/Motorcycle-Accident-Lawyer, www.Brand.com/Boston/Auto-Accident-Lawyer, www.Brand.com/Boston/Motorcycle-Accident-Lawyer. The content on those pages will be unique and target local keywords. Each "version" of the website will have a navigation specific to that location. For example, once a user clicks into the Boston website, all of the navigation items will pertain to Boston. However, we run into an issue with the blog. Both locations will be using the same blog content, which ends up looking something like this: www.Brand.com/Atlanta/Blog/Blog-Article and www.Brand.com/Boston/Blog/Blog-Article. This obviously creates duplicate content. We could do something such as www.Brand.com/Blog/Blog-Article; however, as noted above, each local version of the website has a separate navigation (this keeps a user in Boston on the Boston version of the website). So having a centralized blog is far from ideal unless navigation for both locations is included - which would allow users to return to their local website. From my understanding, duplicate content doesn't necessarily "hurt" your SERPs; it simply keeps one of the duplicated pages from ranking. So the question comes down to this: is duplicate content a big enough issue to restructure a website to use a centralized blog?

    | McFaddenGavender
    0

  • Hi! A company I work with has purchased several (70-something) domain names that are relevant to their business. According to their IT pro, they're currently using DNS to point those domains to our IP address, with a catch-all header on IIS for that IP address. Essentially, we have 70-something domain names that direct to the homepage. I noticed that some have been indexed by Google and are pulling in the meta of the homepage they're being directed to. Is this potentially an issue? If so, would 301 redirects fix this or are we okay with the status quo and the indexing is no big deal? Thanks in advance!

    | 19958
    0

  • Hey all - I hesitate to ask this question, but have spent weeks trying to figure it out to no avail. We are a real estate company and many of our building pages do not show up for a given address. I first thought maybe Google did not like us, but we show up well for certain keywords: 3rd for Houston office space and Dallas office space, etc. We have decent DA and inbound links, but for some reason we do not show up for addresses. For example, for 44 Wall St or 44 Wall St office space, we are nowhere to be found. Our title and description should allow us to be easily picked up, but after scrolling through 15 pages (with a ton of non-relevant results), we do not show up. This happens quite a bit. I have checked that we are being crawled by searching for 44 Wall St TheSquareFoot and checking the cache. We have individual listing pages (with the same titles and descriptions) inside the buildings, but use canonical tags to let Google know that these are related and that we want the building pages to be dominant. I have worked through quite a few tests and cannot come up with a reason. If we were just page 7 and never moved it would be one thing, but since we do not show up at all, it almost seems like Google is punishing us. My hope is there is one thing that we are doing wrong that is easily fixed. I realize in an ideal world we would have shorter URLs and other nits and gnats, but those feel like things that would help us go from page 3 to page 1, not prevent us from ranking at all. Any thoughts or helpful comments would be greatly appreciated. http://www.thesquarefoot.com/buildings/ny/new-york/10005/lower-manhattan/44-wall-st/44-wall-street We do show up on page 1 for this building - http://www.thesquarefoot.com/buildings/ny/new-york/10036/midtown/1501-broadway - but it is the exception. I have tried investigating any differences, but am quite baffled.

    | AtticusBerg1
    0

  • We are building URLs dynamically with Apache rewrite.
    When we detect that a URL matches certain valid patterns, we serve a script, which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL, which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but instead redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly? Some context: it is a site with about 200,000 web pages and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).

    | lcourse
    0
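
A minimal sketch of the alternative the question above is weighing - answering 404 on the original URL instead of 301-ing to an error page (handler.php and the path patterns are hypothetical; the real rewrite rules are not shown in the question):

```apache
RewriteEngine On
# URLs matching the valid pattern are handed to the script, which should
# itself emit a 404 status (not a 301) when the parameter combination
# does not exist.
RewriteRule ^products/(.*)$ /handler.php?path=$1 [L]
# Where a URL can be recognised as invalid by pattern alone, Apache can
# answer 404 directly (R=404 requires Apache 2.2.6 or later):
RewriteRule ^old-section/ - [R=404,L]
```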

  • I have been doing a little research, but all the articles are really old - even the Moz page on this is pretty old. So I am wondering, has the strategy changed? Is it still OK to use internal links with your keywords in them? Do you have multiple links on a page? What about a blog post? Do you nofollow? What are the thoughts out there on this?

    | netviper
    0

  • I was reading the following Q&A on site-wide footer links: http://moz.com/community/q/site-wide-links-from-another-domain-could-these-cause-a-problem. I feel my situation is slightly different, however: we have lots of international sites linking to each other through these links - our sites for different countries and languages, i.e. our German, French and Spanish sites (http://www.cirrusresearch.co.uk/). Our main UK site has always ranked very well and has never really had a problem, despite always having had these followed site-wide footer links. Because of this, we regularly get a high number of visitors performing English-language searches from different countries, and I don't think it is a bad thing having more country/language-specific sites of ours available in the footer for visitors who may prefer a more localized site. Our main website is at least 10+ years old and has a lot of strong links compared to our competitors, but the German and Spanish sites are relatively small and mostly only 1-2 years old. My big fear is that these smaller sites would not be able to stand on their own without these footer links from our main site. Reading the community question caused me to question this: should I take a leap of faith and nofollow all of these site-wide footer links connecting our sites? We have never really had a problem ranking, so I don't really see the need, but would this be the best thing to do? Thank you, James

    | Antony_Towle
    0

  • Moz, I have a particularly tricky competitive keyword for which I have finally climbed our website to the 10th position of page 1. I am particularly pleased about this, as all of the website and content is German, which I have little understanding of, and I have little support on this. I am pleased with the content and layout of the page, and I am monitoring all Google Analytics values very closely, as well as the SERP positions. As far as further progression with this page and hopefully climbing further up page 1, where do you think I should focus my efforts? Page speed optimization? Building links to this page? Blogging on this topic (with links)? Mobile responsive design (more difficult)? Further improvements to the pages and content linked from this page? Further improvements to the website in general? Further effort on tracking visitors and user experience monitoring (like setting up Crazy Egg or something)? Any other ideas would be greatly appreciated. Thanks all, James

    | Antony_Towle
    0

  • Greetings Moz Community: I operate a real estate web site in New York City (www.nyc-officespace-leader.com). Of the 600 pages, about 350 of the URLs are product pages, written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped very much since mid-May, around the time of the new Panda update. I suspect it has something to do with the very short product pages, the 350 listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options: 1. Setting them to "noindex". But I am concerned that removing product pages is sending the wrong message to Google. 2. Enhancing the content and making certain that each page has at least 150-200 words. Re-writing 350 listings would be a real project, but if necessary to recover I will bite the bullet. What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty. Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty? Thanks, Alan

    | Kingalan1
    0

  • We have changed the design of a website from an osCommerce site to a new responsive website with customized programming. After the 301 redirects we have lost 1 to 2 positions in the Google rankings of the most visited categories. These are real data:
    | page | brand | URL | CTR | average position |
    | old | fagor | http://www.electrorecambio.es/tienda/fagor-m-41.html | 15% | 6.6 |
    | new | fagor | http://www.electrorecambio.es/fagor | 13% | 7.2 |
    | old | teka | http://www.electrorecambio.es/tienda/teka-m-39.html | 12% | 7.2 |
    | new | teka | http://www.electrorecambio.es/teka | 9% | 8.8 |
    | old | balay | http://www.electrorecambio.es/tienda/balay-m-81.html | 12% | 7.4 |
    | new | balay | http://www.electrorecambio.es/balay | 11% | 8.6 |
    | old | bosch | http://www.electrorecambio.es/tienda/bosch-m-44.html | 10% | 7.4 |
    | new | bosch | http://www.electrorecambio.es/bosch | 8% | 11 |
    Edited: as this table is not shown properly I have added an image. To check the old pages you can see the old URLs in the folder tienda2; for example, http://www.electrorecambio.es/tienda/bosch-m-44.html can be checked at http://www.electrorecambio.es/tienda2/bosch-m-44.html. I would like to know if you see any important information that could justify this drop in rankings. Thanks!!! data-webmaster-tools.jpg

    | teconsite
    1

  • Category A spans over 20 pages (it's not possible to create a "view all" because the page would get too long). So I have pages 1-20. Page 1 has unique content, whereas pages 2-20 of the series do not. I have "noindex, follow" on pages 2-20, and I also have rel=next/prev on the series. Question: since pages 2-20 are "noindex, follow", doesn't that defeat the purpose of rel=next/prev? Don't I run the risk of Google thinking "hmmm... this is odd. This website has noindexed pages 2-20, yet is using rel=next/prev"? And even if I don't run that risk, what is my upside in keeping rel=next/prev when, again, pages 2-20 are noindex, follow? Thank you

    | khi5
    0
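
For reference, the combination described in the question above would put something like this in the `<head>` of a middle page of the series (the example.com URLs are placeholders, not the asker's real paths):

```html
<!-- <head> of page 3 of the 20-page series (sketch) -->
<link rel="prev" href="http://www.example.com/category-a/page/2" />
<link rel="next" href="http://www.example.com/category-a/page/4" />
<meta name="robots" content="noindex, follow" />
```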

  • Hello, We have a site that was recently put through the redesign process - a couple of weeks ago. It was a tired site that was optimized well, but still struggled because it was so outdated. I went ahead and re-optimized, submitted a new sitemap, and did the fetch. Have I missed a step? Could someone offer insight into what they do when a site is redesigned, and the steps taken to make sure that Google crawls and "appreciates" 🙂 the new site as soon as possible? Thanks in advance for any and all help!

    | lfrazer
    0

  • Hello all! I have always worked successfully with SEO on e-commerce sites, however we are currently revamping an older site for a client, and so I thought I'd turn to the community to ask what best practices you are seeing for URL structures at the moment. Obviously we do not wish to create duplicate content, and so the big question is: what would you do for the very best structure for URLs on an e-commerce site that has products in multiple categories? Let's imagine we are selling toy cars. I have a sports car for sale, so naturally it can go in the sports cars category, and it could also go into the convertibles category too. What is the best way you have found recently that works and increases rankings, but does not create duplicate content? Thanks in advance! 🙂 Kind Regards, JDM

    | Hatfish
    0
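
One commonly discussed option for the multi-category situation in the question above is to let the product live at both category paths but declare a single preferred URL with a canonical tag (the URLs below are made up for the toy-car example, not a recommendation of which category to prefer):

```html
<!-- <head> of /convertibles/toy-sports-car, the secondary category path -->
<link rel="canonical" href="http://www.example.com/sports-cars/toy-sports-car" />
```

An alternative some shops use is category-free product URLs (e.g. /products/toy-sports-car), which sidesteps the duplication entirely at the cost of losing the category keyword in the path.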

  • With so many articles on the web talking about how difficult Joomla is to work with in regards to SEO, I'm curious as to what techniques / changes you make when using Joomla in your SEO / inbound practices. Any extensions that you love? Any extensions that you hate?

    | DougHoltOnline
    0

  • What format should we use (i.e. how should we word the request) to ask customers to +1 the page of the service/product they used and liked on Google+?

    | MasonBaker
    0

  • Hello! We are producing multiple videos (each about 1-minute long) for a company website. We have decided to use Wistia to host them, in order get the full SEO benefits of links to the videos. I have two questions: 1. Would it definitely be better for SEO to divide up the videos and place them on the various existing pages of the site that are related to the video content, rather than putting all the videos together on a separate video page? 2. If we do put different videos on different pages, would it be a bad idea also to have a video page with all the videos together? Would this be considered duplicate content? Thank you very much!

    | nyc-seo
    0

  • We have two sites, .com and .co.uk. Both are selling sites; the .com sells in $ and the .co.uk in £s.
    75% of the text from the .co.uk site is also used on the .com site. Each site has 6000+ pages, and 4000+ contain product descriptions that are identical. We have looked at canonical and hreflang, but neither seems to fix the duplication issues. We can add rel=alternate into the product detail master page, but this will not fix the other potential clashes on the other pages. Can anyone advise whether there is site-wide HTML we can add to each site, or to one of them, that will fix this? Many thanks

    | BruceA
    0
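
For reference, the annotation usually suggested for same-language country sites like the ones in the question above is a reciprocal hreflang pair on every duplicated page (example.com/.co.uk and the product path are placeholders). It has to be emitted on both sites, typically from a shared template, which is the closest thing to the "site-wide" fix the question asks about:

```html
<!-- in the <head> of BOTH versions of each product page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/product-123" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/product-123" />
```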

  • I'm considering moving a site from http://www.domain.com/ to https://domain.com/.  I would put a 301 redirect in place to make sure all of the links and traffic transfer over but am worried about losing rankings since we have many years worth of links going to the old urls.  My understanding is that a 301 will transfer 90%+ of the link weight to the new url, but not 100%.  Is there an exception to this rule when doing a 301 redirect within the same domain (but to a different protocol and subdomain)?  Should we expect to lose 1-10% of our link weight if we chose to make this switch?

    | DepositAccounts
    0
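
A common .htaccess sketch for the move described in the question above, collapsing both the protocol change and the www change into a single 301 hop (example.com stands in for the real domain):

```apache
RewriteEngine On
# catch anything that is either still on http or still on www
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

Redirecting in one hop (rather than http-to-https, then www-to-bare) avoids chaining two 301s, which is generally preferred when link equity is a concern.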

  • Our company has a complex mobile situation, and I'm trying to figure out the best way to implement bidirectional annotations and a mobile sitemap. Our mobile presence consists of three different "types" of mobile pages:
    1. Most of our mobile pages are mobile-specific "m." pages where the URL is completely controlled via dynamic parameter paths, rather than static mobile URLs (because of the mobile template we're using). For example: http://m.example.com/?original_path=/directory/subdirectory. We have created vanity 301 redirects for the majority of these pages, which look like http://m.example.com/product and simply redirect to the previous URL.
    2. Six one-off mobile pages that do have a static mobile URL, but are separate from the m. site above. These URLs look like http://www.example.com/product.mobile.html
    3. Two responsively designed pages with a single URL for both mobile and desktop.
    My questions are as follows: Mobile sitemap: should I include all three types of mobile pages in my mobile sitemap? Should I include all the individual dynamic parameter m. URLs like http://m.example.com/?original_path=/directory/subdirectory in the sitemap, or is that against Google's recommendations? Bidirectional annotations: we are unable to add the rel="canonical" tag to the m. URLs mentioned in type #1 above because we cannot add dynamic tags to the header of the mobile template. We can, however, add them to the .mobile.html pages. For the rel="alternate" tags on the desktop versions, though, is it correct to use the dynamic parameter URLs like http://m.example.com/?original_path=/directory/subdirectory as the mobile version target for the rel="alternate" tag? My initial thought is no, since they're dynamic parameter URLs. Is there even any benefit to doing this if we can't add the bidirectional rel="canonical" on those same m. dynamic URLs? I'd be immensely grateful for any advice! Thank you so much!

    | Critical_Mass
    0

  • We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly so as not to lose everything we've worked on up to this point using the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local pages subdomain set-up: stores.websitename.com How we plan on adding our new local subfolder set-up: websitename.com/stores/state/city/storelocation Any and all help is appreciated.

    | SEO.CIC
    0
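
If the old and new paths map one-to-one (an assumption - the question's new URLs add state/city segments that the old ones may not have had), a single pattern rule can cover every subdomain URL rather than requiring one redirect per page:

```apache
RewriteEngine On
# send every stores.websitename.com URL to its /stores/ path on the main host
RewriteCond %{HTTP_HOST} ^stores\.websitename\.com$ [NC]
RewriteRule ^(.*)$ http://websitename.com/stores/$1 [R=301,L]
```

If the paths do not map one-to-one, a lookup table (mod_rewrite's RewriteMap) or individual per-URL redirects would be needed instead.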

  • Since Google shows more pages indexed than makes sense, I used Google's API and some other means to get everything Google has in its index for a site I'm working on. The results bring up a couple of oddities. It shows a lot of URLs for the same page, but with different tracking codes. The URL with tracking code always follows a question mark and could look like: http://www.MozExampleURL.com?tracking-example http://www.MozExampleURL.com?another-tracking-example http://www.MozExampleURL.com?tracking-example-3 etc. So, the only thing that distinguishes one URL from the next is a tracking code. On these pages, canonical tags are in place as: <link rel="canonical" href="http://www.MozExampleURL.com" /> So, why does the index have URLs that differ only in terms of tracking codes? I would think it would ignore everything starting with the question mark. The index also shows paginated pages. I would think it should show the one canonical URL and leave it at that. Is this a problem about which something should be done? Best... Darcy

    | 94501
    0
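
The collapsing behaviour the question above expects from the canonical tag can be illustrated in a few lines of Python. This only mimics what a search engine would ideally do with the tag - it is not what Google actually runs - and the URLs are the question's own examples:

```python
from urllib.parse import urlsplit, urlunsplit


def canonical_url(url):
    """Drop the query string and fragment so every tracking variant
    collapses to one URL (a sketch; real sites may need to preserve
    parameters that actually change page content)."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))


variants = [
    "http://www.MozExampleURL.com?tracking-example",
    "http://www.MozExampleURL.com?another-tracking-example",
    "http://www.MozExampleURL.com?tracking-example-3",
]
# all three variants collapse to a single canonical URL
print({canonical_url(u) for u in variants})
```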

  • Hey Community, Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you name it sites linking to our old URL structure (which used underscores and ended in .jsp). It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages so it's returning 404 errors. If I disavow these domains and/or links will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question. Feel free to ask any questions that may help you understand the issue more. Thanks for your help,
    -Reed

    | IceIcebaby
    0

  • We are re-launching a client site that does very well in Google. The new site is on a www2 domain which we are going to send a controlled amount of traffic to, 10%, 25%, 50%, 75% to 100% over a 5 week period. This will lead to a reduction in traffic to the original domain. As I don't want to launch a competing domain the www2 site will not be indexed until 100% is reached. If Google sees the traffic numbers reducing over this period will we drop? This is the only part I am unsure of as the urls and site structure are the same apart from some new lower level pages which we will introduce in a controlled manner later? Any thoughts or experience of this type of re-launch would be much appreciated. Thanks Pete

    | leshonk
    0

  • Hi, at the moment I'm confused. I have a page which has shown up for the query 'bank holidays' solidly on page 1 for 2 years - this also applies to the terms 'mothers day', 'pancake day' and a few others (UK Google), and they're still ranking. Here is the problem: usually I would rank for 'bank holidays 2014' (the terms with the year in them are the real traffic drivers) at around position 3-5. Over the last 3 months this has decayed, dropping to position 30+. From the screenshot you can see the term 'bank holidays' is holding on, but the term 'bank holidays 2014' is slowly decaying. If you query 'bank holidays 2015' we don't appear in the rankings at all. What is causing this? The content is OK, social sharing happens and the odd link is picked up here and there. I need help: how do I start pushing this back in the other direction? It's like the site is slowly dying. And what really kills me is that 2 pages from link farms are ranking on page 1. URL: followuk.co.uk/bank-holidays serp-decay.jpg

    | followuk
    0

  • Which one do you prefer, and why? Is RDFa better for SEO, or is it just the same as microdata?

    | SeoMartin1
    0
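
For comparison, here is the same schema.org product name marked up both ways; search engines parse both formats, so the choice is largely about which syntax fits your templates ("Example Widget" is, of course, a made-up item):

```html
<!-- Microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
</div>

<!-- RDFa Lite -->
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Example Widget</span>
</div>
```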

  • Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop made with PrestaShop. The Moz crawl report shows me that I have over 4000 duplicate page content issues; two weeks ago I had 1400. The majority of links that show duplicate content look like the below:
    http://www.sitename.com/category-name/filter1
    http://www.sitename.com/category-name/filter1/filter2 At first I thought that the filters didn't work. But when I browse the site and test it, I see that the filters are working and generate links like the below:
    http://www.sitename.com/category-name#/filter1
    http://www.sitename.com/category-name#/filter1/filter2 The links without the # do not work; they mess up the filters.
    Why are the pages being indexed without the #, thus generating duplicate content?
    How can I fix these issues?
    Thank you very much!

    | ana_g
    0

  • Greetings Moz community: If I have a site with about 200 thin content pages that I want Google to remove from its index, should I set them to "noindex, nofollow" or to "noindex, follow"? My SEO firm has advised me to set them to "noindex, follow", but on a recent Moz help forum post someone suggested "noindex, nofollow". The Moz poster said that telling Google the content should not be indexed but the links should be followed was inconsistent and could get me into trouble. This makes a lot of sense. What is proper form? As background, I think I have recently been hit with a Panda 4.0 penalty for thin content. I have several hundred URLs with less than 50 words and want them de-indexed. My site is a commercial real estate site and the listings apparently have too little content. Thanks, Alan

    | Kingalan1
    0
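
For reference, the two variants debated in the question above differ by one word in the robots meta tag:

```html
<!-- out of the index, but outgoing links are still crawled and passed -->
<meta name="robots" content="noindex, follow" />

<!-- out of the index, and outgoing links are dropped too -->
<meta name="robots" content="noindex, nofollow" />
```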

  • I have a client that wants to rank for a bunch of locations around his primary location, say 30 minutes away. So we created a bunch of pages for cities around his location, and so far it seems to be working pretty well. That said, I heard from someone else that Google really doesn't like these types of pages anymore, and that we are better off with just one location page that lists the areas we serve. What are your thoughts and experiences?

    | netviper
    0

  • Greetings Moz Community: I run www.nyc-officespace-leader.com, a real estate website in New York City. The page count in the Google Webmaster Tools index status for our site is 850; the page count in our Webmaster Tools sitemap is 637. Why is there a discrepancy between the two? What does the Google Webmaster Tools index represent? If we filed a removal request for pages we did not want indexed, will those pages still show in the Google Webmaster Tools page count despite the fact that they no longer display in search results? The number of pages displayed in our Google Webmaster Tools index remains at about 850 despite the removal request. Before a site upgrade in June, the number of URLs in the Google Webmaster Tools index and sitemap were almost the same. I am concerned that page bloat has something to do with a recent drop in ranking. Thanks everyone!! Alan

    | Kingalan1
    0

  • Hi all, My site is www.actronics.eu Moz reports virtually every product page as duplicate content, flagged as HIGH PRIORITY! I know why: Moz classes a page as duplicate if >95% of its content/code is similar to another page's. There's very little I can do about this, as although our products are different, the content is very similar, differing by as little as a few part numbers and vehicle makes/models. Here's an example:
    http://www.actronics.eu/en/shop/audi-a4-8d-b5-1994-2000-abs-ecu-en/bosch-5-3
    http://www.actronics.eu/en/shop/bmw-3-series-e36-1990-1998-abs-ecu-en/ate-34-51 Now, multiply this by ~2,000 products X 7 different languages and you'll see we have a big dupe content issue (according to Moz's Crawl Diagnostics report). I say "according to Moz" as I do not know if this is actually an issue for Google - 90% of our product pages rank, albeit some much better than others. So what is the solution? We're not trying to deceive Google in any way, so it would seem unfair to be hit with a dupe content penalty; this is a legit dilemma where our products differ by as little as a part number. One ugly solution would be to remove the header / sidebar / footer on our product pages, as I've demonstrated here - http://woodberry.me.uk/test-page2-minimal-v2.html - since this removes A LOT of page bloat (code) and would bring the page difference down to 80% duplicate.
    (This is the tool I'm using for checking: http://www.webconfs.com/similar-page-checker.php) Other "prettier" solutions would be greatly appreciated. I look forward to hearing your thoughts. Thanks,
    Woody 🙂

    | seowoody
    1

  • I would really like to get everyone's opinion on how you all think Google deals with negative reviews, or just mentions on negative websites. One of my clients has a page on a powerful negative website (one that is designed to shame all those on it), and no other real reviews around the web. I have never seen any evidence that Google takes positive or negative reviews into account when ranking websites. But maybe one of you has? Currently when you search for my client by name, the negative website comes up second, which is obviously embarrassing for them. If we sought out a whole load of positive (obviously genuine) reviews from happy clients, do you think this might influence the prominent placement of this negative website? Also would it influence the ranking of the website in general? I would love to hear your opinions on this topic. [BTW We have already explored the path of the right to be forgotten, but it seems to be inundated so we are not holding our breaths.]

    | Wagada
    0

  • My website www.weddingphotojournalist.co.uk appears to have been penalised by Google. I ranked fairly well for a number of venue related searches from my blog posts. Generally I'd find myself somewhere on page one or towards the top of page two.  However recently I found I am nowhere to be seen for these venue searches. I still appear if I search for my name, business name and keywords in my domain name. A quick check of Yahoo and I found I am ranking very well, it is only Google who seem to have dropped me.  I looked at Google webmaster tools and there are no messages or clues as to what has happened. However it does show my traffic dropping off a cliff edge on the 19th July from 850 impressions to around 60 to 70 per day. I haven't made any changes to my website recently and hadn't added any new content in July. I haven't added any new inbound links either, a search for inbound links does not show anything suspicious. Can anyone shed any light on why this might happen?

    | weddingphotojournalist
    0

  • Hi, When I search for my site www.docslinc.com as "docslinc.com", the results in the SERPs show the home page and the site map, but not the other indexed pages. The other issue occurs when I search for the company name alone, "docslinc": the homepage does not show up at all, and some of the other pages show up. I have looked all over the place and cannot find an answer. I have checked the on-site optimization and it all seems to be correct. Any suggestions would be amazing. Thanks, zulumanf

    | zulumanf
    0

  • Right now we rank on page 2 for many KWs, so should I now focus my attention on getting links to my home page to build domain authority, or continue to direct links to the internal pages for specific KWs? I am about to write some articles for several good-ranking sites and want to know whether to link my company name (same as domain name) to the home page or use individual KWs to the internal pages - I am only allowed one link per article to my site. Thanks Ash

    | AshShep1
    0

  • A company is performing some major updates to a website and the proposal to go live with the updates was explained as follows: once the updates are done in the testing environment and the site is ready to go live, we switch the DNS to the testing environment, and the testing environment becomes the production site. The old production site then becomes the new testing environment. Are there any potential negatives to this? Is there a name for this technique? Of course, we've already considered: additional hosting cost, potential performance differences, and reinstalling and setting up server settings (SSL, etc.).

    | Motava
    0

  • Good Morning! So I have recently been putting in a LOT of 301s into the .htaccess (no 301 plugins here), and GWMT is still seeing a lot of the pages as soft 404s. I mark them as fixed, but they come back. I will also note, the previous webmaster left ample code in our .htaccess which is rewriting our URL structure. I don't know if that is actually having any effect on the issue, but I thought I would add that. All of the 301s are working, but Google isn't seeing them. Thanks Guys!

    | HashtagHustler
    0
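For reference, 301 rules of the kind described above typically look like this in .htaccess (paths and domain are hypothetical). If existing rewrite rules sit above them, rule ordering can matter; a rule that ends up returning 200, 302, or redirecting to a thin page is the sort of thing Webmaster Tools reports as a soft 404:

```apacheconf
# Simple 301 for a single moved URL (hypothetical paths)
Redirect 301 /old-page.html https://www.example.com/new-page/

# mod_rewrite equivalent for a whole section; [R=301,L] forces a
# permanent redirect and stops further rule processing
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```

Checking a sample URL with `curl -I` and confirming the response line reads `301 Moved Permanently` (not `302 Found`) is a quick way to verify each rule.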

  • We run a training center. We had 1 main website and 2 websites dedicated to certain themes. The 2 dedicated websites are older and the main website is about 6 months old. The 2 dedicated websites had a top 5 ranking for their most important keywords. 2 weeks ago we imported all the content from the dedicated websites into the main website. Then immediately after, we did a perfect 301 redirect of these websites to the main website. 2 SEO companies checked it for us, so I'm very sure this was done right. Google immediately picked this up and gave the main website a boost. We were in the top 10 for many important keywords for 1 week. The next week all our rankings dropped. We only have a top 50 ranking for 10 keywords. Before, it was 75 keywords in the top 20. Do you know what could have caused this? Any suggestion, thought, ... is welcome!

    | wellnesswooz
    0

  • Hi everyone.  Google has recently started using our old .net instead of our .com in the SERPs.  I went to do a change of address for the old http://www.sqlsentry.net and it gives me the message "There is no change of address pending for your site."  I've tried for both www and non-www and still get the same result.  All pages seem to be redirecting to the new site with no issues. Is there anything else I can do to change this?  Or is there a step I'm missing?  Thanks!

    | Sika22
    0
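A domain-wide redirect of the kind described is usually one rule pair in the old domain's .htaccess. This is only a sketch: it assumes Apache and assumes the new domain is www.sqlsentry.com (the question doesn't name it), so adjust to the actual setup:

```apacheconf
# In the old domain's .htaccess: send every URL, path intact,
# to the same path on the new domain (new domain assumed here)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?sqlsentry\.net$ [NC]
RewriteRule ^(.*)$ https://www.sqlsentry.com/$1 [R=301,L]
```

The [NC] flag makes the host match case-insensitive, and capturing the full path in the rule preserves deep links through the move.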

  • Hello All; After reading a lot of great info on Moz, I wanted to ask my first question.  We are thinking about buying a competitor.  It is a smaller ecommerce site that sells a subset of the products we already sell.  The main value of the site is good ranking for a small set of keywords.  For example, if we acquire this site, we would now have up to 4 listings on page 1 for some targeted keywords. We plan to operate the site separately from our own with its current name and technology.  We plan to be transparent with the domain registration info, change the contact info on the new site, etc.  Will the fact that both sites are owned by the same company negatively affect keyword rankings? Instead of having 4 listings for some terms, will one of our sites be lowered since one company operates both sites? The site we are buying does not have a high MozRank, MozTrust or Domain Authority score.   But it does have a domain that has been around for a long time.    In addition to the keywords, there is value in using this new site to do marketing tests and experiment a bit.  Thanks for any input. Paul

    | paulgerst
    0

  • Hi Moz community! Wondering if someone can help? One of my clients (online fashion retailer) has been receiving a huge increase in server errors (500s and 503s) over the last 6 weeks, and it has got to the point where people cannot access the site because of server errors. The client has recently changed hosting companies to deal with this, and they have just told us they removed the DNS records once the name servers were changed; they have now fixed this and are waiting for the name servers to propagate again. These errors also correlate with a huge decrease in pages blocked by the robots.txt file, which makes me think someone has perhaps changed this and not told anyone... Anyone have any ideas here? It would be greatly appreciated! 🙂 I've been chasing this up with the dev agency and the hosting company for weeks, to no avail. Massive thanks in advance 🙂

    | labelPR
    0

  • Hi, Would like some advice re our internal linking structure and possible keyword self-cannibalization on our ecommerce site. Will try and give you an overview. Imagine this page structure:

           Site
                Brand 1
                Brand 2
                       Brand 2 Shoes
                              Products
                       Brand 2 Sweaters

    Then say on the Brand 2 Shoes page we have the shoes, e.g., the products labeled as:

           Brand 2 Shoes "Name of Model"
           Brand 2 Shoes "Name of Model"

    Now, what I'm worried about is that if I do a search for "Brand 2 Shoes" it should bring up my landing page, right? But it doesn't; it brings up some of the products instead... I'm worried that we may be self-cannibalizing some of the keywords, and I'm thinking of changing the product pages to be "Brand Name of Model Shoes" or "Name of Model Shoes by Brand". Any ideas or comments appreciated! Thanks all

    | bjs2010
    0

  • I have been doing some searches on Google to see where my new site shows up. I started using the search words "graphic design firm st. louis" as a gauge, because my title is St. Louis Missouri Graphic Design Firm. I showed up on about page 5 to start if I include the word "firm", and a few pages further back if I just search "graphic design st. louis", without the word "firm". It seemed I was slowly moving up pages with both searches, and then a few days ago I jumped to page 1 for the search "graphic design firm st. louis". The thing is, it doesn't show up at all now if I search "graphic design st. louis" without the word "firm". What would cause the one search to jump so high while the other one disappeared completely? And what can I do? My keyword density is the same for both. Any ideas?

    | eric6966
    0

  • We recently sold our established domain -- for a compelling price -- and now have the task of transitioning to our new domain. What steps would you recommend to lessen the anticipated decline from search engines in this scenario?

    | accessintel
    0

  • I operate a real estate web site in New York City (www.nyc-officespace-leader.com). It was hit by Penguin in April 2012, with search volume falling from 6,800 per month in March 2012 to 3,300 by June 2012. After refreshing content and changing the theme, volume recovered to 4,300 per month in October 2013. There was a big improvement in early October 2013, perhaps tied to a Panda update. In November 2013 I hired an SEO company. They are reputable; on Moz's recommended list. After following all their suggestions (searching for and removing duplicate content, disavowing toxic links, improving the site structure to make it easier for Google to index listings, rewriting ten key landing pages, improving the design of the user interface), ranking and traffic started to decline in April 2014 and crashed in June 2014 after an upgraded design with an improved user interface was launched. Search volume went from 4,700 in March to around 3,800 in June. However, ranking on the keywords that generate conversions has really declined, and clicks from those terms are down at least 65%. My online business is severely compromised after I have spent almost double the anticipated budget to improve ranking and conversion. A few questions:

    1. Could a drop in the number of domains linking to our site have led to this decline? About 30 domains that had toxic links to us agreed to remove them. We had another 70 domains disavowed in late April. We only have 78 domains pointing to our domain now, far fewer than before (see attached AHREFs image). It seems there is a correlation in the timeline between the number of domains pointing to us and ranking performance. The number of domains pointing to us has never been this low. Could this be causing the drop? My SEO firm believes that the quality of these links is very low and that the fact that many are gone is in fact a plus.

    2. The number of indexed pages has jumped to 851 from 675 in early June (see attached image from Google Webmaster Tools), right after a site upgrade. The number of pages in the sitemap is around 650. Could the indexation of the extra 175 pages somehow have diluted the quality of the site in Google's eyes? We filed removal requests for these pages in mid-June and again last week with Google, but they still appear. In 2013 we also launched an upgrade and Google indexed an extra 500 pages (canonical tags were not set up correctly), and search volume and ranking collapsed. Oddly enough, when the number of pages indexed by Google fell, ranking improved. I wonder if something similar has occurred.

    3. May 2014 Panda update. Many of our URLs are product URLs of listings. They have fewer than 100 words. Could Google suddenly be penalizing us for that? It is very difficult to write descriptions of hundreds of words for products that change quickly. I would think Google takes this into account.

    If someone could present some insight into this issue I would be very, very grateful. I have spent over $25,000 on SEO reports, wireframe design and coding and now find myself in a worse position than when I started. My SEO provider is now requesting that I purchase even more reports for several thousand dollars, and I can't afford it, nor can I justify it after such poor results. I wish they would take it upon themselves to identify what went wrong. In any case, if anyone has any suggestions I would really appreciate it. I am very suspicious that this drop started in earnest at the time of the link removal and disavow, and accelerated at the time of the launch of the upgrade. Thanks, Alan

    | Kingalan1
    0

  • Hey everyone, How goes it? I've got a bunch of duplicate content issues flagged in my Moz report and I can't figure out why. We're a ticketing site, and the pages that are causing the duplicate content are for events that we no longer offer tickets to, but that we will eventually offer tickets to again. Check these examples out: http://www.charged.fm/mlb-all-star-game-tickets http://www.charged.fm/fiba-world-championship-tickets I realize the content is thin and that these pages are basically the same, but I understood that since the title tags are different, they shouldn't appear to the Goog as duplicate content. Could anyone offer me some insight or solutions to this? Should they be noindexed while the events aren't active? Thanks

    | keL.A.xT.o
    1
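If the inactive event pages are noindexed while dormant, as floated above, the usual mechanism is a robots meta tag in each page's head section (a sketch, not site-specific; "follow" keeps crawlers following the page's links even while it is excluded from the index):

```html
<!-- On each inactive event page while no tickets are on sale -->
<meta name="robots" content="noindex, follow">
```

The tag would need to be removed again when an event goes back on sale, so this works best when the template can toggle it based on whether tickets are currently offered.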

  • LEDSupply.com is my site, and before becoming familiar with schema markup I used the Data Highlighter in Webmaster Tools to mark up as much of the site as I could.  Now that schema markup is set up, I'm wondering if having both active is bad, and am thinking I should delete the previous work with the Data Highlighter tool. To delete or not to delete?  Thank you!

    | saultienut
    0
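For comparison, hand-coded schema markup of the kind mentioned above is typically a JSON-LD block in the page source; the values below are placeholders for illustration, not taken from the actual site. Data Highlighter annotations live only on Google's side, whereas this markup is visible to any crawler:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example LED Driver",
  "sku": "EX-1000",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Keeping a single source of structured data (the on-page markup) avoids the two ever disagreeing about the same page.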
