
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I have facets set up with AJAX, which just adds a parameter like #facet1 to the end of the URL, and I have set up a canonical so that domain.com/category/#facet1 refers to
    domain.com/category/. Would you make the facet links nofollow, or is it better not to add nofollow for better link juice distribution?
    Would you hide the whole facet block from Google, and if so, how? Any thoughts?

    | lcourse
    0
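
A canonical for the fragment URLs, for anyone reading along, would normally sit in the head of the category page. A minimal sketch using the poster's placeholder domain:

```html
<!-- in the <head> of domain.com/category/ -->
<link rel="canonical" href="https://domain.com/category/">
<!-- fragments like #facet1 are never sent to the server, so every
     /category/#facet1 variation resolves to this one canonical URL -->
```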

  • In early January 2013, we had to switch servers after many years with the same one. We were highly ranked and getting about 8,500 unique visitors per month. We didn't notice the traffic falling because we were focused on a major site redesign and addition that we launched in April 2013. Visits continued to fall, this time also because the company that launched it didn't double-check their work and left some dead links, etc. Those were all fixed by approximately June 2013. In early January 2014, we switched servers again because we were afraid the server we had moved to was ranked poorly or was possibly a spamming site before. Currently, nothing has changed. What was about 8,500 unique visitors per month 18 months ago is now about 1,000, and no leads are coming in at all.

    | HasitR
    0

  • I have a simple site that has cities as subdirectories (so the URL is root/cityname). All of my content is localized for the city. My "root" page simply links to the other cities. I very specifically want to rank for "topic" pages for each city, and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options; which one is better?
    1. Put the sitemap in the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root.
    2. Put the sitemap in the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized", but the disadvantage is that it's further away from the root.
    3. Put the sitemap in the footer of "city root" and include all topics across all cities. That way, wherever Google comes into the site, they'll be close to all the topics I want to rank for.
    Thoughts? Thanks!

    | jcgoodrich
    0

  • One of my customers has a mature site, prosun.com, that performs very well in ranking and traffic for major keywords. A few years ago we started welproma.com because they were changing their name and branding. We built welproma.com up as an eventual replacement and ramped it up to 30% of the prosun.com traffic. Penguin hit it a bit in 2012, but very badly on May 24, 2013, and it keeps getting worse. Now they are backing out of the name change, reverting to prosun.com as the main website. Unfortunately, the welproma.com content is far better in quantity and quality, so we would prefer not to waste it. Does anyone think it is a problem to take essentially the exact content from the newer, penalized site and move it to the older, well-performing site? We will use no links whatsoever between the two sites and will take down the new one once we switch.

    | phogan
    0

  • Does Google penalise content that sits behind a read gate? Currently, most of the content on our site sits behind a read gate: people have to register before they can view the detailed content. Our forums, however, are accessible to all, which draws a lot of long-tail traffic. Google does seem to be indexing some of our gated content, but can someone advise me on how Google views this content more generally, please?

    | RG_SEO
    0

  • My question is how to best structure the links on a "Card" while maintaining usability for touchscreens. I've attached a simple wireframe, but the "card" is a format you see a lot now on the web: it's about a "topic" and contains an image for the topic and some text. When you click the card, it links to a page about the "topic". My question is how to best structure the card's HTML so Google can most easily read it. I have two options: a) Make the elements of the card 2 separate links, one for the image and one for the text, which Google would read as follows:
    <a href="/target-url"><img src="topic.jpg" alt="Topic"></a> <!-- image -->
    <a href="/target-url">Topic</a> <!-- text -->
    b) Make the entire "card" a link, which would cause Google to read it as follows:
    <a href="/target-url">Bunch of div elements that includes the anchor text and alt-image attributes above, along with a fair amount of additional text.</a>
    Holding UX aside, which of these options is better purely from a Google crawling perspective? Does doing (b) confuse the bot about what the target page is about? If one is clearly better, is it a dramatic difference? Thanks!

    | jcgoodrich
    0

  • We do not have enough content-rich pages to target all of our keywords. Because of that, my SEO guy wants to set up some cornerstone blog articles in order to rank them for certain keywords on Google. He is asking me to use the following rule in our article writing (we have a blog on our website):
    For example, when we use the keyword "wolf" in our articles, link it to the blog tag page:
    https://www.mywebsite.com/blog/tag/wolf/
    It seems like a good idea because the tag page has lots of material with the keyword "wolf". But the problem is that when I search for the keyword "wolf" on Google, some other blog pages are ranked higher than this tag page. He tells me it is a better strategy in the long run, though. Any ideas on this?

    | AlirezaHamidian
    0

  • I'm currently planning to retire a discontinued product and put a 301 redirect to a related (although not identical) product. The thing is, I'm still getting significant traffic from people searching for the old product by name. Would Google send this traffic to the new pages via the redirect? Is Google likely to display the new page in place of the old page for similar queries, or will it serve other content? I'd like to answer this question so that I can decide between the two following approaches:
    1) Retire the old page immediately and put a 301 redirect to the new related pages. This has the advantage of transferring the value of any link signals / referring traffic. Traffic will also land on the new pages directly without having to click through from another page. We would show a dynamic message telling users that the old product had been retired, depending on whether they had visited our site before.
    2) Keep the old product pages temporarily so that we don't lose the traffic from the search engines. We would then change the old pages to advise users that the old product is now retired, but that we have other products that might solve their problems. When this organic traffic decreases over time, we would proceed with the redirect as above. I am worried, though, that the old product pages might outrank the new product pages.
    I'd really appreciate some advice with this. I've been reading lots of articles, but there seem to be different opinions on this. I understand that I will lose around 10-15% of PageRank, as per the Matt Cutts video.

    | RG_SEO
    0

  • Hi all. I'm working on this page - http://www.alwayshobbies.com/dolls-houses - for the term 'dolls houses'. It's not doing great at the minute (23rd in GUK) and I was wondering if it might be down to the volume of exact match keywords on the page (32). If not, does anyone have any other pointers? Thanks!

    | Blink-SEO
    0

  • I have a real estate site with MLS data (real estate listings shared across the Internet by Realtors, which means the data already exists across the Internet). The important pages are the "MLS result pages": the pages showing thumbnail pictures of all properties for sale in a given region or neighborhood. One MLS result page may be for a region and another for a neighborhood within the region:
    example.com/region-name and example.com/region-name/neighborhood-name
    So all the data on the neighborhood page will be 100% data from the region URL. Question: would it make sense to noindex such a neighborhood page, since it would reduce the number of non-unique pages on my site and also reduce the amount of data that could be seen as duplicate? Would my region page have a good chance of ranking better if I noindex the neighborhood page? Or is Google so advanced that they know Realtors share MLS data and, worst case, simply give such pages very low value, without impacting the ranking of other pages on the website? I am aware I can work on making these MLS result pages more unique, etc., but that isn't what my question is about. Thank you.

    | khi5
    0
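
If the noindex route is taken, the usual mechanism is a meta robots tag on the neighborhood page. A minimal sketch using the poster's example URLs ("noindex, follow" is one common choice, so link equity can still flow through to the listings):

```html
<!-- on example.com/region-name/neighborhood-name -->
<meta name="robots" content="noindex, follow">
```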

  • My local website was hit by Google, and I have taken all the steps to remove the penalty, but it still isn't ranking. Would it be better to make a new website with new content and start working on that?

    | Dan_Brown1
    0

  • I've just found out that a client has multiple domains which are being indexed by Google, leading me to worry that they will be penalised for duplicate content. Could anyone confirm: a) are we likely to be penalised? and b) what should we do about it? (I'm thinking just 301 redirect each domain to the main www.clientdomain.com...?) The actual domain is www.clientdomain.com, but these also exist: www.hostmastr.clientdomain.com, www.pop.clientdomain.com, www.subscribers.clientdomain.com, www.www2.clientdomain.com, www.wwwww.clientdomain.com. PS: I have no idea how or why all these domains exist. I really appreciate any expertise on this issue. Many thanks!

    | bisibee1
    0

  • I have decided to replace my SEO company. The point is, this company has partly been my developer too, so they have set up a demo server of my website. 1) Should I be worried about duplicate material when I end my cooperation with this company (the demo server)? 2) Should I be worried that, if they don't like it, they will go and delete all the submitted material and destroy my pages' rankings? Thanks all.

    | AlirezaHamidian
    0

  • We just merged with another company and are redirecting their domains (competitive/similar content) to our own. We'll have several domains 301-redirecting several hundred thousand URLs to our domain (not all to the same page; very unique mappings). Will adding utm_source et al. parameters to the URLs have a negative impact on how Google transfers value to the pages based on the redirect authority passed? Any points of view? We have a self-referencing canonical, but given that we have 90 million pages on the current domain (and climbing), it seems like the cleanest approach would be to not use redirects. Thanks, Jeff

    | jrjames83
    0

  • Hello, I'm a big fan of clean URLs. However, I'm curious as to what you do to remove extensions in a friendly way that doesn't cause confusion. Standard URLs:
    http://www.example.com/example1.html
    http://www.example.com/example2.html
    http://www.example.com/example3.html
    http://www.example.com/example4.php
    http://www.example.com/example5.php
    What looks better (in my eyes):
    http://www.example.com/example1/
    http://www.example.com/example2/
    http://www.example.com/example3/
    http://www.example.com/example4/
    http://www.example.com/example5/
    Do you: keep extensions throughout your website, avoiding any sort of confusion and page duplication; OR put a canonical link pointing to the extension-less version of each page, in the anticipation of this version being indexed by Google and other search engines; OR 301 each page that has an extension to an extension-less version, and remove all linking to ".html" site-wide (causing errors within software like Dreamweaver, but working properly); OR another way? I'm sorry if this is a little vague, and I appreciate any angles on this. I quite like clean URLs but am unsure of a hassle-free way to achieve them. Thanks for any advice in advance.

    | Whittie
    0
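
For reference, the 301 option is commonly implemented with Apache mod_rewrite. A hedged sketch, assuming an Apache host and that only ".html" files need handling (the ".php" pages would need an analogous pair of rules):

```apache
RewriteEngine On

# Externally 301 /example1.html to /example1/
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html[\s?] [NC]
RewriteRule ^ /%1/ [R=301,L]

# Internally serve /example1/ from example1.html (no redirect seen by the user)
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^([^/]+)/$ $1.html [L]
```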

  • I'm a bit divided about the URL structure for ecommerce sites. I'm using Magento and I have the Canonical URLs plugin installed. My question is about URL structure and length.
    1st way: if I set the product up to have categories in the URL, it will appear like this: mysite.com/category/subcategory/product/. While the product can be in multiple places, the canonical URL can be either short or long. The advantage of having this URL is that it shows all the categories in the breadcrumbs (and a whole lot more links over the site). The disadvantage is the URL length.
    2nd way: set the product up to have no category in the URL, i.e. mysite.com/product/. Advantage: short URL. Disadvantage: it doesn't show the categories in the breadcrumbs if you link direct. Thoughts?

    | s_EOgi_Bear
    1

  • We have a site that has a mirror, i.e. www.domain.com and domain.com. There was no redirect; both URLs worked and showed pages, so basically the site had two sets of URLs for each page. We have changed it so that domain.com and all its pages 301-redirect to the correct URL with www, i.e. domain.com/about 301s to www.domain.com/about. In the search engines, domain.com is the site indexed, and the only www page indexed is the homepage. I checked the robots.txt file and nothing is blocking the search engines from indexing both the www and non-www versions of the site, which makes me wonder: why did only one version get indexed, and how did the client avoid a duplicate content issue? Secondly, is it best to get the search engines to de-index domain.com and resubmit www.domain.com for the full site? We are definitely staying with www.domain.com, NOT domain.com, so we need to find the best way to get the site indexed with www and remove the non-www version. Hope that makes sense; I look forward to everyone's input.

    | JohnW-UK
    0
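
For anyone in the same position, the standard fix for the www/non-www split on Apache is a host-level 301 (the domain is the poster's placeholder):

```apache
RewriteEngine On

# Send every non-www request to the www host, keeping the path
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```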

  • Is there much Google juice to be had by moving a key "money-making" product up the URL structure? For example, with the URL http://www.over50choices.co.uk/Funeral-Planning/Over-50-Life-Insurance.aspx, would we gain any juice by moving "Over-50-Life-Insurance" out of the "Funeral-Planning" category and directly under the domain, e.g. www.over50choices.co.uk/over-50-life-insurance.aspx? The page currently ranks on pages 2 and 3 for various phrases and we are looking to get to page 1; it's a very competitive set of keywords! Thanks, Ash

    | AshShep1
    0

  • We're a software company. Would someone be able to help me with a basic process for retiring old product pages and redirecting the SEO value to new pages? We are retiring some old products to focus on new products. The new software has much of the same functionality as the old software, but has more features. How can we ensure that the new pages get the best start in life? Also, what is the best way of doing this for users? Our current plan is to leave the old pages up initially with a message to the user that the old software has been retired, along with a message explaining that the user might be interested in one of our new products and a link to the new pages. When traffic to these pages reduces, we will delete them and redirect them to the homepage. Has anyone got any recommendations for how we could approach this differently? One idea I'm considering is to immediately redirect the old product pages to the new pages. I was wondering if we could then show a message to the user explaining that the old product has been retired but that the new, improved product is available. I'd also be interested in pointing the redirects to the most relevant new product pages rather than the homepage, so that they get the value of the old links. I've found in the past that old retirement pages for products can outrank the new pages, as until you 301 them, all the links and authority flow to those pages. Any help would be very much appreciated 🙂

    | RG_SEO
    0

  • I would like to migrate my current website, which is ASP.NET, to WordPress. However, the current site is on Windows-based hosting, and WordPress isn't very compatible with that. Do I need to migrate to a Linux-based hosting provider? And if I do, can I still migrate the ASP.NET files from my current website so I can 301 redirect? Any help on this would be great. Regards, Tom

    | CoGri
    0

  • In reviewing my crawl results, I have 5,666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example http://www.careerbags.com/catalogsearch/result/index/?q=Mobile. I have 10,074 of those links indexed according to my Moz crawl. Of those, 5,349 are tagged as duplicate content; another 4,725 are not. Here are some additional sample links:
    http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
    http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
    http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
    All of these links are just different ways of searching through our product catalog. My question is: should we disallow /catalogsearch/ via the robots.txt file? Are these links doing more harm than good?

    | Careerbags
    0
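
If the decision is to block crawling of the internal search results, the robots.txt entry would be a single disallow rule:

```text
User-agent: *
Disallow: /catalogsearch/
```

Worth noting: robots.txt stops crawling but does not remove URLs that are already indexed; a noindex meta tag on the search results template is the usual alternative when de-indexing is the goal.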

  • Hey guys, I would very much appreciate all opinions on our situation. We have a UK-based ecommerce sports nutrition site, www.cardiffsportsnutrition.co.uk. Previously we worked with an SEO who, to put it simply, did not follow webmaster guidelines (money-anchor-heavy, bad links, etc.). We reached some very good ranks too quickly, and subsequently, after the first Penguin, we were hit. We didn't receive any link warning or manual penalty, just what I am assuming was algorithmic. Rankings and traffic dropped significantly, but it wasn't business-ending. Since the first Penguin we have done very little to no SEO: some unique content, rewriting of product pages, and lots of social activity. We didn't really lose much traffic after that, just some small ups and downs after refreshes and a slight, slow decline on some keywords. Come Penguin 2.0, the things we were still ranking for have now dropped even further, impressions in Webmaster Tools are down over 50%, and we have had a weekly but not drastic drop in traffic since then. Over the last couple of months we have obtained some good-quality links, added lots of great unique content that has been shared significantly and generated some great traffic to our blog, and added more unique product and category pages. But organically things are starting to look pretty grim apart from our brand keywords; everything is still in a slow decline, with no increase in impressions in Webmaster Tools either, just small drops. We have been working to remove the poor-quality and toxic links that the previous SEO built, getting anchor text corrected and collating information on the whole process, ready to submit a file of links to the disavow tool, which we are planning to do within the next couple of weeks.
    Now, I have read some successful stories and some not-so-successful ones, so I'm starting to think about how to deal with the worst-case scenario: our domain being too damaged by the previous SEO guys. We have the same domain name on the .com, which would help us carry over our brand name directly, but my concern is that, even though we have not had any manual penalty and have not 301'd the .com back to the .co.uk or linked the two in any other way, will the penalties be carried over to the new domain just on the basis of brand association? We wouldn't plan to redirect any of the .co.uk traffic to the .com, but rather focus on our already strong, albeit less converting, traffic from the likes of Twitter and Facebook, and run a small PPC campaign for some brand keywords to help buffer the traffic loss, while we focus on building good-quality links and putting up plenty of new quality content on the new domain, which does not have any poor-quality links pointing to it. What I'm trying to avoid is carrying on spending time, money, and effort on the .co.uk domain for the next 3-4 months, continuing to lose traffic slowly, and then having to switch the domain anyway. I plan to wait and see for the next 4-6 weeks after we run the disavow, but October would be the time I would have to make a decision and go for it. Any advice or opinions would be appreciated. Marc

    | CSN
    0

  • Does anyone know of any medical or health-related sites that have widely implemented the medical schema types? For example: MedicalCode, MedicalTest, MedicalSignOrSymptom, etc., and the others listed here: http://schema.org/MedicalEntity. I've reviewed the examples on schema.org, but it would be helpful to see some live examples in the wild. Thanks!

    | Allie_Williams
    1
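
Live deployments are indeed hard to find, but as a starting point, a minimal JSON-LD sketch of a MedicalCondition with a linked sign/symptom (all values illustrative) might look like:

```json
{
  "@context": "https://schema.org",
  "@type": "MedicalCondition",
  "name": "Influenza",
  "signOrSymptom": [
    {
      "@type": "MedicalSignOrSymptom",
      "name": "Fever"
    }
  ]
}
```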

  • I have to develop a strategy for link building. The SEO guy I have been speaking with has started putting links on .edu sites, etc. To me, this "stinks" of manipulating the search engines, which I know we will get stung for at some point. I hope this isn't standard practice, but I don't know the best way to improve rankings in terms of links. We sell health products and are starting to put out 3-4 high-quality articles per week. Ideas? Kind regards, Martin

    | s_EOgi_Bear
    1

  • How do I write, and what should I include in, an SEO cover letter when applying for a job on oDesk? Any examples?

    | ross254sidney
    0

  • Does anyone have any experience with, and thoughts about, the WooRank website and SEO tool?

    | casper434
    1

  • Hello all, I am trying to recover a site from a manual penalty. I already submitted once. Here's what we did: we took the link profile from Webmaster Tools, Majestic SEO, Ahrefs, Link Detox, and OSE. We manually looked at every link to exclude the good ones, then used a tool to run the removal campaign, and submitted a disavow file and a reconsideration request. Google came back with a denial. When I looked at the three example links that Google provided, they were definitely spammy (forum profile and comment spam). But none of them were in any of the original CSV downloads from GWT, Ahrefs, Majestic, OSE, or Link Detox. What can I do? Thanks in advance for any help.

    | NicoleDeLeon
    0

  • We redesigned a site and relaunched it on the same domain. All 301 redirects were completed and are working properly. Around the same time, they fired an SEO company that had been publishing inbound links to their site on spammy directories (and this was during the same time period that Google's Hummingbird algorithm change took place). After the website relaunch, their keyword rankings fell off dramatically, and in all of our research we're not seeing what has caused this issue. I'm not seeing any red flags in their Moz reports or even in their Google Analytics traffic, but organic keywords are way down, and now leads from organic traffic are also way down. Help??

    | grapevinemktg
    0

  • TL;DR: Is creating a page where 80% of the content is duplicated from the past year's product model and 20% is about the new model's changes going to cause duplicate content issues? Is there a better way to handle minor yearly model changes without duplicated content?
    Full question: We create landing pages for yearly products. Some years the models change drastically; other years there are only a few minor changes. The years where the product features change significantly are not an issue; it's when there isn't much of a change to the product description and I still want to rank for the new year's searches. Since I don't want duplicate content from just adding last year's model content to a new page and changing the year (2013 to 2014), I thought perhaps we could write a small paragraph describing the changes and then include last year's description of the product. Since 80% of the content on the page would be duplicated from last year's model, how detrimental do you think this would be as a duplicate content issue? The reason I'm leaving the old model up is to maintain the authority that page has and to still rank for the old model, which is still sold. Does anyone have any better idea, other than rewriting the same information over again in a different way with the few minor changes to the product added in?

    | DCochrane
    0

  • Wistia (video hosting) has an embed feature which can be set up to include a backlink. In other words, a user can embed a video on their site, and doing so automatically creates a backlink to the original page where it is posted. Is there a product that does something similar for pictures, where I could give users an option to easily take pictures from my website, but it would include a backlink to my site when they use such a picture?

    | khi5
    0
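
An image "embed" widget of the kind described ultimately just hands visitors a copy-paste snippet in which the anchor itself is the backlink. A sketch with placeholder URLs:

```html
<a href="https://www.yoursite.com/photo-page">
  <img src="https://cdn.yoursite.com/images/photo.jpg" alt="Photo description">
</a>
<!-- the wrapping anchor links the embedded picture back to the source page -->
```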

  • Hi folks, our company has been doing SEO for www.3dincites.com since October. Since the website hadn't been optimized at all, one of the first things we did was add a keyword-optimized page title and meta description to the homepage. The person who implemented it didn't do it right the first time, so we needed to change the page title again after a week or so. We also changed the preferred domain in GWT. Ever since those changes were implemented, the homepage has been decreasing in rankings for the keywords we optimized it for. This is very surprising, as the website has high-quality unique content and pretty solid backlinks. Moreover, the search traffic to the other pages is through the roof (a 2K increase since October), so the client is happy; however, this decrease in rankings for the homepage (from the 3rd to the 6th page for "3d IC technology", which is the main keyword) doesn't let me sleep well at night. Any ideas why this is happening?

    | zoomzoom
    0

  • Hi everyone,
    one of my sites has about 1000 'nofollow' links from the footer of another of my sites. Are these in any way hurtful? Any help appreciated..

    | romanbond
    0

  • We have an affiliate program for an educational course product, and I am becoming worried that links to us on our affiliates' websites are hurting our site rankings. I have read that Google is usually pretty good about picking up on affiliate links and not giving the followed links credit, but I'm not sure if that is just for the big affiliate networks or if they can spot less obvious affiliate programs. With this in mind, would you ask all affiliates to use the nofollow attribute on all incoming links, or would you make sure that the links are more branded in nature? There is a mix of text links along with banners and other display components. Editing would need to be done to the core files of our affiliate/member software (aMember Pro) to make all links nofollow, so we want to see if there are other recommendations before doing so. We are trying to fight our way out of what we believe is an over-optimized anchor text penalty and are evaluating all areas in which we can make improvements. Any advice is greatly appreciated!

    | youngb55
    0
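
For reference, asking affiliates to nofollow simply means their anchor tags carry the rel attribute; a sketch with a placeholder URL and tracking parameter:

```html
<a href="https://www.example.com/course?aff=12345" rel="nofollow">Course name</a>
```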

  • UPDATE, 19.02.2014: Hi, we got another negative answer from Google, pointing again to our affiliate links, so the 301 redirect and block was not enough.
    I understand the need to contact all of them and ask for the nofollow; we've started the process, but it will take time, a lot of time. So I'd like to bring to your attention another two scenarios I have in mind:
    1. Disavow all the affiliate links. Is it possible to add a large number of domains (>1,000) to the disavow document? Has anyone tried this?
    2. Serve a 404 status for URLs coming from affiliates that did not add the nofollow attribute. This way we kind of tell Google that the content is no longer available, but we will end up with a few thousand 404 error pages. The only way to fix all those errors is by 301-redirecting them afterwards (but this way the link juice might "restart" flowing and the problem might persist).
    Any input is welcomed. Thanks.
    Original question: Hi Mozzers, after a reconsideration request regarding our link profile, we got a "warning" answer about some of our affiliate sites (links coming from our affiliate sites that violate Google's quality guidelines). What we did (which was the best way we saw of fixing the "SEO mistake" without turning off the affiliate channel) was to 301 redirect all those links to an /AFFN/ folder and block this folder from indexing.
    We're still waiting for an answer on our last reconsideration request. I want to know your opinion about this: is this a good way to deal with these types of links if they're reported? Changing the affiliate engine and all the links on the affiliate sites would be a big time and technical effort, which is why I want to make sure it's truly needed. Best,
    Silviu

    | Silviu
    0
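
For anyone weighing option 1: the disavow file is plain text, one entry per line, and it does accept domain-level entries. A sketch with placeholder domains:

```text
# affiliate domains that did not add nofollow
domain:affiliate-one.example
domain:affiliate-two.example
# individual URLs can also be listed
http://spammy.example/some-page.html
```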

  • Hello, I desperately need a hand here! Firstly, I just want to say that, as far as we know, we never infringed Google's guidelines. I have been around this field for about 6 years, have had success with many websites along the way relying only on natural SEO, and was never penalised until now. The problem is that our website www.turbosconto.it is penalised (not manually) and we have no idea why. The site has been online for more than 6 months and it NEVER started to rank; it gets about 2 organic visits a day at most. In this time we got several links from good websites related to our topic, which actually keep sending us about 50 visits a day. Nevertheless, our organic visits are still 1 or 2 a day. All the pages seem to be heavily penalised: when you perform a search for any of our "shops", even including our URL, no entries for the domain appear. A search example: "http://www.turbosconto.it zalando". What I would expect to find as a result: http://www.turbosconto.it/buono-sconto-zalando. The same thing happens for all of the pages for the "shops" we promote; searching any of the brands + our domain shows no results except for "nike" and "euroclinix" (I see no relationship between these two). A few days ago, for these same types of searches, Google was showing pages from the domain which we blocked via robots.txt months ago, and which go to a 404 error, instead of our optimised landing pages, which cannot be found in the first 50 results. Those pages are generated by our rating system. We already sent requests to de-index all these pages, but they keep appearing for every new page that we create, and the real pages are nowhere to be found. Here is an example: http://www.turbosconto.it/shops/codice-promozionale-pimkie/rat
    You can see how Google indexes that for us in this search: site:www.turbosconto.it rate
    Why on earth would Google show a page which is blocked by robots.txt, displaying that the content cannot be retrieved because it is blocked, instead of showing pages which are totally SEO-friendly and content-rich? All the script from TurboSconto is the same one that we use in our Spanish version, www.turbocupones.com. With that one we have awesome results, which makes things even weirder. OK, apart from those weird issues with the indexation and the robots, we did some research on our backlinks and were surprised to find a few bad links that we never asked for. Nevertheless, there are just a few, and we have many HIGH QUALITY LINKS, which makes it hard to believe that this could be the reason. Just to be sure, we used the disavow tool for these links. Here are the bad links we submitted 2 days ago:
    domain: www.drilldown.it #we did not ask for this
    domain: www.indicizza.net #we did not ask for this
    domain: urlbook.in #we did not ask for this, moreover it is a spammy one
    http://inpe.br.way2seo.org/domain-list-878 #we did not ask for this, moreover it is a spammy one
    http://shady.nu.gomarathi.com/domain-list-789 #we did not ask for this, moreover it is a spammy one
    http://www.clicdopoclic.it/2013/12/i-migliori-siti-italiani-di-coupon-e.html #we did not ask for this, moreover it is a copy of a post from another blog
    http://typo.domain.bi/turbosconto.it
    I have no clue what it can be; we have no warning messages in Webmaster Tools or anything.
    To me it looks as if Google has a BUG and went crazy judging our Italian website. Or perhaps we are just missing something? If anyone could throw some light on this, I would be really glad and willing to pay some compensation for the help provided. THANKS A LOT!

    | sebastiankoch
    0
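
A note on the robots.txt behaviour described above: a Disallow rule only blocks crawling, not indexing, so a blocked URL can still appear in search results (with a "content cannot be retrieved" style snippet) if other pages link to it. A minimal sketch, using a hypothetical path pattern for the rating-system pages:

```
# robots.txt -- blocks crawling only; linked URLs may still be indexed
User-agent: *
Disallow: /shops/*/rate

# To actually drop such pages from the index, remove the Disallow,
# let Googlebot crawl them, and serve on each page:
#   <meta name="robots" content="noindex, follow">
```

Once the pages have been recrawled and dropped, the Disallow can be reinstated if desired.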

  • Hi Mozzers, We are having an issue with our XML sitemap images not being indexed. The site has over 39,000 pages and 17,500 images submitted in GWT.  If you take a look at the attached screenshot, 'GWT Images - Not Indexed', you can see that the majority of the pages are being indexed, but none of the images are. The first thing you should know about the images is that they are hosted on a content delivery network (CDN), rather than on the site itself. However, Google's advice suggests hosting on a CDN is fine - see the second screenshot, 'Google CDN Advice'.  That advice says to either (i) ensure the hosting site is verified in GWT or (ii) submit in robots.txt.  As we can't verify the hosting site in GWT, we had opted to submit via robots.txt. There are 3 sitemap indexes: 1) http://www.greenplantswap.co.uk/sitemap_index.xml, 2) http://www.greenplantswap.co.uk/sitemap/plant_genera/listings.xml and 3) http://www.greenplantswap.co.uk/sitemap/plant_genera/plants.xml. Each sitemap index is split up into often hundreds or thousands of smaller XML sitemaps. This is necessary due to the size of the site and how we have decided to pull URLs in.  Essentially, if we did it another way, it may have involved some of the sitemaps being massive and thus taking upwards of a minute to load. To give you an idea of what is being submitted to Google in one of the sitemaps, please see the source of http://www.greenplantswap.co.uk/sitemap/plant_genera/4/listings.xml?page=1. Originally, the images were SSL, so we decided to revert to non-SSL URLs as that was an easy change.  But over a week later, that seems to have had no impact.  The image URLs are ugly... but should this prevent them from being indexed? The strange thing is that a very small number of images have been indexed - see http://goo.gl/P8GMn. I don't know if this is an anomaly or whether it suggests there is no issue with how the images have been set up - thus, there may be another issue.
    Sorry for the long message, but I would be extremely grateful for any insight into this.  I have tried to offer as much information as I can; however, please do let me know if this is not enough. Thank you for taking the time to read and help. Regards, Mark

    | edlondon
    0
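
For reference, CDN-hosted images are listed in a standard image sitemap entry just like same-host images; the verification/robots.txt point above only concerns whether Google will trust the cross-domain image URLs. A minimal sketch of one entry (URLs hypothetical; the image namespace is Google's sitemap-image extension):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.co.uk/plants/acer</loc>
    <image:image>
      <!-- Image served from a separate CDN domain -->
      <image:loc>http://cdn.example.com/images/acer.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```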

  • Does anyone know when .guru domains will become active, or if they already are?

    | maestrosonrisas
    0

  • For their brand keyword search - miniclip - the Google SERP includes a search box reading "Search miniclip.com". Does anyone have an idea how this can be done?

    | vivekg
    0
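
For reference, that box is Google's sitelinks search box. Google decides whether to display it for a brand query, but you can suggest one by marking up the homepage with schema.org WebSite/SearchAction markup pointing at your site's own search. A sketch (domain and query parameter are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "http://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "http://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```

The markup is a hint, not a guarantee; the box generally only appears for navigational (brand) queries on prominent sites.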

  • I have "no index, follow" on some pages, which I set 2 weeks ago. Today I see one of these pages showing in Google Search Results. I am using rel=next prev on pages, yet Page 2 of a string of pages showed up in results before Page 1. What could be the issue?

    | khi5
    0
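
For reference, a noindex only takes effect after Google recrawls the page, which can easily take longer than two weeks on deeper pages. A sketch of what the head of page 2 in a paginated series would carry (URLs hypothetical):

```html
<!-- Page 2 of a paginated series -->
<meta name="robots" content="noindex, follow">
<link rel="prev" href="http://www.example.com/listings?page=1">
<link rel="next" href="http://www.example.com/listings?page=3">
```

Note that rel=prev/next is a pagination hint, not an indexing directive, so it does not by itself prevent page 2 from appearing in results ahead of page 1.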

  • Hi, I think it's called local search position. What I'm referring to is when you do a search on a keyword and Google lists not only the best matches, but usually also a group of 3 businesses with telephone numbers and Google reviews; at the bottom of the group it will say something like "See results for <your keyword> on a map". In any case, my question is this: if I click on the link to see more results on a map, I'm listed as number 3. However, on the search page before, where the link I just clicked is displayed, I'm not listed; instead, one business name is listed three times. Each of those listings uses the same address but a different telephone number. In addition, the business that is listed three times also appears in the results above, in this case at position #1 for the keyword I searched; I assume this has something to do with it also being listed three times in the group of local businesses below. The business I'm interested in getting listed in this group of results is currently on page 2, position 5, for the keyword. Any suggestions would be greatly appreciated. Thanks in advance.

    | robdob1
    1

  • Let's say I have a page targeting a keyword, "New York Restaurants". There are also several "very close" variations of this keyword which I could also target. Here are the volume estimates: New York Restaurants - 100
    Restaurants New York - 40
    Best Restaurants New York - 30
    Best Restaurants in New York - 20
    etc. Given this, which of the following is the better overall approach? A) Have one page and work all of these keywords so the page targets all of them - for example, here, try to weave in "Best" in different ways. B) Have multiple pages and use 301 redirects: create one page targeted only at "New York Restaurants", then create additional pages with the other terms in the URL and headline which 301 redirect to the "New York Restaurants" page. This is similar to how Wikipedia does redirects - for example, "Bourne 2" 301 redirects to "Bourne Supremacy". Thanks!

    | Keyword | Volume | Competition | CPC |
    | New York Restaurants | 12,100 | Medium | $0.93 |
    | Restaurants New York | 2,900 | Medium | $1.00 |
    | Best Restaurants in New York | 3,600 | Low | $0.69 |
    | Best New York Restaurants | 2,400 | Low | $0.80 |
    | New York's Best Restaurants | 260 | Low | $0.76 |

    | jcgoodrich
    0
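
If option B were chosen, the redirects themselves are simple server config. A sketch for Apache (paths hypothetical):

```apache
# .htaccess -- 301 the variant URLs to the main page
Redirect 301 /best-restaurants-in-new-york /new-york-restaurants
Redirect 301 /restaurants-new-york /new-york-restaurants
```

Bear in mind that a 301 from a brand-new URL with no inbound links or history passes nothing; the Wikipedia comparison only holds where the variant URLs have actually accumulated links.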

  • I am looking at one of our category pages; it has 25 additional pages for a total of 26. The URL for the first page looks good, but the next one ends with ?SearchText=768&SearchType=Category, and all additional pages have the same URL. My first concern was duplicate content, but after looking, no pages after the 1st are even indexed. What is the best way to handle this?

    | EcommerceSite
    0
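
One common approach for parameterised duplicates like this is a canonical tag pointing at the clean category URL, so any variants Google does crawl consolidate there (domain and path hypothetical; the parameters are from the question):

```html
<!-- On /category?SearchText=768&SearchType=Category -->
<link rel="canonical" href="http://www.example.com/category">
```

This suits cases where the parameterised URLs show essentially the same content; for true pagination with distinct products per page, rel=prev/next plus self-referencing canonicals is usually preferred. The URL Parameters tool in GWT can also tell Google how SearchText/SearchType change page content.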

  • Hello Mozzers. My client asked a very good question today. I didn't know the answer, hence this question: when you submit a 'Removing content for legal reasons' report (https://support.google.com/legal/contact/lr_legalother?product=websearch), will the person(s) owning the website containing the inflammatory content receive any communication from Google? My clients have already had the offending URL removed by a court order which was sent to the offending company. However, the site has now been relocated and the same content is glaring out at them (and their potential clients), with the title "Solicitors from Hell + Brand name" immediately under their SERPs entry. I'm going to follow the advice of the forum and try to get the URL removed via Google's report system, as well as the rearguard action of increasing my clients' SERPs entries via social + content. However, I need to be able to firmly tell my clients the implications of submitting a report. They are worried that if they rock the boat, this URL (with open access for reporting of complaints) will simply get more inflammatory! By rocking the boat, I mean Google informing the owners of this "Solicitors from Hell" site that they have been reported for hosting defamatory content. I'm hoping that Google wouldn't inform such a site, and that the only indicator would be an absence of visits. Is this the case, or am I being too optimistic?

    | catherine-279388
    0

  • Hi All, Is it possible to have visibility in Google local places as well as on the first page of Google for the same set of keywords?

    | RuchiPardal
    0

  • Hello Community, What is your experience with site redesigns when it comes to preserving traffic? If a large enterprise website has to go through a site-wide enhancement (resulting in a change of all URLs and some of the content), what do you expect to happen to organic rankings and traffic? I assume we will experience a period where Google needs to "re-orient" itself with the new site; if so, do you have similar experience, and tips on how to minimize the traffic loss? Thanks

    | b.digi
    0

  • Hey everyone, I'm beginning to think our site is toxic, i.e. it'll never rank properly again irrespective of what we do. I recently published some data (2 months ago) in an interactive visual called the "iPhone 5S Price Index". I outreached and got thousands of links from sites including Forbes, Gizmodo (various international versions), Washington Post, The Guardian, NY Times, etc. All of these results dominate the Google rankings, all with links pointing to us. YET, we're nowhere to be seen. What incentive are Google giving content creators, like me, to continue producing content that is obviously popular if we can't even rank for it? The traffic we received was fantastic. In one day the traffic was 40 times our average, which made me smile like a Cheshire Cat from ear to ear, but we need to improve our rankings overall, otherwise the value to us is lost. The traffic wasn't there to buy our service; they were there to see the graphic. Hopefully our brand exposure leads to future sales, but it's a pittance compared to our previous rankings income. I've had this type of success 3 times in the last few months on this site alone. Yet nothing changes. We suffered a loss of rankings in September 2012 and have been fighting ever since to get them back. Now I'm losing hope it is even possible. Does anyone know why our site wouldn't rank when we're undeniably the source that created the work? Also, why wouldn't the increase in domain authority (which has jumped about 10 points according to OSE) have a knock-on effect for the rest of our keywords, or even let us appear within the top 100 for ones we obviously serve? We do Real Company Shit, and we're good at it. But I need these rankings back. It's driving me nuts. Thanks.

    | purpleindigo
    0

  • I was having speed issues when I ran a test under Google's page speed test and, as a result, switched to using Google Page Speed Service. This meant I had to switch my site from the non-www to the www. Since the switch my page is running faster, but my ranking has dropped. What I'm trying to find out is whether the drop is due to all of my previous links pointing to the non-www version, or whether it's because the site is being treated as new and this is more of a temporary issue. If it is a link issue, I will contact everyone I can to see who will update the site address. Thanks everyone!

    | toddmatthewca
    0
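
For reference, the standard way to consolidate old non-www links after such a switch is a site-wide 301 from the non-www host to the www host. A sketch for Apache (domain hypothetical):

```apache
# .htaccess -- 301 all non-www requests to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With the redirect in place, existing links keep working and their value consolidates on the www version, so contacting every linking site shouldn't be necessary.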

  • I just moved my site from a WordPress-hosted site to Squarespace. We have the same domain; however, the content is now located at different URLs (again, same base domain). I'm unable to easily set up 301 redirects for the old content to be mapped to the new content, so I was wondering if anyone had any recommendations for a workaround. Basically, I want to make sure Google knows that Product A's page is now located at a new URL (www.domain.com/11245 > www.domain.com/product-a). Maybe it's something that I don't have to worry about anymore because the old content is gone? I mean, I have a global redirect set up so that no matter what you enter after the base domain, it goes to the homepage, but I just want to make sure I'm not missing something here. Really appreciate your help!

    | TheBatesMillStore
    1
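
Squarespace does support simple per-URL 301s through its URL Mappings panel (under Settings → Advanced → URL Mappings), which may be easier than the global redirect described above. A sketch using the hypothetical paths from the question, one rule per line in the form /old-path -> /new-path 301:

```
/11245 -> /product-a 301
/11246 -> /product-b 301
```

This is worth doing for any old URLs that had inbound links or rankings; a blanket redirect to the homepage passes little of that value.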

  • Hi, We have a potential client who has multiple stores targeting different countries using Shopify. He has set the domains for specific countries using geotargeting in Webmaster Tools. Has anybody had any experience of how foolproof this is? (All the sites have the exact same products/content on them.) Thanks

    | OnlineAssetPartners
    0
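
Alongside geotargeting in Webmaster Tools, hreflang annotations help Google pick the right country store when the content is otherwise identical. A sketch for three hypothetical store domains:

```html
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/products/widget" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/products/widget" />
<link rel="alternate" hreflang="en-nz" href="http://www.example.co.nz/products/widget" />
```

Each store's page should carry the full set, self-reference included. Geotargeting alone is a signal rather than a guarantee, so duplicate stores can still cross-rank without annotations like these.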
