
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Two web pages on my website previously ranked well in Google, with consistent top-3 places for 6+ months, but when the site was modified these two pages, which previously ended in .php, had their page names changed to the target keyword to improve them further (or so I thought). Since then the pages don't rank at all for that search term in Google. I used Google Webmaster Tools to remove the previous pages from the cache and search results, resubmitted a sitemap, and where possible fixed links to the new pages from other sites. On previous advice I purchased links, web directories, social and articles etc. to the new pages, but so far nothing... It's been almost 5 months and it's very frustrating, as these two pages previously ranked well and as landing pages ended in conversions. The problem only appears in Google; the pages still rank well in Bing and Yahoo. Google has the pages indexed if I search by the URL, but they never show under any search term they should, despite being heavily optimised for certain terms. I've spoken to my developers and they are stumped too; they've now added this code to the affected page(s) to see if it helps (tidied up below): Header("HTTP/1.1 301 Moved Permanently");
    $newurl=SITE_URL.$seo;
    Header("Location:$newurl"); Can Google still index a web page but refuse to show it in search results? All other pages on my site rank well; have just these two, which were once called something different, caused issues? Any advice? Any ideas? Have I missed something? I'm at a loss...

    | seanclc
    0
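
    A tidied version of that snippet, as a sketch only (SITE_URL and $seo are assumed to be defined elsewhere, as they appear to be in the question). The status code can be passed straight to header(), and the script should stop once the redirect has been sent:

        <?php
        // Sketch of the redirect described above; SITE_URL and $seo are
        // assumed to be defined elsewhere, exactly as in the original snippet.
        $newurl = SITE_URL . $seo;
        header('Location: ' . $newurl, true, 301);  // send the 301 and the target
        exit;                                       // stop rendering the old page

    Note that header() must be called before any HTML output; a redirect sent after output has started will fail.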

  • Hi all. Could do with a second opinion on this please... At present a client of ours owns two shops (both selling flooring, in towns about 20 miles apart, but trading under different names) and has a website for each. The plan is to rebrand both stores the same and merge both websites into one. The problem is that both of the individual websites rank very well in their respective Google Local search results, and I fear that killing one of the sites will mean that one store vanishes from the local listings. One domain is a DA 45 and the other a DA 11, so the plan is to use the stronger of the two domains. The question I would like to ponder with people wiser than myself is: how can we ensure that the new single domain ranks for both locations in local search? Would the easiest solution be to have pages such as domain.com/store1 and domain.com/store2 with full listings for each store, including name, address, phone number, customer reviews etc.? At present the DA 45 domain ranks very well in its Google Local results, so we need to find a way to change that homepage to show both stores' phone numbers without affecting the local listing. I was considering adding the second phone number as a text-based image so that it's visible to people but not to bots. Finally, would 301 redirecting the now unused store's domain to domain.com/store2 help ensure we do not lose the local listing for that keyword? If not, are there any suggestions people could offer up? Many thanks for any help, and sorry for the very long question. Carl

    | GrumpyCarl
    0

  • I've attempted to follow advice from the Q&A section. Currently on the site www.cherrycreekspine.com I've edited the .htaccess file to help with 301s - all pages redirect to www.cherrycreekspine.com. Secondly, I've added the canonical statement in the header of the web pages. I have cut the duplicate page content in half... now I have a remaining 40 pages to fix up. This is my practice site to try and understand what SEOmoz can do for me. I've looked at some of your videos on YouTube... I feel like I'm scrambling around the Q&A and the internet to understand this product. I'm reading the Beginner's Guide... any other resources would be helpful.

    | deskstudio
    0

  • I have a site with search and result pages, and Google Webmaster Tools was giving me duplicate content errors for page 1 / 2 / 3 etc., so I have added a canonical tag on these pages pointing to http://www.business2sell.com/businesses/california/ Is this the correct way of using canonical? (A pagination sketch follows below.)

    | manish_khanna
    0
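
    Canonicalising page 2, 3, etc. back to page 1 effectively asks Google to drop the deeper pages, which is usually not recommended for pagination. A common alternative is a self-referencing canonical plus rel="prev"/"next" links, sketched below in PHP on the assumption of a ?page= parameter (adjust to however the listing URLs are actually built):

        <?php
        // Illustrative only: $base and the ?page= parameter are assumptions
        // about how the paginated listing URLs are built.
        $base = 'http://www.business2sell.com/businesses/california/';
        $page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;

        // Each page canonicalises to itself, not to page 1.
        $canonical = ($page > 1) ? $base . '?page=' . $page : $base;
        echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">' . "\n";

        // Mark the pages as one paginated series.
        if ($page > 1) {
            $prev = ($page === 2) ? $base : $base . '?page=' . ($page - 1);
            echo '<link rel="prev" href="' . htmlspecialchars($prev) . '">' . "\n";
        }
        $isLastPage = false; // replace with a real check against the result count
        if (!$isLastPage) {
            echo '<link rel="next" href="' . htmlspecialchars($base . '?page=' . ($page + 1)) . '">' . "\n";
        }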

  • I have a client's site that is a vanity URL, i.e. www.example.com, set up as a meta refresh to the client's flagship site, www22.example.com. However, we have been seeing Google include the vanity URL in the index, in some cases ahead of the flagship site. What we'd like to do is de-index that vanity URL. We added a noindex meta tag to the vanity URL, but noticed that within 24 hours (actually less) the flagship site went away as well. When we removed the noindex, both the vanity and flagship sites came back. We noticed in Google Webmaster Tools that the flagship site's robots.txt file was corrupt and in need of fixing, and we are in the process of fixing that. Question: is there a way to noindex the vanity URL and NOT the flagship site? Was it due to the meta refresh redirect that the noindex knocked out the flagship as well? Or was it my doing a Google fetch and then submitting the flagship home page that made the site reappear? The robots.txt is still not corrected, so we don't believe that's tied in here. To add to the complexity, the client is UNABLE to employ a 301 redirect, which was what I recommended initially. Anyone have any thoughts at all - MUCH appreciated! (A conditional noindex sketch follows below.)

    | ACNINTERACTIVE
    0
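
    If both hostnames are served by the same code base, one way to keep the noindex on the vanity host only is to make the tag conditional on the requested host. A rough sketch (the hostnames are the placeholders from the question):

        <?php
        // Emit the robots meta tag only when the request arrives on the vanity
        // host, so the flagship (www22.example.com) is never tagged.
        $host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
        if ($host === 'www.example.com') {
            echo '<meta name="robots" content="noindex">' . "\n";
        }

    As for why the flagship disappeared: one plausible explanation is that Google treats an immediate meta refresh much like a redirect and folds the two URLs together, so a noindex on the vanity page may be applied to the flagship as well; the corrupt robots.txt is still worth ruling out.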

  • Hi, I have a site that is indexed (and ranking very well) in Google, but when I do a "site:www.domain.com" search in Bing and Yahoo it is not showing up. The team that purchased the domain a while back has no idea if it was indexed by Bing or Yahoo at the time of purchase. Just wondering if there is anything that might be preventing it from being indexed? Also, I'm going to submit an index request - are there any other things I can do to get it picked up?

    | dbfrench
    0

  • The old site is small, only 100 pages or so, and about 10 of them are particularly useful. I would like to 301 those 10 pages to 10 similar pages on the new site, and also 301 the other 90 pages to the new site... the new site's home page, I suppose. Does it make sense to do this, and if so how? I think if I simply 301 the whole of the old domain to the new one, the juice will be shared among the new site's pages equally, which is not what I want. I know where the .htaccess file is and I can 301 a page within a domain, but I'm at a loss with this. Thanks for any help. EDIT: I'm hoping for something like this (sketched below): old.com/page_1  >>  new.com/page_A old.com/page_2  >>  new.com/page_B ... and 8 more of those. And then the other 90 pages: old.com/remaining pages  >>  new.com/index

    | Brocberry
    0
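
    In .htaccess this is normally a specific Redirect 301 line for each of the ten mapped pages, listed before a catch-all rule that sends everything else to the new homepage. If it is easier to reason about in code, here is the same logic sketched in PHP, assuming every request on the old domain can be routed through one script (the page names are the placeholders from the question):

        <?php
        // Sketch only: assumes the old domain routes every request through
        // this script. Page names are the placeholders from the question.
        $map = array(
            '/page_1' => 'http://new.com/page_A',
            '/page_2' => 'http://new.com/page_B',
            // ... the other eight page-to-page mappings go here
        );

        $path   = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        $target = isset($map[$path]) ? $map[$path] : 'http://new.com/'; // the remaining ~90 pages

        header('Location: ' . $target, true, 301);
        exit;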

  • I just used the PageRank formula for Google and noticed that when I cross-link between my categories, which are on the same level, the juice/PR of my homepage diminishes... Is there a way to boost the sub-pages and the homepage at the same time, or has Google put together some algorithm that does this that I do not know about? (A toy calculation follows below.)

    | seoanalytics
    0
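
    For intuition, the published PageRank formula is PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where d is the damping factor and C(T) is the number of outbound links on page T. The toy script below implements that original formula (not whatever Google runs today; the page names are made up) so you can compare the homepage's score with and without the category cross-links:

        <?php
        // Toy PageRank calculation over a three-page site. Edit $links to add
        // or remove the category cross-links and compare the resulting scores.
        $links = array(
            'home'  => array('cat_a', 'cat_b'),
            'cat_a' => array('home', 'cat_b'),   // cross-link: cat_a -> cat_b
            'cat_b' => array('home', 'cat_a'),   // cross-link: cat_b -> cat_a
        );
        $d  = 0.85;                              // damping factor
        $pr = array_fill_keys(array_keys($links), 1.0);

        for ($i = 0; $i < 50; $i++) {            // iterate until the values settle
            $next = array();
            foreach ($links as $page => $ignored) {
                $next[$page] = 1 - $d;
                foreach ($links as $source => $outbound) {
                    if (in_array($page, $outbound)) {
                        $next[$page] += $d * $pr[$source] / count($outbound);
                    }
                }
            }
            $pr = $next;
        }
        print_r($pr);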

  • Hi guys / girls, We have a few clients in some very competitive areas that are struggling to gain the top spots. We have been building some good quality links: relevant directories, quality citations, guest posts on good quality websites, broad anchor text profiles, really strong social signals, competitor backlink lookups etc. The issue is that nearly all the websites that are outranking us have really bad link profiles - lots of spammy links, abused anchor text etc. So what can you do in this situation? The obvious path is to think, right, well let's match them for cr*p links and get some results. However, I am more than aware that Google's algorithm will eventually pick up on this... well, at least you hope so. It's just very frustrating when you're getting your ass kicked by poor link building techniques and you're doing good work. I am sure other people have come across this, and I was just wondering if there are any bits of advice on how to move past it? Or is it simply a case of keep doing good work and eventually we will get rewarded? Thanks!

    | Jon_bangonline
    1

  • Question: if you had a site with more than 10 million pages (that you wanted indexed) and you considered each page to be equal in value, how would you submit sitemaps to Google? Would you submit them all at once - 200 sitemaps of 50k URLs each in a sitemap index? Or would you submit them slowly? For example, would it be a good idea to submit 300,000 at a time (in 6 sitemaps of 50k each), leave those 6 sitemaps available for Google to crawl for 7 days, then delete them and add 6 more with 300,000 new links, and repeat until Google has crawled all the links? With that process you would never have more than 300,000 links available to Google in sitemaps at any one time. I read somewhere that eBay does something like this, but it could be bogus info. (A sitemap-index sketch follows below.) Thanks David

    | zAutos
    0
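
    Whichever schedule you choose, the mechanics are the same: child sitemaps of up to 50,000 URLs each, referenced from a sitemap index submitted in Webmaster Tools. A rough sketch of generating the index in PHP (the domain and file-naming scheme are placeholders):

        <?php
        // Sketch of a sitemap index covering 200 child sitemaps of up to
        // 50,000 URLs each. Domain and file names are placeholders.
        header('Content-Type: application/xml; charset=utf-8');

        echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        echo '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        for ($i = 1; $i <= 200; $i++) {
            echo "  <sitemap>\n";
            echo "    <loc>http://www.example.com/sitemaps/sitemap-{$i}.xml.gz</loc>\n";
            echo "    <lastmod>" . date('c') . "</lastmod>\n";
            echo "  </sitemap>\n";
        }
        echo "</sitemapindex>\n";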

  • My site hovers on page 2 of Google for the key search phrase. I have a blog that gets updated often, I have genuine links coming in, I promote on all social media channels, I do email campaigns, I have a sitemap, and I have my local address on what I believe are the most important sites like Yelp, Manta, Google Local etc. All this and my site is still on page 2, while other sites with less of everything are on page 1. What is missing?

    | bronxpad
    0

  • I realized that we are missing a suite number. It is not on the website or in the recently updated Google/Bing/Yahoo revisions I did. Should I go and fix it? Or should I go and adjust the old listings? Does a suite number matter in the NAP?

    | greenhornet77
    0

  • My humble opinion is that Google's disavow tool is an utter waste of your time! My site, http://goo.gl/pdsHs, was penalized over a year ago after the SEO we hired used black hat techniques to increase rankings. Ironically, while we had visibility, Google itself had become a customer. (I guess the site was high quality, trustworthy and user friendly enough for Google employees to purchase from.) Soon enough the message about detecting unnatural links showed up in Webmaster Tools and, as expected, our rankings sank out of view. For a year we contacted webmasters, asking them to remove links pointing back to us (90% didn't respond, the other 10% complied). Work on our site continued, adding high quality, highly relevant unique content.
    Rankings never recovered, and neither did our traffic or business... Earlier this month we learned about Google's link disavow tool and were excited! We hoped that by following the cleanup instructions and using the disavow tool we would get a chance at recovery.
    We watched Matt Cutts' video, read the various forums/blogs/topics written about it, and then felt comfortable enough to use it. We went through our backlink profile, determined which links were spammy, seemed a result of black hat practices, or were added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed with another reconsideration request. The result came a couple of weeks later: the same cookie-cutter email in WMT suggesting that there are "unnatural links" to the site. Hope turned to disappointment and frustration. Looks like the big box companies will continue to populate the top 100 results of ANY search, and the rest will help Google's shareholders... If your site has gotten into the algorithm's crosshairs, you have a better chance of recovering by changing your URL than by messing around with this useless tool.

    | Prime85
    0

  • Hi guys, Before I dive into my question, let me give you some background. I manage an ecommerce site and we've got thousands of product pages. The pages contain dynamic blocks, and the information in these blocks is fed by another system. So in a nutshell, our product team enters the data in a piece of software and, boom, the information is generated in these page blocks. But that's not all: these pages then redirect to a duplicate version with a custom URL. This duplicate is cached, and it's what the end user sees. This was done to speed up load times - rather than the system generating a dynamic page on the fly, the cached page is loaded and the user sees it super fast. Another benefit appeared as well: after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that the redirect to the duplicate cached page isn't a permanent one; it's a meta refresh, effectively a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up a 301, but then there won't be any caching - pages will just load dynamically. Google records pages that are cached, but would it cache a dynamic page? Without a cached page, I'm wondering if I would see a drop in traffic; the view source might just show a list of dynamic blocks and no content! How would you tackle this? I've already set up canonical tags on the cached pages, but removing the cache... Thanks

    | Bio-RadAbs
    0

  • If I hired you or your company to do SEO for my site (http://goo.gl/XUH3f), what would be the first steps you'd take? I'm pretty sure I've covered all of the basics myself; I'm just left trying to figure out what I should do next... rankings have been going up and down for the last few weeks, but even when they're up, they're not high enough 🙂 (and then they go back down anyway)... I know some of you are going to say build links - please at least give me an example of one or two sites you'd try to get to link to mine... I'm open to any advice or feedback, as I'm just a website owner who's been doing their own SEO and learning on the fly... Thanks a lot!

    | Prime85
    0

  • Hello, One of my sites has a strange link profile; it has 40,000 inbound links, but 30,000 of them are from the site http://ourlipsaresealed.skynetblogs.be/ with the anchor text "haarstijl (2)", which is Dutch for hairstyles. I haven't paid for or even asked for these links, and I don't think it's negative SEO. I think they just set up a template with hundreds of links they thought were useful to their visitors and produce several pages a day. So the question is: do I use the new Google disavow tool? I've held off so far because A) they link to a competitor who hasn't been anywhere near as affected as we have (although they seem to have seen a drop to some extent, and they have a much better link profile overall than mine), and B) in the video Matt Cutts says over and over that this tool is for people who have done some dodgy link building in the past, but I haven't. Thanks, Ian

    | jwdl
    0

  • I've managed to set up the rel="author" tag on my WordPress site, and the thumbnail is now showing in Google search results. Each article I write on my blog has a link to my author page, which in turn links to my Google+ page, so this works great. However, I write articles for a few other websites and would like to get my author thumbnail showing for those articles too. How do I do this? Do I need an author page on the other sites as well? I don't think I'll be able to get an author page on the other sites - is it possible to link directly from the article to my Google+ page? And how would guest articles work? Any ideas? (A byline sketch follows below.) Thanks in advance guys

    | JohnPeters
    0
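
    At the time of writing, authorship on a site you don't control generally needs two things: a byline link from the article to your Google+ profile marked rel="author" (or to an author page on that site which in turn links to your profile), plus that site listed under "Contributor to" on your Google+ profile. A minimal byline sketch (the profile URL is a placeholder):

        <?php
        // Minimal guest-post byline; the Google+ profile URL is a placeholder.
        echo '<p>Guest post by '
           . '<a rel="author" href="https://plus.google.com/112345678901234567890/">John Peters</a>.'
           . '</p>';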

  • Hello all, I manage an ecommerce website, and product prices are shown depending on what country you select. When users do a product search or land on a product page, they are immediately redirected to a 'select your country' page. After selecting their option, the user is redirected back to the product or search result page. The problem I face is that this is leading to a long 'Temporary Redirects' list in my crawl diagnostics page. Looking at the list of temporary redirects, 90% are users being bounced to the 'select your country' page. Any advice on tackling this? Have you faced anything similar? (A sketch of one approach follows below.) Thanks Cyto

    | Bio-RadAbs
    0
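
    One way to shrink that redirect list is to bounce only visitors who have never chosen a country, remember the choice in a cookie, and give everyone else (including crawlers, which never submit the chooser) a default region. A rough sketch with made-up cookie and parameter names:

        <?php
        // Sketch: avoid redirecting on every product or search view by
        // remembering the visitor's country. Names are illustrative only.
        if (isset($_GET['country'])) {
            // The visitor has just picked a country on the chooser page.
            $country = $_GET['country'];
            setcookie('country', $country, time() + 31536000, '/'); // keep for a year
        } elseif (isset($_COOKIE['country'])) {
            $country = $_COOKIE['country'];  // returning visitor: no redirect needed
        } else {
            $country = 'US';                 // default region for first-timers and bots
        }
        // ...render the product or search page using prices for $country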

  • We launched a new website in June. Traffic plummeted after the launch; we crept back up for a couple of months, but now we are flat, nowhere near our pre-launch traffic or the previous year's traffic. For the past 6 months our analytics have been worrying us - overall traffic and new visitor traffic are down over 10%, the bounce rate is up almost 35% since the site launched, keywords aren't ranking where they used to, and of course web sales are down. Is this supposed to happen when a new site is launched, and how long does this transition last? We have done all the technical audits and added relevant content; we're at a loss. Any suggestions on where to look next to improve traffic to pre-launch numbers?

    | WaySEO
    0

  • Hi guys Does anyone have a gut feel for how often the Places search option is used via the left hand side menu of Google? I have a non-slip solutions (flooring/decking etc) client. Google does not provide a 'local search' type results page even when the location is added to the search term. So I am thinking I should prioritise energies on the website and not local search activities (like citations). But I am wondering how many searches are conducted using the Places option. Any views? Many thanks Wendy

    | Chammy
    0

  • Recently I had a drop in the overall number of search queries my website was ranking for (about 50%) on October 5th. I did not lose rankings for my target keywords. How can I regain these lost opportunities?

    | raph3988
    0

  • We have about 30,000 pages that are variations of "<product-type> prices/<type-of-thing>/<city>-<state>". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we were discussing is to 301 redirect or canonical all the city-state pages back to just the "<type-of-thing>" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms they nearly always include the city and state, so the search is some variation of "<product-type> <type-of-thing> <city> <state>". One thing we thought about doing is dynamically changing the metadata and headers to add the city-state info there (sketched below). Are there other potential solutions to this?

    | editabletext
    0
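
    The templating itself is straightforward; the caveat is that swapping the city and state into an otherwise identical template is often not enough on its own to make 30,000 pages read as unique. A sketch of the metadata part (the variable names and copy are made up):

        <?php
        // Sketch: build page-specific metadata from the product/city/state
        // values that already drive the page. Sample values are placeholders.
        $product = 'Widget';
        $city    = 'Denver';
        $state   = 'CO';

        $title       = sprintf('%s Prices in %s, %s', $product, $city, $state);
        $description = sprintf('Compare %s prices from suppliers in %s, %s and request a quote.',
                               strtolower($product), $city, $state);

        echo '<title>' . htmlspecialchars($title) . '</title>' . "\n";
        echo '<meta name="description" content="' . htmlspecialchars($description) . '">' . "\n";
        echo '<h1>' . htmlspecialchars($title) . '</h1>' . "\n";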

  • I've always been under the assumption that when building a micro-site it was better, from an SEO perspective, to use a true path URL (e.g. yourcompany.com/microsite) as opposed to a subdomain (microsite.yourcompany.com). Can you still generate significant SEO gains from a subdomain if you were forced to use it, provided the primary domain (e.g. yourcompany.com) had a lot of link clout/authority? Meaning, if I had to go the subdomain route, would it be the end of the world?

    | VERBInteractive
    0

  • So what is the best practice for getting Google to read text that populates via jQuery in a carousel? If the text is originally display:none, is Google going to be able to crawl it? Are there any limits to what Google can crawl when it comes to JavaScript and text? Or is it always better just to hard-code the text in the page source?

    | imageworks-261290
    0

  • I noticed recently that a client's Google+ business page (set up as a personal page) has a followed link pointing to their site. They have many links on the web pointing to the Google+ page; however, that page is an https page. So the question is, would a Google+ page that is https still pass authority and link juice to the site linked in the About tab?

    | iAnalyst.com
    0

  • I bought a genuine 1-year-old PR4 domain and used it to make a blog that would rank easily for new trending keywords (e.g. product launch keywords). I used Yoast SEO and made sure I followed all the on-page recommendations it gave me, had Linklicious ping the post, and pointed a couple of high-PR backlinks at the page, but it won't even rank on page 10, let alone index. My domain is indexed and the home page links to my post. I know an average amount of SEO, but I hate doing it because stuff like this frustrates me. Can someone help me? Do I need to get certain backlinks? Is there a way to get my site and post indexed faster? BTW, the keywords I'm trying to rank for have websites that are brand spanking new; some of them are Blogspot websites, and most of them don't have a single backlink.

    | Jamal4193
    0

  • I want to track conversions using utm parameters from guest blog posts on sites other than my own. Will Google penalize my site for having a bunch of external articles pointing to one page with unique anchor text but utm codes appended? e.g. mysite.com/seo-text?utm_campaign=guest-blogs

    | wepayinc
    0
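
    A self-referencing canonical on the landing page keeps the utm-tagged variants consolidated onto the clean URL. A sketch, assuming the page needs nothing from the query string:

        <?php
        // Sketch: canonicalise /seo-text?utm_campaign=guest-blogs back to
        // /seo-text by dropping the query string.
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        $url  = 'http://' . $_SERVER['HTTP_HOST'] . $path;
        echo '<link rel="canonical" href="' . htmlspecialchars($url) . '">' . "\n";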

  • I have 3 Web 2.0 sites that look like they've been hit by a penalty. I have checked their backlinks and there are a lot of backlinks from sites that have been deindexed. I have requested the removal of many of the links, but now I need to resubmit the sites to Google. Is this even possible with them being Web 2.0 sites? I don't have Webmaster Tools for them, so how would I do this?

    | JohnPeters
    0

  • Hi, first post and I look forward to contributing at some point. I've learnt our sector is going to have a shake-up, with some big trademark bans on the way. I'm trying to get an idea of the percentage of traffic that currently clicks on paid ads at Google vs organic results. Are there any reports that show CTR for each position in each category?

    | cruiserDan
    0

  • We've earned a great link from a popular website, but it is in a strange format: <a data-uri="http:;;;;;;;;www.domain.com;;;;" target="_blank">blue widgets</a> It is still visible as a link in web browsers, but I was wondering how it will perform in terms of SEO visibility and crawlability? Any ideas?
    Thanks!
    Martin

    | MartinPanayotov
    0

  • Does anyone have a Best of the Web directory promo code for November yet?

    | unitedfitness
    0

  • We have about 10,000 bad pages, which Panda could track and penalize us for. If we delete them we will get 404 errors, and after that we could again get a penalty from Google's algorithm. How can I delete them in a way that follows Google's rules and avoids penalties? If we 301 redirect all 10k pages to the index page, could the 10k old pages be treated as duplicates? (A sketch follows below.)

    | bele
    0
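
    Redirecting thousands of unrelated pages to the homepage is often treated as a soft 404 rather than a clean removal. If the pages are genuinely gone, answering those URLs with 410 Gone (or a plain 404) is the more conventional route. A sketch, where the lookup of retired URLs is left to your own data:

        <?php
        // Sketch: serve "410 Gone" for URLs that have been deliberately
        // removed. page_is_retired() is a made-up helper; replace it with
        // your own lookup (a database flag, a list of paths, etc.).
        if (page_is_retired($_SERVER['REQUEST_URI'])) {
            header('HTTP/1.1 410 Gone');
            echo '<h1>This page has been removed.</h1>';
            exit;
        }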

  • Hey everyone, I'm about to launch a new website for an accounting firm. They currently have a website with an 11-year-old domain. They are doing very well locally for SEO, and I'm guessing it's because of the aged domain, as their website is very badly built and contains almost no content. They would like to launch the new site with a simpler, easier-to-remember domain. If I launch the new site, point the aged domain at it using a 301 redirect, and redirect all of the old pages to their newer versions, is there a chance the company will lose their current SEO rankings? Thanks!

    | RCDesign74
    0

  • I have a newsletter module on many pages of my website and was wondering whether Googlebot sees this module as a link or not. I noticed the module has its own URL that Google has indexed, which shows up when I do site:mywebsite.com Thank you,

    | seoanalytics
    0

  • Is there such a thing?

    | ResourceLab
    0

  • The latest Google updates have said that reciprocal linking isn't such a hot thing - so I am wondering if anyone has any guidance for those of us who work with WordPress bloggers?

    | dotJ
    0

  • It would be helpful for our visitors if we were to include an expandable list of FAQs on most pages. Each section would have its own list of FAQs specific to that section, but all the pages in that section would have the same text. It occurred to me that Google might view this as a duplicate content issue. Each page does have a lot of unique text, but underneath we would have lots of text repeated throughout the site. Should I be concerned? I guess I could always load the FAQs by AJAX after page load if they might penalize us.

    | boxcarpress
    0

  • Hi, I have seen this a few times, but maybe someone can shed some light on why it happens? If I search for a generic keyword I'm targeting in the title tag, Google shows the actual title tag placed in the code. But if I search for the brand name, the title tag changes to show just the brand name, completely different from the default title tag. Any ideas why it does this? And is this bad - is Google saying the content on the site is not relevant and therefore deciding to change it? Cheers

    | activitysuper
    0

  • Hello, Just wondering how Google treats the top and bottom menus that you see on each page of a website. Does it count them on every page in terms of link juice, or are they just there for user experience, with only the links in the page content or sidebar counting? Thank you,

    | seoanalytics
    0

  • Hi there, We currently have the URL www.example.com/health/back-pain/ We want to promote this page on our product packaging, but with a simpler URL: www.example.com/back-pain/ Is it just a case of using a 301? Are there any issues here? Thanks for any feedback

    | Paul78
    0

  • On a website, when I link across pages in the same category, should all the category links appear on each page? Let's say I have 6 categories and 6 pages: should I have all 6 links on every page (so page F carries links A, B, C, D, E; page A carries B, C, D, E, F; and so on, meaning every link appears on every page in the category)? Or should I have, say, just 3 links per page and vary which ones appear (A, B, C on page 1; D, E, F on page 2; A, E, F on page 3; B, C, F on page 4; and so on), which means - at least I think - that whichever link appears most often across the 6 pages will naturally get a boost? I hope this is not too confusing. Thank you,

    | seoanalytics
    0

  • Hey forum, I'm curious about image maps. A few things I'm not sure about: 1. Will the links be followed? If so, will Google respect rel="nofollow"? 2. Will the image be considered one image (indexed as an image, etc.), or will each map segment be treated as a separate image? 3. Any other SEO pros/cons to consider when adding an image map to an existing page? (A markup sketch follows below.) Thanks, Corwin.

    | corwin
    0
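
    For reference, an image map is a single <img> whose <map> contains one <area> per clickable region, and each <area> carries its own href, alt and, if needed, rel="nofollow", much like an ordinary link. A markup sketch with placeholder URLs and coordinates:

        <?php
        // Sketch of an image map; the image, coordinates and URLs are
        // placeholders. rel="nofollow" is set per <area>, as on an <a> tag.
        echo '<img src="/images/regions.png" usemap="#regions" alt="Coverage map">' . "\n";
        echo '<map name="regions">' . "\n";
        echo '  <area shape="rect" coords="0,0,150,100" href="/regions/north" alt="North region">' . "\n";
        echo '  <area shape="rect" coords="150,0,300,100" href="http://partner.example.com/"'
           . ' rel="nofollow" alt="Partner site">' . "\n";
        echo '</map>' . "\n";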

  • I have a site with about 1,000 pages. I'm planning to add about 30,000 pages to it. Can increasing the footprint by such an amount all of a sudden have any negative consequences for existing organic traffic, or for the hoped-for benefits from the new pages? Would the site draw any increased scrutiny from Google for doing this? Any other considerations? Thanks... Darcy

    | 94501
    0

  • Our rankings have dropped over the past few weeks. I did not make any major changes to my site. Looking through Webmaster Tools, I found lots of new URLs linking to our site: url.org linkarena.com seoprofiler.com folkd.com digitalhome.ca bustingprice.com surepurchase.com lowpricetoday.com oyax.com couponfollow.com aspringcleaning.com pamabuy.com etzone.ca How do I find out whether this was done intentionally to hurt our SEO? Could that be possible? Thank you, BigBlaze

    | BigBlaze205
    0

  • Hi, does my site have any issues with duplicate pages within the site, and have I defined my canonical tags properly? Can anyone advise? Please help: childrensfunkyfurniture.com

    | conversiontactics
    0

  • On Google's local search results, i.e. when the Google Places data is displayed along with the map on the right-hand side of the search results, there is also an 'At a glance:' element.
    The data being displayed is from some years ago, and the client would, if possible, like it to reflect their current services, which they have been providing for some five years. According to Google support here - http://support.google.com/maps/bin/answer.py?hl=en&answer=1344353 - this cannot be changed; they say 'Can I edit a listing's descriptive terms or suggest a new one?
    No; the terms are not reviewed, curated, or edited. They come from an algorithm, and we do not help that algorithm figure it out.' My question is: has anyone successfully influenced this data, and if so, how?

    | DeanAndrews
    0

  • I still need this question answered, and I know it's something I must have changed. Google is ranking my sitemap for hundreds of key terms instead of the actual pages. It's great to be on the first page, but not with my sitemap... Geez.

    | ursalesguru
    0

  • With all of the recent changes, are there any article submission websites worth considering?

    | casper434
    0

  • I wanted to know if comments on my blog count as social signals. I'm getting a minimal amount of shares and tweets but mainly having success with people commenting on the actual blog posts. I wanted to know if Google sees this as a social signal - not necessarily to help with rankings, but to increase the authority of my site. Thank you.

    | raph3988
    0

  • I have a page that ranks at position 5, and to get a rich snippet I'm thinking of adding a relevant video to the page. Thing is, the video is already on another page, which ranks for this keyword... but only at position 20. As it happens, the page the video is on is the more important page for other keywords, so I won't remove it. Will having the same video on two pages be considered a duplicate?

    | Brocberry
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



