
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • We found some repetitive pages on our site, mostly URLs with sort or filter parameters. We tried a lot to remove them but saw little improvement. Is this the correct approach: a) we create new pages altogether for that section and put a rel=canonical tag from the old ones to the new ones; b) then, after the canonical is declared, we noindex the old pages. Is this the correct way to let the new pages supersede the old ones?

    | Modi
    0
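For reference, the mechanics of step (a) in the question above amount to a single head element on each old page; a minimal sketch, with a hypothetical URL. Note that Google's documentation cautions against combining rel=canonical with noindex on the same page (step b), since the two signals conflict:

```python
def canonical_tag(new_url: str) -> str:
    """Build the <link rel="canonical"> element to place in the <head>
    of an old sort/filter page, pointing at its replacement page."""
    return f'<link rel="canonical" href="{new_url}">'

# Hypothetical example: the old filtered URL declares the new page as canonical.
print(canonical_tag("https://example.com/widgets/"))
# → <link rel="canonical" href="https://example.com/widgets/">
```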

  • I am doing some SEO work for a client that has a restaurant reservation plugin connected to a review website; they handle reservations through an iframe plugin on all pages of the restaurants that are connected to it. Can I place a link in the iframe and get Google to index it? That would be nice. Google is indexing the iframes when you search for certain very long-tail keywords: it displays a page where only the iframe is shown, which is not relevant for the user, and I would like to remove it. But I prefer links and indexed iframes over no links and no indexed iframes on long-tail keywords.

    | Lebron27
    0

  • This question is regarding an ecommerce website that I hand-wrote (HTML) in 1997, one of the first click-and-buy websites, with a cart/admin system that I also developed. After all this time, the old plain HTML look just doesn't cut it. I just updated to XHTML with a very modern look, and believe the structured data will index better. All products and current category pages will have URLs identical to the old version. I decided to go with the switch after a manual penalty, which has since been removed... I figured now is the time to update. My big question is that over the years, a lot of my backlinks came from products/news that are either no longer relevant or just not available. The pages do exist, but can only be found from the outbound link source. For SEO purposes, I have thought of a few things I can do but can't decide which one is the best choice. Any insight or suggestions would be awesome! 1. Redirect the old link to the most relevant page in my current catalog. 2. Add my new header/footer to the old page (this will add a navigation bar with brands/categories/etc.). 3. Simply add a nice new image to the top of these pages linking home and update any broken/irrelevant links. I was also considering adding just the very top 2 inches of my header (logo, search box, phone, address). Note: some of these pages do receive some traffic. Nothing huge, but considering the 50+ pages, it adds up.

    | Southbay_Carnivorous_Plants
    0

  • Hi, I'm looking for how I can use a star rating for a Q&A discussion or article/blog post to achieve a rich snippets search result. I'm thinking about a user rating for "Was this helpful?", 1 to 5 stars. As I look at schema.org and do other reading on it, it looks like it's possible to rate only a set group of content types, blogs and discussions not included. However, I've seen rich snippets ratings in SERPs for blog posts, like this example https://www.google.com/search?q=erp+implementation+challenges&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a#q=panorama+consulting+blog&client=firefox-a&hs=gId&hl=en&rls=org.mozilla:en-US:official&ei=QmCBUYLLCOfwiwKHhIAQ&start=20&sa=N&bav=on.2,or.r_cp.r_qf.&bvm=bv.45921128,d.cGE&fp=eb2f15e2a98a4631&biw=2144&bih=995 On page, it looks like they used some simple span tags. So, my question is: which content type category does that fit into for rating, and is that strategy safe enough going forward? Also, are there more steps to making this work? If it is okay to have users rate the helpfulness of a discussion or article and get rich snippets, I'd kinda like to do it. Best... Darcy

    | 94501
    0

  • Hi, I have recently started a project on one of my sites, working with a branch of the U.S. government, where I will be hosting and publishing some of their PDF documents for free for people to use. The great SEO side of this is that they link to my site. The thing is, they are linking directly to the PDF files themselves, not to the page with the link to the PDF files. So my question is, does that give me any SEO benefit? While the PDF is hosted on my site, there are no links in it that would allow a spider to start from the PDF and crawl the rest of my site. So do I get any benefit from these great links? If not, does anybody have any suggestions on how I could get credit for them? Keep in mind that editing the PDFs is not allowed by the government. Thanks.

    | rayvensoft
    0

  • We researched a list of about 1000 domains, which are all in one industry segment. Is there any tool you can recommend to identify corresponding contact emails, based on domain WHOIS or an email on the website or contact page? What is your experience with sending emails just to info@DOMAIN_NAME?

    | lcourse
    0

  • We are looking at working on a site that needs a warning page for visiting users. This splash/warning page is all Google sees, so the site is not performing well in search engines. The sites are WordPress sites. Would we use a script to force a full-screen pop-up? It would be needed on a visit, and if the user leaves and returns to the site the warning would need to reappear. Any ideas?

    | JohnW-UK
    0

  • I have one or two competitors (in the UK) in my field who buy expired 1-8 year old domains on random subjects (SEO, travel, health, you name it). They are in the printing business, and they stick 1-2 articles (unrelated to what was on there before) on these domains and that's it. I think they stick with PA and DA above 30, and most have 10-100 links, so well-used expired domains, hosted in the USA. Most have different IPs, although they now have so many (over 70% of their backlink profile) that some share the same IP. On further investigation, none of the blogs have any contact details, but it does look like they have been a little smart here and added content to the about us page (along the lines of "I used to run xxx but now do xxx"); they also have one or two tabs with article-length content on the same subject the domain used to cover, and the titles all match that content. So basically they are finding expired 1-10 year old domains that have only been expired (from what I can see) 6 months max, putting 1-2 articles on the home page related to print (maybe adding a third on the subject the blog used to cover), adding 1-3 articles via tabs at the top on subjects the sites used to cover, registering the details via xbybssgcf@whoisprivacyprotect.com, and that's it. They have been ranking via this method for the last couple of years (through all the Google updates) and still do extremely well. Does Google not have any way to combat link networks other than the stupid stuff such as public link networks? It just seems that if you know what you are doing you get away with it, and if you're big enough you get away with it, but the middle-of-the-road (mom and pop) sites get F***ed over with spam pointing to their site that no spammer would dream of doing anyway?

    | BobAnderson
    0

  • If you install Google Custom Search on a site, does it record a list of all the searches people type into the search box? Is there a Joomla or WordPress search plugin/extension that keeps track of the search history used on your site(s)?

    | JohnW-UK
    0

  • What are your feelings about indexing search results? I know big brands can get away with it (yelp, ebay, etc). Apart from UGC, it seems like one of the best ways to capture long tail traffic at scale. If the search results offer valuable / engaging content, would you give it a go?

    | nicole.healthline
    0

  • Hi, I'm currently designing an infographic based on an article my writer created. I was thinking of ways I can present the article and infographic so they complement each other. Obviously the infographic is more in-depth than the article, as it contains much more information. The infographic is designed to go viral. I was thinking of putting the infographic at the top of the page and the written content below it. This way the person looking at the infographic can scroll down to find the more in-depth written discussion/article on the topic. Also, from an SEO perspective, the search engines can index the written content (as they won't be able to index the infographic, since it's an image). What do you guys think is the best approach for this situation? Regards, Matt

    | Mattcarter08
    0

  • I am having a difficult time determining how to silo the content for this website (Downpour). The issue I am having is that, as I see it, there are several different top-level keyword targets to put at the top of the silos; however, due to the nature of the products, they fit in almost every one of the top-level categories. For instance, our main keyword term is "Audio Books" (and derivatives thereof), but we also want to target "Audiobook Downloads" and "Books on CD". Due to the nature of the products, almost every product would fit in all 3 categories. It gets even worse when you consider normal book taxonomy. The normal breakdown would be audiobooks > Fiction (or Nonfiction). Now each product also belongs to one of these categories, as well as "Download", "CD", and "Audiobook". And still worse, our navigation menus link every page on the site back to all of these categories (except Audiobooks, as we don't really have a landing page for that besides the home page, which is lacking in optimized content but is linked from every page on the site). So, I am finding siloing, or developing a cross-linking plan that makes sense, very difficult. It's much easier at the lower levels, but at the top things become muddy. Throw in the idea that we may eventually get e-books as well, and it gets even muddier. I have some ideas of how to deal with some of this, such as putting the site navigation in an iframe, instituting basic breadcrumbs, and building landing pages, but I'm open to any advice or ideas that might help, especially with the top-level taxonomy structure. TIA!

    | DownPour
    0

  • Hi, while going through a link analysis of the competition, I found links on high-DA sites, all reputable. Wondering how to get dofollow links like the ones the competition has attained. Any suggestions? http://jalopnik.com/5965768/why-india-is-making-the-worlds-most-interesting-cars http://features.rr.com/topic/Maserati http://www.blouinartinfo.com/news/story/volkswagen-plans-two-compact-suvs http://www.firstpost.com/tag/honda-amaze

    | Modi
    0

  • Please see attached ranking history chart. On June 5th the chart shows that my main site is not coming up under our main keyword "door hangers" From then on, our blog took over. Any ideas why? Thanks Andrea lpEBciu.jpg

    | JimDirectMailCoach
    0

  • Hi there! Currently all the URLs on my website, even the home page, end in .html, such as http://www.consumerbase.com/index.html Is this bad?
    Is there any benefit to this? Should I remove it and just have them end with a forward slash?
    If I 301 redirect the old .html URLs to the forward-slash URLs, will I lose PA? Thanks!

    | Travis-W
    0
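The .html-to-slash mapping behind such a 301 plan can be sketched in a few lines; the paths are hypothetical, and the homepage case (/index.html mapping to the site root) is handled separately:

```python
def slash_equivalent(path: str) -> str:
    """Map an old .html path to its trailing-slash equivalent for a 301 rule.
    The homepage /index.html maps to the site root; non-.html paths pass through."""
    if path.endswith("/index.html"):
        return path[: -len("index.html")]
    if path.endswith(".html"):
        return path[: -len(".html")] + "/"
    return path

print(slash_equivalent("/index.html"))   # → /
print(slash_equivalent("/about.html"))   # → /about/
```

The actual redirect would then be issued server-side (e.g. in the web server's rewrite rules) using this old-to-new mapping.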

  • We have about 250 404 errors due to changing a lot of page names throughout our site. I've read some articles saying to leave them and eventually they will go away. Normally I would do a 301 redirect. What's the best solution?

    | JimDirectMailCoach
    0

  • I have seen that almost all of my website pages supposedly need a rel=canonical tag. Something seems wrong here, since I have unique content on every page. It even shows the homepage as needing a rel=canonical, which doesn't make sense. Can anyone suggest anything, or should I just ignore those issues?

    | arcade88
    0

  • Hi all! I currently had someone remove about 70 bad backlinks pointing to our site. Is it worth disavowing the rest? We don't have any manual penalty or anything like that, but is it the sort of thing to use as a precautionary measure only? There are about 100 links in the disavow file that I had built. Thank you!

    | freeunlocks
    0

  • Hi, if I do a search in Google for one of our products on our site, our site comes up, but it would appear that Google is adding our domain name as a suffix to our title in the results... Anyone else seen this? Can I do anything about it? I would prefer it not to appear. Thanks!

    | bjs2010
    0

  • If we have some good content (an infographic and a video), is there anything wrong with submitting it to as many video and infographic directories and video hosts as possible, or are 99% of them as bad as submitting to link directories, and would we be looking at a penalty? Also, a side question: are CSS galleries and niche-specific directories (for example, running a printing company and finding directories on graphic design, craft and art) a bad idea to submit to in any quantity? Are they as bad as the free-for-all directories that Google doesn't like? We compete in a not very hard niche and need to build a few low-authority links (build out our link graph), but not low enough to be classed as low quality (Google penalties in the future). Most of our competitors are happy to submit to anything, but we want the site to last.

    | BobAnderson
    0

  • Hi, we have a web app. All our competitors are on http://www.appappeal.com. We can suggest ourselves here: http://www.appappeal.com/contact/suggest. If we get reviewed and the link is a follow link, is this a good thing or a bad thing? They call themselves a directory and you can pay to get a "priority" review. Should we avoid it, or is it a good link as the DA is 58?

    | Studio33
    0

  • Greetings Mozzers, I'm working on a web-based company's SEO, where their services can be optimized for each area. We have different pages for different keywords, with a region attached to the keyword string we are going after. For example, say the keyword is "Belts" and the regions you want to go after are New York and Miami. You are located in Miami, so you have a page for "Miami Belts" and you rank well for it, but you want to start selling online and want to rank for New York, so we have a "New York Belts" optimized web page that is completely unique content; however, it isn't ranking at all. Any thoughts? I know the address would be helping the Miami page, but besides that, why would the New York optimized page not rank? Let's say the two regions are equally competitive. Any clarification and information is greatly appreciated.

    | MonsterWeb28
    0

  • I have been using link baits like infographics to get quality links to my site, and I have observed that these tactics are great for getting links to the home page or to the particular post page where the infographic was originally published. But we have various other important landing pages, and we want to transfer some link equity to those pages. Whenever we publish an infographic, we post it on our blog with an embed code carrying anchor text pointed at our site's home page. People who share our infographic normally link to the home page or to the post page where they found that particular item. So, what are the possible ways to get links to other landing pages? Can we post some bait on other landing pages as well? I need to know some more techniques to attract deep links. Thanks

    | shaz_lhr
    1

  • We just did a site redesign, and removed the noindex, etc. about 10 days ago. Over the last 24 hours, I've gotten some of my top keywords on the first page, but now they are gone, a few hours later. I assume this is typical?

    | CsmBill
    0

  • If we have a top nav with Contact Us, About Us, Delivery, FAQ, Gallery, How to Order, etc., but none of these are pages we want to rank, and then we have the usual left-hand nav, are we wasting juice with the top nav? Would we be better off either removing it and putting those links further down the page, or consolidating them and adding an extra Products tab so the product pages come first?

    | BobAnderson
    0

  • Hi, I am trying to make our site more crawlable and get link juice to the "bottom pages" of an ecommerce site. Currently, our site has a big mega menu, with this hierarchy: Home > CAT 1 > SUBCAT 1 > SUBSUBCAT 1 > PRODUCT. Our URL structure looks like www.domain.com/cat1/subcat1/subsubcat1/, and that page links to the products, but the product URLs look like www.domain.com/product.html. Obviously the ideal thing would be to cut out one of the category levels, but I may be unable to do that in the short term. So I was wondering: if I take CAT1 out of the equation, e.g. just make it a static item that allows the drop-down menu to work but has no page of its own, does that cut out a level? Thanks, Ben

    | bjs2010
    0

  • I think it would be beneficial to have a third-party SEO review of the network of sites my team and I manage, and I was wondering if any of you had suggestions for what sort of tests should be done, or that we should expect to see done, during one of these reviews. We are a small team with varying SEO experience, and we have been working hard to make improvements to our sites in the past year. Most of our sites have been completely overhauled in the last 12-16 months; SEO work that had not been done in the past has been set up, along with some corrections for things that may have been harming SEO. We believe we are on the right track and have learned a good amount about SEO in that time, but it would be nice to have some "expert" feedback from outside our office to get a clearer picture of anything we may be missing, or some suggested improvements. A sort of double-check on the work we have done.

    | unikey
    0

  • Text on my site seems to be readable in a text-only version (the page is not cached, so I viewed it by disabling JavaScript and then copying and pasting the page into Word). However, when I look in the page source I don't see the text there. The text was created using OpenX HTML boxes to help us with formatting, but is this causing an SEO problem?

    | theLotter
    0

  • Hello everyone, I am currently working for a huge classified website that will launch in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step, and not all at once, to avoid a long sandbox risk and to have more control over it. Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you might have had? The website will cover about 300 jobs: in all regions (= 300 * 22 pages), in all departments (= 300 * 101 pages), in all cities (= 300 * 37,000 pages). Do you think it would be wiser to index a couple of jobs at a time (for instance 10 jobs every week) or to index by page level (for example, 1st step with jobs by region, 2nd step with jobs by department, etc.)? More generally speaking, how would you proceed in order to avoid penalties from Google and to index the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that has yet to be determined. Thanks for your help! Best Regards, Raphael

    | Pureshore
    0
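The page counts in the question above multiply out as follows; the city-level pages dominate the total, so a staged indexing plan largely comes down to when to open up that level:

```python
jobs = 300
regions, departments, cities = 22, 101, 37_000

region_pages = jobs * regions          # 6,600 pages
department_pages = jobs * departments  # 30,300 pages
city_pages = jobs * cities             # 11,100,000 pages

total = region_pages + department_pages + city_pages
print(region_pages, department_pages, city_pages, total)
# → 6600 30300 11100000 11136900
```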

  • Hi, INTRO: We were hit pretty badly, first with an unnatural links warning and then (we assume) by Penguin. We removed a lot of links and disavowed both the removed ones and all the others we couldn't remove. The manual penalty was revoked, but the site is still down. I understand that Penguin and unnatural-links penalties are not the same. I assume that while our removals and fixes were enough for the manual penalty to be lifted, the Penguin algorithm still disapproves of us. Also, I am not expecting to be where we were, but we know our current positions don't make sense (several pages seem to be de-indexed). AND THE QUESTION... Since all else has failed, we are considering removing the main landing pages (which were the target of the link building) and building new ones with new URLs, serving 404s from the old ones, not 301s. This means that all the spammy links that were built will point to non-existing pages (404s), besides those that point to the homepage... Do you think this will resolve the problem? Or, since the spammy links still point to our domain, are we still in trouble (even if they point to 404 pages)? The way we see it, this is the last resort before dropping the domain! Thanks

    | BeytzNet
    0

  • I have been thinking about changing our web address for quite a while, but I'm too afraid of the impact on my SERPs. I understand I would need to use the Google Change of Address tool and 301 redirects. Am I missing something? What is your experience with changing the URL of a website, and how has it impacted your SERPs? In the past I heard someone say it will damage the link juice by 20%. Is that accurate? If you change the URL, is there a blank period where neither your old site nor your new site is indexed? Or does Google handle this transition well?

    | wellnesswooz
    0

  • Hi all, my site www.uniggardin.dk has lost major rankings on google.dk. It ranked #2-3 on keywords important to my site, and after the latest update most of my rankings have dropped to #12-#20. This is so annoying, and I really have no idea what to do. Could it be caused by bad links to my site? In that case, what would I have to do? Thanks in advance,
    Christoffer

    | Xpeztumdk
    0

  • Hello here, I own an e-commerce website that sells digital sheet music, and I would like to enrich my product pages with short references to artists/composers related to the product, taken from external websites such as mentions, fresh news, information taken from related videos, cross-references, etc. In other words, I'd like to provide our users with a different kind of informational content that our competitors are currently not offering. We could also think of this like "providing some sort of aggregate content on product pages to enrich the user's experience by providing more information about the product". What do you think are the risks or the benefits of such an approach? And if there are any risks, how to avoid/tackle them? Any thoughts are very welcome! Thank you in advance to anyone.

    | fablau
    0

  • We have 1000s of audio book titles at our Web store. Google's Panda devalued our site some time ago because, I believe, of duplicate content. We get our descriptions from the publishers, which means a good deal of our description pages are the same as the publishers' = duplicate content according to Google. Although rewriting each description of the products we offer is a daunting, almost impossible task, I am thinking of rewriting publishers' descriptions using The Best Spinner software, which allows me to replace some of the publishers' words with synonyms. I have rewritten one audio book title's description, resulting in 8% unique content from the original in 520 words. I did a CopyScape check and it reported "65 duplicates." CopyScape appears to be reporting duplicates of words and phrases within sentences and paragraphs; I see very little duplicate content of full sentences or paragraphs. Does anyone know whether Google's duplicate content algorithm is the same as or similar to CopyScape's? How much of an audio book's description would I have to change to stay clear of CopyScape's duplicate detection? And how much would I have to change to stay clear of Google's?

    | lbohen
    0

  • In my WordPress shopping cart plugin, I have three pages (/account, /checkout and /terms) on which I have added the "noindex,follow" attribute. But I think I may be wasting link juice on these pages: as they are not to be indexed anyway, is there any point giving them any link juice? I can add "noindex,nofollow" to the pages themselves. However, the actual text/anchor links to these pages in the site header will remain "follow", as I have no means of amending that right now. So this presents the following two scenarios: 1) No juice flows from the homepage to these 3 pages (GOOD). This would be perfect, as the pages themselves have the nofollow attribute. 2) Juice flows from the homepage to these pages (BAD). This may mean that juice flows from the homepage anchor text links to these 3 pages BUT then STOPS there, as they have the "nofollow" attribute on the page. This would be a bigger problem, and if this is the case and I can't stop the juice from flowing in, I'd rather let it flow out to other pages. Hope you understand my question; any input is very much appreciated. Thanks

    | SamBuck
    1
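For reference, the two meta robots variants being weighed above can be generated like this. One caveat worth noting: the meta tag only governs the page it sits on (nofollow there stops juice flowing *out*); stopping juice from flowing *into* a page would require rel="nofollow" on the linking anchor in the header itself:

```python
def robots_meta(index: bool, follow: bool) -> str:
    """Build the robots meta tag for a page's <head>."""
    directives = [
        "index" if index else "noindex",
        "follow" if follow else "nofollow",
    ]
    content = ",".join(directives)
    return f'<meta name="robots" content="{content}">'

# The two options discussed for /account, /checkout and /terms:
print(robots_meta(index=False, follow=True))   # → noindex,follow
print(robots_meta(index=False, follow=False))  # → noindex,nofollow
```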

  • http://www.petstoreunlimited.com They get good grades from the on-page tool. The links are not amazing, but they are not super spammy either. Yet the site ranks for nothing they target. Any reason why?

    | Atomicx
    0

  • We're moving to a new site with a new site structure. The old site has numerous backlinks to past events that won't be published on the new site. The new site will have about 60 future events that are currently active on the old site as well. I was wondering about the best way to move forward with the 301 redirect plan. I was considering redirecting the old site structure to an "archive.ourdomain.co.uk" subdomain and redirecting the 60 or so active events to their equivalents on the new site. Would this be a sensible plan? Also, for the active events, is there any difference between redirecting the old page to the archive page and then forwarding on to the equivalent new page, versus redirecting the old page directly to the new page?

    | chanm79
    0

  • Hi, I have always thought that if there were 2 links on a single page, both going to the same URL, PR wouldn't be passed through both. I watched a Matt Cutts video and he was saying that in the original algorithm it was built in that both links would pass PR. So, for example, if I guest posted an article of say 1000 words and it had 2 links pointing to the same URL, would they both work? Cheers

    | Bondara
    0

  • Each product is an item of jewellery based on a letter of the alphabet. At present all 26 are indexed, but as you can guess, they all share the same description, title and URL (apart from the change in letter). What I was going to do was set all but one to noindex, create new descriptions, and revert back to index. But then that got me thinking: because of stop words, will the titles be seen as duplicates? "letter bracelet", "letter a bracelet", "letter i bracelet"

    | MickEdwards
    0

  • We recently launched a redesign and I noticed from running a crawl using Screaming Frog SEO that our redirects are all being seen as 302. I know 302 is a temporary redirect, but does this hurt SEO rankings when all our redirects are being seen as 302s instead of 301s? Also, the way I implemented the redirects was by using the IIS Manager Tool. Is it possible that our IIS Manager Tool is not configured properly and instead of adding the redirect as 301, it is inserting it into the rewrite file as 302s?

    | rexjoec
    0
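If the redirects live in a URL Rewrite section of web.config, the status code is controlled by the redirectType attribute on the action: "Permanent" emits a 301, while "Found" emits a 302, which could explain what Screaming Frog is reporting. A minimal sketch, with hypothetical paths, assuming the IIS URL Rewrite module:

```xml
<rewrite>
  <rules>
    <rule name="old-page-301" stopProcessing="true">
      <match url="^old-page$" />
      <!-- redirectType="Permanent" issues a 301; "Found" would issue a 302 -->
      <action type="Redirect" url="/new-page" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

Checking the generated rewrite section for the redirectType value would confirm whether the tool is writing 302-style rules.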

  • I ran a Moz crawl, and the tag pages are coming up with missing descriptions. Is it okay if the tag pages are noindexed, given that they are not coming up as duplicates? Some examples:
    Gagan Modi - Blog entries tagged in finance: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=67&Itemid=91
    Gagan Modi - Blog entries tagged in nissan: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=68&Itemid=91
    Gagan Modi - Blog entries tagged in dc avanti: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=69&Itemid=91
    Gagan Modi - Blog entries tagged in mahindra: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=7&Itemid=91
    Gagan Modi - Blog entries tagged in budget: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=72&Itemid=91
    Gagan Modi - Blog entries tagged in datsun: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=blogger&layout=statistic&id=128&stat=tag&tagid=73&Itemid=91

    | Modi
    0

  • You are able to view our website as either http or https on all pages. For example, you can type "http://mywebsite.com/index.html" and the site will remain on http: as you navigate; you can also type "https://mywebsite.com/index.html" and the site will remain on https: as you navigate. My question is: if you can view the entire site using either http or https, is this being seen as duplicate content/pages? Does the same hold true for "www.mywebsite.com" vs. "mywebsite.com"? Thanks!

    | rexjoec
    1
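Whichever variant is chosen as canonical, the usual fix is a server-side 301 collapsing the other three combinations (http/https × www/bare) onto it. This sketch only illustrates the collapse itself, with the canonical host taken from the question as an assumption:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, host: str = "www.mywebsite.com") -> str:
    """Normalize scheme/host variants of the same page to one canonical URL,
    so http/https and www/bare-domain duplicates collapse to a single form."""
    parts = urlsplit(url)
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

variants = [
    "http://mywebsite.com/index.html",
    "https://www.mywebsite.com/index.html",
]
# All variants collapse to one canonical URL.
print({canonicalize(u) for u in variants})
```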

  • I have a client who previously registered 20 unique domain names tied to their company name and services. They use all of these domains to forward to their main website to try to capture additional traffic. Would you suggest that we consolidate all of the domains by 301 redirecting them to the main website? I am trying to find a good article that shows the SEO implications of using many domains that forward to a main website. Any suggestions would be greatly appreciated.

    | Prager
    0

  • Hello Mozzers, I'm trying to find the best possible solution for this situation. There is an e-commerce website, and since it has grown so much, we are looking to move several categories to a different domain. The reason for this is that we are introducing a completely different product group (example: we have products related to watches and everything related to the watch industry, but now we are introducing leather products: wallets, bags, etc.). Do you think it is worth it to move the new categories to a new domain in order to better target this product group? If so, which is the best way to do it: 301 redirects, or leaving the products on this site and building a new site with slightly different product descriptions and names? Regards, Nenad

    | Uniline
    0

  • Hi everyone, we have a website which will have content tailored for a few locations: USA: www.site.com; Europe EN: www.site.com/eu; Canada FR: www.site.com/fr-ca. Link hreflang and the GWT geotargeting option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!

    | AxialDev
    0
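One relevant detail for the question above: hreflang values are language codes or language-region codes (plus x-default); there is no continent-level value, so /eu cannot be targeted at "Europe" directly. A common workaround is language-region for the US site and a bare language code as the catch-all for other English speakers. A sketch of the tags, using the URLs from the question; the en-us/en mapping itself is an assumption, not something the poster specified:

```python
def hreflang_tag(lang: str, url: str) -> str:
    """Build one hreflang alternate link for the page <head>."""
    return f'<link rel="alternate" hreflang="{lang}" href="{url}">'

# Hypothetical mapping: en-us for the US site, bare "en" as the
# catch-all for English speakers elsewhere (no "europe" value exists).
tags = [
    hreflang_tag("en-us", "http://www.site.com/"),
    hreflang_tag("en", "http://www.site.com/eu"),
    hreflang_tag("fr-ca", "http://www.site.com/fr-ca"),
]
print("\n".join(tags))
```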

  • I am looking for a good SEO for my tech news site and would like your help in recommending one that will fit my budget of 300-500 per month. I have contacted many firms in the Moz directory of recommended firms but found they are out of my monthly price range, and a Google search for a decent SEO can be scary, with so many so-called SEO companies out there. I would like to work with an experienced SEO individual who can come up with a great plan for our site and also implement it. We just had a forensic SEO audit done with Alan Blieweiss and implemented his suggestions, and we are now looking for someone to work with long-term for the rest of our SEO needs. I understand that I cannot afford the top SEO firms or industry leaders, but with your help and suggestions I am sure we can find a great SEO we can afford. Please reply here or message me.

    | chrisyak
    0

  • I use wistia.com to embed videos on my company website. Because of the way my page is set up, I have to embed my video using the iframe embed option. Since I'm using this option, my video transcript isn't going to be SEO friendly. Can I manually insert a title attribute within the video iframe code containing the entire transcript? Would that be a good workaround so that the transcript can help my page optimization? Thanks! Andrea

    | JimDirectMailCoach
    0

  • Hi, I've noticed that on many pages Google shows in the SERPs titles that it chose for me, and not necessarily the ones coded in the title tag (usually a small difference, like an added suffix, etc.). Why is that? Thanks

    | BeytzNet
    0

  • Hi there. I'm an SEO expert myself. I am building quality, authoritative backlinks with branded anchor text. My website now has over 1.5k backlinks with branded anchor text and generic keywords, but it is still not showing in the SERPs for my targeted niche or anchor text. Do I need to build backlinks with exact-match anchor text, and if so, how many? Thanks in advance.

    | globalitsoft
    0

  • I posted a question a week ago about a client with really awful SEO errors, to the tune of over 75k violations, including massive duplicate content (over 8000 pages) and pages with too many links (the homepage alone has over 300 links). I was thinking, why not try to nofollow the product pages, which are the ones causing so many issues? They have super low authority, are wasting spider energy, and have no incoming links. BTW, the entire site is an ecommerce site with millions of products, and each product is its own WordPress blog post... YIKES! Thoughts?

    | runnerkik
    1
