
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi everyone, On my site I have about 1,000 hotel listing pages, each of which uses a lightbox photo gallery that displays 10-50 photos when you click on it. In the code, these photos are each surrounded with an "a href", as they rotate when you click on them. Going through my Moz analytics I see that these photos are being counted by Moz as internal links (they point to an image on the site), and Moz suggests that I reduce the number of links on these pages. I also just watched Matt Cutts' new video where he says to disregard the old "100 links max on a page" rule, yet also states that each link does divide your PageRank. Do you think that this applies to links in an image gallery? We could just switch to another viewer that doesn't use "a href" if we think this is really an issue. Is it worth the bother? Thanks.

    | TomNYC
    0

  • Hi guys, So we have an e-commerce website and we have some products that are exactly the same but come in different colours. Let's say for example we have a Samsonite Chronolite, and this bag comes in 55cm, 65cm and 75cm variations. The same bag may also come in 4 different colours. The bags are the same and therefore have the same information, apart from the title tag varying by size and colour; the descriptions are the same. How do I avoid Google thinking I am duplicating pages or have duplicated pages? Google thinks we have duplicates when the scenario is as I have explained. Any suggestions? Best regards,

    | iBags
    2

  • Hi all, We're trying to get our canonical situation straightened out. We have a section of our site with 100 product pages in it (in our case a city with hotels that we've reviewed), and we have a single page where we list them all out--an "all products" page called "all.html." However, because we have 100 and that's a lot for a user to see at once, we plan to first show only 50 on "all.html." When the user scrolls down to the bottom, we use AJAX to place another 50 on the page (these come from another page called "more.html" and are placed onto "all.html"). So, as you scroll down from the front end, you see "all.html" with 100 listings. We have other listings pages that are sorted and filtered subsets of this list with little or no unique content. Thus, we want to place a canonical on those pages. Question: Should the canonical point to "all.html"? Would spiders get confused, because they see that all.html is only half the listings? Is it dangerous to dynamically place content on a page that's used as a canonical? Is this a non-issue? Thanks, Tom

    | TomNYC
    0
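For the setup described above, here is a sketch of what the canonical markup on the sorted/filtered subset pages might look like (the URLs are illustrative stand-ins, not a confirmed fix):

```html
<!-- On each sorted/filtered listing page, e.g. /city/hotels-sorted-by-price.html
     (hypothetical path), point search engines at the full list -->
<head>
  <link rel="canonical" href="http://www.example.com/city/all.html">
</head>
```

One caveat worth checking: crawlers generally index what the server initially returns for all.html, so listings injected later via AJAX may not be seen as part of that page; serving all 100 listings in the initial HTML (with lazy-loaded images) sidesteps the question entirely.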

  • Hi, We are an SEO company based in Scotland and have taken on a project where the client works in the UK but has distribution in mainland Europe and the US. He currently works off 3 websites, targeted at each area: UK, US and mainland Europe. We are going to rebuild one site and have each area on the site; however, we are unsure whether subfolders or subdomains would work better. My personal opinion is that subdomains would be better, but I don't have information to back this up. Can anyone advise? Any advice on geotargeting SEO would also be appreciated! Many thanks, Chris

    | trickcreative
    0

  • Hi Folks, I am trying to figure out the best way to get our company's 38 U.S. locations into the major local directories. To start, I'd like to get us listed in the major ones: Google, Yahoo, Bing, and Yelp. I do have the resources myself here on staff to do everything manually, so I don't necessarily need a service like Yext (but would also like any opinions on that offering if anyone can offer them). But, from what I know, every time you try to claim a local listing within each platform, you have to confirm your existence there somehow, whether by a mailed postcard or some sort of automated call. Considering that we want to manage all social and local platforms here at corporate, how can we do this? I am not physically at these locations, but I'm sure it is possible to manage everything through one account. The addresses will be local, but the phone numbers on each local profile will route to our customer service here at corporate, because the local locations are mostly administrative. In other words, business is booked through corporate and carried out at local destinations. Thoughts/comments?
    I want to do what's best for SEO and also don't want to harm anything or our link equity. Thanks,
    Pedram

    | CSawatzky
    0

  • Hi Guys, I just transferred my old site to a new one and now have subfolder language versions. My default pages, from the front end and the sitemap, don't show /en after www.mysite.com. The only translation I have is Spanish, where Google will crawl www.mysite.com/es. 1. On the SERPs of Google and Bing, every URL that is crawled shows an extra "/en" in the path. I find that very weird considering there is no physical /en in my URLs. When I select the link it automatically redirects to its default and natural page (no /en). No canonical tags show /en either, ONLY the SERPs. Should robots.txt be updated to "Disallow: /en"? 2. While I did the site transfer, we altered some of the category URLs in our domain, so we've had a lot of 301 redirects. But when searching specific keywords in the SERPs, the #1 ranked URL shows up as our old URL, which redirects to a 404 page, and our newly created URL shows up at #2 and goes to the correct page. Is there any way to tell Google to stop showing our old URLs in the SERPs? And would the "Fetch as Google" option in GWT be a good way to submit all of my URLs so Google's bots can crawl the right pages only? Direct message me if you want real examples. Thank you so much!

    | Shawn124
    0

  • Hey, My company currently has one chief website with about 500-600 other domains that all feature the same material as the chief website. These domains have been around for about 5 years and have actually picked up some link traffic. I have all of these identical web pages utilizing rel=canonical, but I was wondering if I would be better served, for SEO purposes, to 301 redirect all of these sites to their respective pages on our chief website? If I add 500 301 redirects, will the major search engines consider this to be black-hat link-building, even though the sites are related and technically already feature the same content? For an example, the chief website is www.1099pro.com and I would 301 redirect the below sites to the chief site: 1099softwarepro.com 1099softwarepro.info 1099softwarepro.net 1099softwarepro.biz 1099softwareprofessionals.com 1099softwareprofessionals.info ...you get the point

    | Stew222
    0

  • I have a client on Shopify. All categories have correct canonical links. However, the links from all menus, category pages, etc. follow this structure: /collections/COLLECTION_NAME/products/PRODUCT_NAME, but the canonical link on the above product URL is: /products/PRODUCT_NAME. I have a feeling this is hurting our product detail pages' SEO. Our collection pages are ranking fine, but for some reason the detail pages aren't. It could be that they are deeper, but I am trying to make sure nothing big is causing it first before I get into the smaller factors. Any best practices on this?

    | no6thgear
    0

  • Do I have to use my targeted keyword in every sub-page URL or not? For example, I am currently using URLs in this format: fundingtype.html, litigation-funding.html, legal-funding.html, financingservices.html, process.html. If I rewrite all the URLs to include the targeted keyword, in this format: lawsuit-loans-fundingtype.html, lawsuit-loans-litigation-funding.html, lawsuit-loans-legal-funding.html, lawsuit-loans-financingservices.html, lawsuit-loans-process.html, which type of URL is more effective for SEO?

    | JulieWhite
    0

  • We own this site http://www.discountstickerprinting.co.uk/ and I'm a little concerned: if you right-click a tab in the tabbed content section (the price tab, for example) and choose "open in new tab", you end up at a URL like http://www.discountstickerprinting.co.uk/#tabThree. Does this mean that our content is being duplicated onto another page? If so, what should I do?

    | BobAnderson
    0

  • Hi there, some SEO companies did some work on our site. Instead of helping us, they killed us by adding some very bad links all over the web. The disavow process is not easy at all, so I was wondering if there is any company that offers this as a service. We have almost 1,000 links that we want to get rid of. Any suggestions?

    | iBags
    0

  • If I create a good community in my particular field on my SEO site and have a quality Q&A section like this (ripping off Moz's idea here, sorry, I hope that's OK), will the long-term returns be worth the effort of creating and managing it? Is user-created content of as much use as I think it will be?

    | mark_baird
    0

  • I was asked to be a guest contributor to an online bridal magazine, but I'm not sure if there would be any linking value in it, as it is a PDF. Of course, there would be readers who might remember me and come visit our site, but for linking purposes I don't see any SEO benefit. I am a newbie, though, so I might be missing something. The magazines are in a PDF format. Here is a link: http://bridesclub.com/wedding-magazines/Spotlight-NW-Fall-2013.pdf You can also view it online, but they redirect to www.issuu.com/ and the links are nofollowed. Anyone have any advice? Thanks in advance!

    | yatesandcojewelers
    0

  • Link building question: I want to rank in Google for www.topnotchlawsuitloans.com, so I have to build backlinks with "lawsuit loans" anchor text. My main question is: should I build or gain backlinks for the root domain only, or also for a sub-page of my website? www.topnotchlawsuitloans.com/lawsuit-funding-philadelphia.html is on page #6, so should I build backlinks for that URL too? What is an effective strategy: gain backlinks for the main page, or build backlinks to all sub-pages? And how many backlinks per keyword and per page are good for a website?

    | JulieWhite
    0

  • Hi Mozzers - I'm just looking at a site which has been damaged by a very poor site migration. Basically, the old URLs were 301'd to a page on the new website (not a 404) telling everyone the page no longer existed; they did not 301 old pages to equivalent new pages. I just checked Google WMT and saw 1,000 crawl errors - basically the old URLs. This migration was done back in February, since when traffic to the website has never recovered. Should I fix this now? Is it worth implementing the correct 301s now, after such a time lapse?

    | McTaggart
    0

  • Hi guys, I have some doubts about the correct URL structure for a new site. The question is about how to show the city, the district, and also the filters. I would do this: www.domain.com/category/city/district, but maybe it is better to do this: www.domain.com/category/city-district. I also have 3 filters ("individual/collective", "indoor/outdoor" and "young/adult") that are not really interesting for the queries, so where and how do I put these filters? At the end of the URL, like this: www.domain.com/category/city/district#adult#outdoor#collective? Really, I don't know what to do with the filters. Please see if you can help me with that. I'd also be very interested to know whether it is better to use www.domain.com/category-city or domain.com/category/city, and what the difference is. Thank you very much!

    | omarmoscatt
    0

  • I'm working with a client who has 301 redirected thousands of URLs from their primary subdomain to a new subdomain (these are unimportant pages with regards to link equity). These URLs are still appearing in Google's results under the primary domain, rather than the new subdomain. This is problematic because it's creating an artificial index bloat issue. These URLs make up over 90% of the URLs indexed. My experience has been that URLs that have been 301 redirected are removed from the index over time and replaced by the new destination URL. But it has been several months, close to a year even, and they're still in the index. Any recommendations on how to speed up the process of removing the 301 redirected URLs from Google's index? Will Google, or any search engine for that matter, process a noindex meta tag if the URL's been redirected?

    | trung.ngo
    0

  • Our home page has the canonical tag pointing to itself (something from WordPress, I understand). Is there any positive or negative effect that anyone is aware of from having pages canonicaled to themselves?

    | halloranc
    0

  • What are some of the best ways to earn and build quality relevant links that will increase exposure to your target market in addition to assisting search rankings? I personally find that local niche directories and PR are the best ways to accomplish this without having content to "earn links"..what else works? Any interesting ideas??

    | RickyShockley
    0

  • Hi all, I would like associate content on "Page A" with "Page B".  The content is not the same, but we want to tell Google it should be associated. Is there an easy way to do this?

    | Viewpoints
    1

  • A client operates four well known brands inside the industry, and each website has a relatively (for the industry) high amount of Domain authority across these four sites. They just formed a group of these four brands and launched a new website with no authority. What steps would you take to instantly grow the authority of the new site?

    | Socialrocketco
    1

  • Hi Moz Community, In July, we changed our homepage to https via a 301 redirect from http (the only page on our site with https). Our homepage receives an A grade in the 'On Page Grader' by Moz for our desired keyword. We have increased our backlink efforts directly to our homepage since we switched to the SSL homepage. However, we still have not increased in search ranking for our specific keyword. Is there something we could have missed when doing the 301 redirect (submitting a new sitemap, changing robots.txt files, or anything else?) that has resulted in Google not correctly accessing the https version? (The https page has been indexed by Google.) Any help would be greatly appreciated.

    | G.Anderson
    0

  • Hey guys, I recently created a new website for a client who was ranking #1 for the term "jupiter obgyn" but they have now dropped down to #4. This happened because their old home page was at www. instead of just jupiterobgyn.com. When you type in the www. version, it does take you to the root domain, but it's not carrying the old PA! The www. version of the page had a 22 PA and the new root-domain-hosted page is a 1. How can I fix it so that "link juice" carries over? Is this something I need to do in 1and1 (their web host) or within WordPress? Thanks!!!

    | RickyShockley
    0

  • Hello, I've noticed a lot of sites, usually WordPress (it seems to be the default), have the images in their posts clickable, loading their own page showing just the image, usually a .jpg page. I know these pages seem to be easily indexed into Google image search and can drive traffic to those specific pages... My questions are... 1. What is the point of driving traffic to a page that is just the image? There are no links to other pages, no ads, nothing... 2. Can you redirect these .jpg pages to the actual post page? I ask because on Google image search there are 3 links to click (website, image link, image page); when you click to view the image, it loads the .jpg page, so why not have that .jpg redirect to the real content page that has ads and also other links? Is this white-hat? 3. Do these pages with just images have any negative effect on optimization, since they are just images with no content? 4. Can you monetize these .jpg pages? 5. What is the best practice? I understand there is value in traffic, but what is the point of image traffic if I can't monetize those pages?

    | WebServiceConsulting.com
    0
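On question 2 above (redirecting the bare image pages to the real post): since these are typically WordPress attachment pages, a common approach is a redirect in the theme's image.php or attachment.php template. A minimal sketch, assuming each attachment has a parent post; wp_redirect() and get_permalink() are core WordPress functions, but whether this suits a given theme is an assumption:

```php
<?php
// image.php (theme template): send attachment pages to their parent post.
global $post;

if ( $post && $post->post_parent ) {
    // 301 so search engines consolidate the attachment page into the post.
    wp_redirect( get_permalink( $post->post_parent ), 301 );
    exit;
}

// Attachment has no parent post recorded: fall back to the homepage.
wp_redirect( home_url(), 301 );
exit;
```

This pattern is generally considered white-hat: it consolidates image-search traffic onto the monetized post rather than the bare .jpg page.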

  • Hi guys, My company has a number of websites which the main corporate site links to via its global navigation. This global navigation sits within a simple container with no HTML5 <nav> markup. Every time a new page gets created on the main corporate site, a backlink gets generated to those external sites, and the anchor text is always the same. As the corporate site publishes new pages frequently, I'm wondering whether this ongoing building of links using the same anchor text would be a cause of concern for Google (i.e. too many links from the same domain with the same anchor text). Would really appreciate some insight here, and what could be done to fix it if it's an issue. Many thanks

    | cos2030
    0

  • Hi all, I have been actively pursuing bloggers for my site in order to build PageRank. My website sells women's undergarments that are more on the exotic end. I noticed a large number of prospective bloggers demand product samples. As already confirmed, bloggers that are given "free" samples should use a rel=nofollow attribute in their links. Unfortunately this does not build my PageRank or transfer link juice. My question is this: is it advisable for them to also write additional posts and include dofollow links? The idea is for the blogger to use a nofollow when posting about the sample and a regular link for a secondary post at a later time. What are your thoughts concerning this matter?

    | 90miLLA
    0

  • For example: Can I create a page for each store with its location, including a map? Would it assist in local results?
    Are there any other ways to "push" local results for a nationwide site? Random example,
    for a computer store selling computers:
    "buy computers NJ"
    "buy computers Boston" Thanks

    | BeytzNet
    0

  • Hello, Our client's website was ranking in a high position in Google for a handful of keywords. Targeted keywords were in the title tag, but for some reason Google is not showing those keywords in the title tag anymore. Instead, Google shows the same keywords in a different language. I think there is some multilingual title tag problem. Any ideas how to solve it? Thanks guys

    | serp-eesti
    0

  • We run a multidomain e-commerce website that targets each country respectively: .be -> Belgium, .co.uk -> United Kingdom, etc., and .com for all other countries. We also serve our product images via a media subdomain, e.g. "media.ourdomain.be/image.jpg".
    This means that all TLDs serve the images from the .be media subdomain, which is actually seen as an outbound link. We are considering changing this setup so that images are served from the same domain as the current TLD, which would make more sense: .be will serve images from media.ourdomain.be, .co.uk from media.ourdomain.co.uk, etc. My question is: does Google image search take the extension of the TLD into consideration, so that, for example, German users will be more likely to see an image that is served on a .de domain?

    | jef2220
    0

  • Hi, Is there any effect on SEO from the ratio of linking root domains to non-linking root domains, and if so, what is the effect? Thanks

    | halloranc
    0

  • Hi all, Wondering if anyone could offer an opinion on this... I am talking to a new client who offers kitchen installation and design. They have a central headquarters and cover a 100-mile radius of their location. A lot of the search terms they are aiming to target (kitchen design, kitchen fitters, etc.) return localised results. This is where my issue lies. I have worked with plenty of clients in the past which have a physical presence in multiple locations and have marked up the site so that it ranks for each of the stores, but trying to make one site appear in many locations where it doesn't have an address is a different issue completely. Not only do they have only one address, they also have only one phone number. We will target, as best we can, the non-localised keywords, but need to work out what to do to cover the locations 20/30/40 miles from the office which they cover. I welcome any opinions on this, please.

    | Grumpy_Carl
    0

  • For the website in question, the first domain alphabetically on the shared hosting space, strange search results are appearing on the SERP for keywords associated with the site. Here is an example: a search for "unique company name" shows www.uniquecompanyname.com as the top result, but on pages 2 and 3 we are getting results for the same content under other domains hosted on the same server. Here are some examples with the domain names replaced:
    UNIQUE DOMAIN NAME PAGE TITLE / ftp.DOMAIN2.com/?action=news&id=63 / META DESCRIPTION TEXT
    UNIQUE DOMAIN NAME PAGE TITLE 2 / www.DOMAIN3.com/?action=news&id=120 / META DESCRIPTION TEXT 2
    UNIQUE DOMAIN NAME PAGE TITLE 2 / www.DOMAIN4.com/?action=news&id=120 / META DESCRIPTION TEXT 2
    UNIQUE DOMAIN NAME PAGE TITLE 3 / mail.DOMAIN5.com/?action=category&id=17 / META DESCRIPTION TEXT 3
    ns5.DOMAIN6.com/?action=article&id=27
    There are more, but those are just some examples. These other domains listed are other customers' domains on the same shared VPS server. When clicking such a result, the browser URL still shows the other customer's domain name, but the content is usually the 404 page; the page title and meta description on that page are not displayed the same as on the SERP. As far as we can tell, this is the only domain this is occurring for. So far, no crawl errors detected in Webmaster Tools, and the Moz crawl has not completed yet.

    | Motava
    0

  • Hey guys, hoping you can help me out here. I've been tasked with raising several sites' domain authority to a level of 30. Right now, many of them are hovering around 20. Three weeks into this project and our numbers have dropped 1-2 points on average but I don't think our efforts would reflect that this quickly. From what I've read online, a good strategy is guest posting on relevant sites and collecting links from sites with higher DAs. I've also read at least one Moz article about this potentially being ineffective. I've read some of the related posts but they seem mostly dated and the answers didn't seem to help me. Hoping someone with some experience with this can help me out, I appreciate it.

    | DustinAB
    0

  • If there are two fairly equally strong search terms that mean the same thing and want to direct people to a single page on my site, is it recommended that both phrases be placed in the H1? For example, Bacon, Lettuce and Tomato Sandwich (BLT Sandwich) I don't want to rank high for only one phrase while ranking lower (and potentially losing users) for the second phrase. What's the best strategy for this?

    | hamackey
    0

  • Almost every day I get this: "Query parameters for normalization found on www.sitename.com. Site: www.sitename.com. Date: 3/26/2013. Priority: Low. Bing has detected new parameters in your URLs." Anyone know why? We aren't changing anything. I have read it has to do with internal URLs, but I can't find out which internal URLs this is a problem with.

    | EcommerceSite
    0

  • We will be starting local SEO efforts on a medical practice that has 4 locations & 15 doctors each location (so 60 listings total). I will submit each doctor & each location to InfoGroup, LocalEze, Axciom & Factual. Also, I will only submit each location (not doctors) to Google. The problem I'm seeing is the fact that each listing would have the same exact phone number - it all goes to one main routing center. What kind of problems could come of this? Do we need a separate phone numbers for each of the four locations (at the very least)?

    | JohnWeb12
    0

  • My website has 43,000 pages indexed by Google. Almost all of these pages are URLs that have parameters in them, creating duplicate content. I have external links pointing to those URLs that have parameters in them. If I add the canonical tag to these parameter URLs, will that remove those pages from the Google index, or do I need to do something more to remove those pages from the index? Ex: www.website.com/boats/show/tuna-fishing/?TID=shkfsvdi_dc%ficol (has link pointing here) vs. www.website.com/boats/show/tuna-fishing/ (canonical URL) Thanks for your help. Rob

    | partnerf
    0

  • I was hit by the Penguin 2.0 update some five months back. I believe that an algorithmic penalty has been applied to my sites. While the cleanup work etc. has been done, there is certainly no recovery, and I also notice a lack of recovery stories. In fact, I think anyone affected cannot recover because a recalculation has not happened. Does anyone think that a recalculation of the Penguin 2.0 penalties has happened? If so, why do you think that?

    | Jurnii
    0

  • Hello all, Let me first try to explain what our company does and what it is trying to achieve. Our company has an online store, sells products in 3 different countries, with two languages for each country. Currently we have one site, which is open to all countries; what we are trying to achieve is to make 3 different stores for these 3 countries, so we can have better control over the prices in each country. We are going to use GeoIP to redirect the user to the local store for his country. The suggested new structure is to add subfolders as follows: www.example.com/ca-en
    www.example.com/ca-fr
    www.example.com/us-en
    ... If a visitor is located outside these 3 countries, then she'll be redirected to the root directory www.example.com/en. We can't afford to expand our SEO team to optimize new pages for the local markets; it's not the priority for now. The main objective is to be able to control the prices for different markets, so to eliminate the duplicate-content issue we'll use canonical tags. Now, knowing our objective for the new URL structure, I have two questions: 1. Which redirect should we use, 301 or 302?
    If we choose 301, then which version of the site will get the link juice (i.e., /ca-en or /us-en)?
    If we choose 302, will the link juice remain with the original links? Is it healthy to use 302 for long-term redirections? 2. Knowing that Google's bots come from US IPs, does that mean that the other versions of the site won't be crawled (e.g., www.example.com/ca-fr)? This is especially important for us as we are using AdWords, and unindexed pages will affect our Quality Score badly. I'd like to know if you have another structure in mind that would be better than this proposed one. Your help is highly appreciated.
    Thanks in advance.

    | ajarad
    0
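One pattern worth sketching for the structure proposed above: hreflang annotations are the usual way to mark these subfolders as country/language alternates rather than duplicates (example.com and the paths are the poster's own examples; this is a sketch, not a confirmed recommendation):

```html
<!-- In the <head> of each version of a given page; every version lists all alternates -->
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca-en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.example.com/ca-fr/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us-en/" />
<!-- x-default: the fallback version for visitors from everywhere else -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/en/" />
```

On question 2: Googlebot does crawl mostly from US IPs, so an unconditional GeoIP 301 can prevent the other versions from ever being fetched; a common compromise is to suggest the local store with a banner (or a one-time 302) rather than a hard redirect.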

  • Hello Folks, I tried going through the 301 answers but could not find any question similar to mine. The issue is that we have a listing page with products like this: /used-peugeot/used-toyota-corolla. As you can see, this URL is not really ideal and I want to redirect it to /used-toyota/corolla using mod_rewrite. The redirect will be a 301. My concern is that the URL in the listing page won't change to /used-toyota/corolla, so the 301 will be 'permanently' in place, and I was wondering if the 301ed URL will lose some link juice because of that. Now, with a 301 being a 'permanent' redirect, one would assume it should not be an issue, but I just wanted to be sure that I am correct in assuming so. Thank you for your time.

    | nirpan
    0
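A minimal mod_rewrite sketch of the redirect described above (the paths are the poster's examples; the exact pattern would need adjusting to the site's real URL scheme):

```apache
# .htaccess — permanently redirect the old listing URL to the clean one
RewriteEngine On
RewriteRule ^used-peugeot/used-toyota-corolla/?$ /used-toyota/corolla [R=301,L]
```

A 301 passes most of the link equity, but as long as the listing page keeps linking to the old URL, every visit takes an extra hop; updating the internal links to point straight at /used-toyota/corolla is still the cleaner long-term fix.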

  • I'm a newbie with SEO and I have a question regarding product descriptions. Let's say I am selling 100 dog ID tags. The tags are all made of the same materials and are the same size, just with different designs. Now for the product description, do I need to write a different description for all 100 tags? This is an example of a short product description (there's more) for all the pet tags: Personalized with 4 lines of information and 20 characters in each line. Lifetime guarantee - If your pet ID tag ever becomes illegible, we will replace it free of charge. Solid one-piece construction - No glued or "sandwiched" materials to wear out or fall apart. Split ring for collar attachment included with EVERY tag. Countless uses - School backpacks, luggage, fashion accessories, and many more! All of the above information pertains to all the pet tags. Can all my product descriptions contain that information, or will I need to modify this 100 times for each individual pet tag? I read up a lot on duplicate content, so I am slightly confused. Will this hurt my SEO? Thanks, Keith

    | ktw016
    0

  • I have created a sitemap file as per Google Web Master Tools instructions. I have it saved as a .txt file. Am I right in thinking that this needs to be uploaded as a .xml file? If so, how do I convert this to a XML? I have tried but it seems to corrupt - there must be a simple way to do this?!

    | DHS_SH
    0
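Two notes on the sitemap question above: Google also accepts plain-text sitemaps (one URL per line), so conversion may not be strictly necessary; and if you do want XML, don't just rename the file — each URL must be wrapped in sitemap markup. A minimal conversion sketch (filenames are hypothetical):

```python
# txt_to_xml_sitemap.py — wrap a one-URL-per-line text sitemap in sitemap.org XML
from xml.sax.saxutils import escape

def txt_to_xml(lines):
    """Convert an iterable of URL lines into a sitemap.org XML string."""
    urls = [line.strip() for line in lines if line.strip()]
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

# Usage (hypothetical filenames):
#   with open("sitemap.txt") as src, open("sitemap.xml", "w") as dst:
#       dst.write(txt_to_xml(src))
```

The escape() call matters: URLs containing "&" are invalid in raw XML and will make validators (and Google) reject the file.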

  • Hello Mozzers - just looking at a website which has duplicate keyphrases in its page titles... So you have [keyphrase 1] | [exact match Keyphrase 1] Now I happen to know this particular site has suffered a dramatic fall in traffic - the SEO agency working on the site had advised the client to duplicate keyphrases. Hard to believe, huh! What I'm wondering is whether this extensive exact match keyphrase duplication might've been enough to attract a penalty? Your thoughts would be welcome.

    | McTaggart
    0

  • Hi All, My site was ranking well for a long time and suddenly can't be seen at all any more. I have been trying to figure this out for some time and can't get to the bottom of it. Funny thing is also even when searching for my site with a keyword "snowboard gulmarg" my URL www.klinehinmalaya.com does not appear and somewhere way back in the listing my another page www.klinehimalaya.com/packages.php comes up. Any help would be good right now. Thanks in advance, Catherine.

    | caherinechan
    0

  • Hello Mozzers, If you are amending your urls - 301ing to new URLs - when in the process should you update your sitemap to reflect the new urls? I have heard some suggest you should submit a new sitemap alongside old sitemap to support indexing of new URLs, but I've no idea whether that advice is valid or not. Thanks in advance, Luke

    | McTaggart
    0

  • Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors (404 Not Found). I found that those 44 blog pages were set to private mode (WordPress theme), causing the 404 issue.
    I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to private mode. Well, I noticed that all 44 blog posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages in private mode to avoid getting hit for duplicate content issues. All blog posts published after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages? A. Remove the 44 blog posts that used verbatim scraped content from other external blogs.
    B. Update the content on each of the 44 blog posts, then set them to public mode instead of private.
    C. ? (open to recommendations) I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing the posts. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision...
    Thanks

    | SEOEND
    0

  • I'm working on a site that has 10 languages served from centrally located core files in Magento.  So each language has its own TLD with localised content served from SQL.  GWT has also had the preferred country set for each domain. The problem is that each and every domain is indexed in each of the local Google indexes.  In DE Google the FR homepage is ranking higher for the brand keyword.  I kind of think I am wasting my time with hreflang but would like some advice whether this is an option or clues how I can handle this situation best.

    | MickEdwards
    0

  • If a single author contributes to multiple sites, should each site have its own author page (tying to the same single gg+ account)? Ex. One author > one gg+ account > multiple author pages (one per site) Or, should all sites publishing his content link to a single author page/bio on a single, main site? Ex. One author > one gg+ account > a single author page on one site (all other sites link to this author page) In this event, where would the 'contributor to' link point for the additional sites he is contributing to, the homepage? Thanks!

    | seagreen
    0

  • We are trying to work out the best structure for our blog, as we want each page to rank as highly as possible. We were looking at a flat structure similar to http://www.hongkiat.com/blog/, where every post sits directly under /blog/ rather than in categories, although visitors can browse different categories from the top buttons on the page (Photoshop, icons, etc.). Or we were going to go for the structured way: blog/photoshop/blog-post.html. The only problem is that we will end up at least 4 levels deep with this, and at least 80 characters in the URL. Any help would be appreciated. Thanks, Shaun

    | BobAnderson
    0

  • My sitemap has been submitted to Bing webmaster tools well over a year ago and I have never had any problems. Starting last week it showed failed, for some reason it can't reach it. I have resubmitted several times and it fails every time. I can go to the url with no problems, and Google Webmaster Tools does not have any problems. We have made no changes in over a year to how the sitemap is made and submitted. Anyone have any ideas?

    | EcommerceSite
    0
