
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Howdy Mozzers, I have a problem with the way Google now tries not to show more than one search result per site on the first page. As in, it is a lot harder to rank in positions 1-10 twice with different pages. Some of my pages have similar yet different page titles, so they use the same first two keywords and then a variable, such as '(keyword) (keyword) installations' or '(keyword) (keyword) surveys'. Then when I search for '(keyword) (keyword)' they all appear at the start of page two, with only ever one of them moving onto the end of page one. Now, it could just be that they are not quite optimised for page 1, but I think it is more that Google is holding pages back so they don't flood page 1. Any help on this? And also, is there a problem with having similar page titles across pages? Cheers

    | Hughescov
    0

  • Hi everyone, So, I have some general questions about title tags. My questions are as follows: 1. If I have a title tag like 'Commercial bathroom installation', will I show up for 'commercial bathroom' or only 'commercial bathroom installation'? The reason I ask is, I'm aiming for 'commercial bathroom', which has more search volume, but here is where the problem comes in: 'Commercial bathroom installation' is the more compelling title. Ideally I'm aiming for 'commercial bathroom', so I'm in a bit of a conundrum, as you can see. 2. My second question is, if I have 'Bath Review and Shower Review' for my title tag, will I show up for 'bath review' individually and 'shower review' individually, or only when someone searches that exact query? I hope that makes sense. Thanks, Peter

    | PeterRota
    0

  • Any suggestions on the best ways to get a new site's pages indexed? I was thinking of getting high-PR inbound links on Fiverr, but that's always a little risky, right? Thanks for your opinions.

    | mweidner2782
    0

  • Hey Mozers! I was having a quick chat with a friend the other day about doing SEO for a site that grows in page size at an exponential rate, and was just wondering how you would go about optimizing it? The example that we used was a site that allows users to upload videos and then has people vote on two videos against each other. So, if there are 100 uploaded videos and each of them is paired up with the other 99 to create a unique voting/battle page with its own unique URL, the site can get very large, VERY quickly. Meaning if just one more video is uploaded, there would be another 100 unique battle pages created. How exactly would you go about optimizing the site? My biggest area of confusion would be generating sitemaps. I'm aware of best practices with large sitemaps (i.e. having a sitemap of sitemaps, not going over 50k entries per sitemap, etc.). But how would you go about creating the sitemaps for this website if it's growing at an exponential rate, if at all? If you have any other questions feel free to ask and I'll clarify. Thanks! 😃 **TL;DR: How would you optimize a site that grows at an exponential rate?**

    | JordanChoo
    0
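On the sitemap worry specifically, this can be automated. Below is a minimal sketch in Python (stdlib only), assuming a hypothetical `/battle/<a>-vs-<b>/` URL scheme on a placeholder example.com domain: it pairs every video, chunks the URLs into sitemaps of at most 50,000 entries, and emits a sitemap index, so regenerating on a schedule (or on each upload) keeps pace with the growth.

```python
import math
from xml.etree import ElementTree as ET

SITEMAP_LIMIT = 50_000  # max URLs per sitemap file, per the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def battle_urls(video_ids, base="https://example.com"):
    # Every unordered pair of videos gets one battle page (hypothetical URL scheme).
    for i, a in enumerate(video_ids):
        for b in video_ids[i + 1:]:
            yield f"{base}/battle/{a}-vs-{b}/"

def build_sitemaps(urls):
    """Chunk URLs into <=50k sitemaps; return (index_xml, [sitemap_xml, ...])."""
    urls = list(urls)
    n_files = max(1, math.ceil(len(urls) / SITEMAP_LIMIT))
    sitemaps = []
    for i in range(n_files):
        urlset = ET.Element("urlset", xmlns=NS)
        for u in urls[i * SITEMAP_LIMIT:(i + 1) * SITEMAP_LIMIT]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    index = ET.Element("sitemapindex", xmlns=NS)
    for i in range(n_files):
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = \
            f"https://example.com/sitemap-{i}.xml"
    return ET.tostring(index, encoding="unicode"), sitemaps
```

Submitting just the index URL is then enough; new child sitemap files are discovered from the index on the next fetch.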

  • They opted for videos to explain to people what the website is about, but it ain't working for them. What steps would you take in order to get this site to rank higher without completely changing the design (a redesign is out of the question; they are low on funds)? They also built a blog on wordpress.com and added a .me domain to it. For obvious reasons I'm not mentioning the website.

    | ternit
    0

  • I have a .co.nz website and would like to rank on .com.au without setting up a new country-specific website for .com.au. What is the best way to do this?

    | SteveK64
    0

  • We are transitioning to responsive design and some of our pages will not scale properly, so we were thinking of adding the same content twice to the same URL (one copy would be simple text, for mobile, and the other would include the images, etc., for the desktop version), and the content shown would change based on the size of the screen. I'm not looking for another technical solution (I know Google specifies that you can dynamically serve different content based on user agent) -- I am wondering if anyone knows whether having the same exact content appear twice on the same URL will cause a problem with SEO (any historical tests or experience would be great). Thank you in advance.

    | nicole.healthline
    0
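For context, the pattern being described looks something like the sketch below (class names and copy are made up for illustration): both copies sit in the HTML of the one URL, and CSS media queries decide which renders.

```html
<!-- Both blocks live in the same HTML; CSS decides which one displays. -->
<div class="desktop-only">
  <h1>Product name</h1>
  <img src="/img/hero.jpg" alt="Product hero shot">
  <p>Full product description with images…</p>
</div>
<div class="mobile-only">
  <h1>Product name</h1>
  <p>Full product description, text only…</p>
</div>
<style>
  .mobile-only { display: none; }
  @media (max-width: 600px) {
    .desktop-only { display: none; }
    .mobile-only  { display: block; }
  }
</style>
```

Note that crawlers will see both copies in the source; whether they treat the hidden one as harmless responsive markup or as duplicated/hidden text is exactly the open question being asked here.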

  • I'm not sure exactly what option I should take next, but I'll run you through a few points: The page is optimized to a rank "A". The page has 350 backlinks, a strong social presence, and interlinking pages. High domain authority, an OK page authority. The domain ranks highly. Every other subdomain ranks highly. When I make a search, the first page that ranks for this domain is a product page within the exact subdomain I'm trying to rank for, followed by some external blogs I've written and then the rest of the product pages. I've submitted the URL to Webmaster Tools twice and yet it still will not rank for that keyword. The only time I see the page indexed is if I copy the exact URL into Google. Any help on this would be greatly appreciated. Thanks

    | Martin_Harris
    0

  • Hey Everyone, So the company I work for owns 2 domains. We have our main site, which offers our portfolio of products, and then we have a second domain, which we acquired, which focuses on one of our products (we also have this product available on our main site). So here is where things get tricky for me. This second site (the one that focuses on one of our products) has a HUGE following and a higher domain authority of 80, while our main site has an authority of 70. The higher-ups at the company want to merge the second, popular site with our main site. There are many problems with this in my opinion, since the following on this second site is very hardcore in the security space (I do not think that they will like being sent over to a more corporate site), BUT I want to figure out the SEO value that can be gained or lost from this merge. Some questions... By 301 redirecting the pages over to our main page, I am assuming that the SEO power carries along with it, so that these pages should still perform well? Will the domain authority of our main site go up with the merge, since we are bringing over pages with a lot of equity? In your opinion, does it make more sense to keep the site with the higher authority, since it is easier to host content that performs better? Anyone have any experience with this? SEO-wise, do you think that this is a good idea or a bad idea? Thanks a lot! Pat

    | PatBausemer
    0

  • We are moving to responsive design -- should we 301 redirect all of the old m.domain URLs to the corresponding domain.com URLs?

    | nicole.healthline
    0
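If the answer turns out to be yes, the redirects don't need to be written one per URL. A hedged sketch, assuming Apache with mod_rewrite and a 1:1 path mapping between the mobile and desktop hosts (domain.com is a placeholder):

```apache
RewriteEngine On
# Send every m.domain.com URL to the same path on www.domain.com
RewriteCond %{HTTP_HOST} ^m\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

If the old mobile paths don't map 1:1 to desktop paths, a lookup table of specific redirects would be needed instead of the single rule.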

  • I have over 750 URLs returning 404 errors. The majority of these pages have backlinks from other sites; however, the credibility of these pages from what I can see is somewhat dubious, mainly forums and sites with low DA & PA. It has been suggested placing 301 redirects on these pages, a nice easy solution; however, I am concerned that we could do more harm than good to our site's credibility and link building strategy going into 2013. I don't want to redirect these pages if it's going to cause a Panda/Penguin problem. Could I request manual removal or something of this nature? Thoughts appreciated.

    | Towelsrus
    0

  • Hi Mozzers, We have 2 e-commerce websites, Website A and Website B, sharing thousands of pages with duplicate product descriptions. Currently only the product pages on Website B are indexing, and we want Website A indexed instead. We added the rel canonical tag on each of Website B's product pages with a link towards the matching product on Website A. How long until Website B gets de-indexed and Website A gets indexed instead? Did we add the rel canonical tag correctly? Thanks!

    | Travis-W
    0
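For reference, the tag direction matters here: each product page on Website B should carry, in its `<head>`, a canonical pointing at its counterpart on Website A (URLs below are placeholders):

```html
<!-- On http://www.website-b.com/product-x, pointing at the preferred copy -->
<link rel="canonical" href="http://www.website-a.com/product-x">
```

Worth noting that a cross-domain canonical is treated as a strong hint rather than a directive, so de-indexing of Website B's pages can take weeks and is not guaranteed.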

  • Hi everyone, I've just signed up for Moz and I'm getting well and truly stuck in. I have just completed my first site crawl and have a frightening 5,363 errors and 25,319 warnings. The main culprit is the forum on my site, it contains hundreds of pages dating from as far back as 2002. It is full of Duplicate Content, Duplicate page titles and a fair few 404 errors where old links are now outdated. Can anyone advise what would be the best course of action? Should I hide the whole forum from Google's robots? My only concern with doing this is the loss of hundreds of pages of regularly updated content which I feel is boosting SEO. Help! Thanks guys 🙂

    | gaz3342
    0
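If you do decide to keep the forum out of the crawl, a robots.txt rule is the usual first step (the /forum/ path is a placeholder for wherever the forum actually lives):

```text
User-agent: *
Disallow: /forum/
```

Bear in mind that disallowing stops crawling but not necessarily the indexing of URLs Google already knows about, and it would forfeit exactly the regularly updated content you feel is helping, so it's worth weighing against fixing the duplicate titles and 404s directly.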

  • What is the biggest mistake you made with one of your SEO campaigns? When did it happen? What were the consequences and what did you learn from it?

    | DorotheaKettler
    0

  • Hi everyone, What are everyone's thoughts on Dublin Core metadata? These aren't tags I have come across before, but a client site is using them. After doing some reading I am leaning towards removing them, but would be very interested in the community's thoughts on these. Much thanks,
    Davinia

    | Unity
    0

  • There are two affiliated brick & mortar retail stores moving into e-commerce. For non-marketing reasons separate e-commerce websites are desired. The two brands are based in separate (nearby) cities in the same Canadian province. Although the store name and branding will be different, the content on the site will either be near duplicates or exact duplicates. The more I look into this on Google and SEOmoz QA, the more I am concerned about the SEO implications of this. SEOmoz QA: Multiple cities/regions websites - duplicate content? "So, yes, because you are offering the same services at second location, you are thinking correctly about the need to rewrite all content so it's not a duplicate of site #1." Duplicate content - Webmaster Tools Help "However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic… In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results. ... Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results." Unfortunately, I would say there's very little chance that rewritten content will happen in the foreseeable future. With that said, I'd be greatly appreciative of the concerns and remedies that the SEOmoz community has to offer (even if they're for future use). Thanks in advance.

    | GOODSIR
    0

  • Without knowing it, I created multiple URLs to the same page destinations on my website. My ranking is poor and I need to fix this problem quickly. My web host doesn't understand the problem! How can I use canonical tags? Can somebody help, please?

    | ZoeAlexander
    0
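In case a concrete example helps: rel=canonical is a single line in the `<head>` of each duplicate URL, pointing at the one version of the page you want search engines to treat as authoritative (URLs below are placeholders):

```html
<!-- Placed on every duplicate URL that reaches the same page -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```

Each duplicate points at the preferred URL; the preferred URL itself can carry a self-referencing canonical.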

  • The links from the home page to some internal pages on my site have been coded in the following format by my tech guys: www.abc.com/tools/page.html?hpint_id=xyz If I specify within Google Webmaster Tools that the parameter ?hpint_id should be ignored and that the content for the user does not change, will Google credit me for a link from the home page, or am I losing something here? Many thanks in advance

    | harmit36
    0

  • Hi, I have two websites, say Website A and Website B. Website A is set up for the UK audience and Website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in either country. Website A can't be accessed if you are in the US. Similarly, Website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago, as they don't want to offer promotions etc. in the US and therefore don't want the US audience to be able to purchase items from the UK site. Now the problem is both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. Now I have the following options: 1. Write different product descriptions for the US website, to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as there are several hundred products common to both sites. 2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: Website A's address ends with '.co.uk' and Website B has a different name ending in '.com', so Website A can't be used for the US audience. Also, Website A is older and more authoritative than the new Website B, and pretty popular among the UK audience with the .co.uk address, so Website B can't be used to target the UK audience either. 3. You tell me

    | DevakiPhatak
    2
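One more option to weigh: keep both sites and their shared descriptions, but annotate the paired product pages with hreflang so the two are declared as regional alternates rather than left to look like duplicates. A sketch with placeholder URLs, placed in the `<head>` of both versions of the same product:

```html
<link rel="alternate" hreflang="en-gb" href="http://www.website-a.co.uk/product-x/">
<link rel="alternate" hreflang="en-us" href="http://www.website-b.com/product-x/">
```

The annotations must be reciprocal (each page lists both URLs), and with several hundred common products this would normally be generated by the platform rather than hand-edited.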

  • Hi Mozzers! Let's say our website is clean, professional, and minimalistic. Can we use a "read more" button that will expand the text on the page to increase the amount of content while (unless clicked) not impacting the appearance? I want to make sure I am not violating Google Webmaster's guidelines for "Hidden Text" Thanks!

    | Travis-W
    0
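As a reference point, the usual "read more" implementation keeps the full text in the delivered HTML and only collapses it visually, which is quite different from cloaking (the ID and wording below are illustrative):

```html
<div id="extra-content" style="display: none;">
  <p>The additional on-page copy is present in the HTML from the start…</p>
</div>
<a href="#" onclick="document.getElementById('extra-content').style.display = 'block'; return false;">
  Read more
</a>
```

Because the same markup is served to users and crawlers alike and the text is revealed by a genuine user action, this pattern is generally understood to fall outside the hidden-text guidelines, though content hidden by default may be weighted less.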

  • I wanted to also ask the wider Moz community this question. Our blogs are currently run on Blogger/WordPress using a subdomain strategy -- blog.website.com -- and the blog home page has now gained a PR3. It's been running for 2-3 years. This runs contrary to the best practice of website.com/blog. I'm now considering making the blog internal, but want to get your opinion, as the longer I leave it, the bigger a decision it will be. Do the pros of making the blog internal outweigh the cons of doing so? Pros: the blog benefits from the root domain; fresh content on the site that people can interact with; the root domain benefits from links the content gains; easier to analyse user activity. Cons: loss of PageRank; effort to 301 all URLs and content; CMS altered to allow creation of blog content.

    | RobertChapman
    0
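On the "effort to 301 all URLs" con: if the post slugs stay the same under /blog/, the whole migration can be one host-based rule rather than per-post redirects. A hedged sketch, assuming Apache with mod_rewrite serving the subdomain:

```apache
RewriteEngine On
# Send every blog.website.com URL to the same path under website.com/blog/
RewriteCond %{HTTP_HOST} ^blog\.website\.com$ [NC]
RewriteRule ^(.*)$ http://website.com/blog/$1 [R=301,L]
```

If slugs change during the CMS move, a 1:1 redirect map would be needed instead.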

  • Hi, After publishing a press release, if that press release is in the top position on Google News for the keyword, how will it affect the SERPs for that website?

    | purplar
    0

  • Given the following urls: example.com/warriors/ninjas/ example.com/warriors/ninjas/cid=WRS-NIN01 Is there any difference from an SEO perspective? Aesthetically the 2nd bugs me but that's not a statistical difference. Thank you

    | nymbot
    0

  • Hi, We've recently done a site migration, mapping and 301 redirecting only the site's key pages. However, GWT (Google Webmaster Tools) is picking up a massive number of 404 errors, and there has been some drop in rankings. I want to protect the site from further decline, and hence thought about doing a catch-all 301 -- that is, 301 redirecting the remaining pages found on the old site back to the home page, with the future aim of going through each URL one by one to redirect them to the most relevant page. Two questions: (1) Can I do a catch-all 301, and if so, what is the process and what requirements do I have to give to the developer? (2) How do you reduce the number of increasing 404 errors on a site, despite doing 301 redirects and updating links on external linking sites? Note: the server is Apache and the site is hosted on the WordPress platform. Regards, Vahe

    | Vahe.Arabian
    0
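To answer (1) concretely: yes, a catch-all is possible, and one way to brief the developer is the sketch below, assuming Apache in front of WordPress. Specific 1:1 mappings are listed first so they win; anything else on the old URL pattern falls through to the homepage (the /old-path/ prefix and URLs are placeholders):

```apache
# Specific 1:1 mappings take precedence (listed first)
Redirect 301 /old-path/key-page/ http://www.example.com/new-key-page/
# Catch-all: anything else under the old structure goes to the homepage
RedirectMatch 301 ^/old-path/.* http://www.example.com/
```

As you note, this should only be a stopgap: bulk-redirecting unrelated pages to the homepage passes little relevance, so replacing the catch-all with per-URL redirects to the closest equivalent page remains the proper fix.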

  • Hello! I've noticed when doing press releases that, for some reason, mobile rank is different from desktop rank. Is there a reason for this? They are optimized for SEO, and some rank much better in a mobile Google search. Has anyone else experienced this? Thank you

    | TP_Marketing
    0

  • Hello, My issue is that in WordPress, 404s do not seem to be working properly. An example of this is: sitename.com/category/catname loads the posts in that category, but I can also type sitename.com/category/asdasfaasd/catname and it still goes to the posts in that category and does not 404. I can replace the misc text with anything and it does not 404. My worry is that this can be used to exploit duplicate content. I've looked at a couple of other sites and they do the same. I'm using Yoast as my SEO plugin, and my theme is Elogix from ThemeForest. I've tried disabling all plugins, Cloudflare, and changing the theme, and the same issue exists. If anyone can help it would be extremely appreciated.

    | LukeHutchinson
    0

  • Hi Mozzers, What happens if I have a trail of 301 redirects for the same page? For example: SiteA.com/10 --> SiteA.com/11 --> SiteA.com/13 --> SiteA.com/14. I know I lose a little bit of link juice by 301 redirecting. The question is, would the link juice look like this for the example above: 100% --> 90% --> 81% --> 72.9%? Or just 100% --> 90%? And does this link juice refer to juice from inbound links, or links between internal pages on my site? Thanks!

    | Travis-W
    0
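Google doesn't publish the per-hop figure, but taking the ~10% loss from the question as a working assumption, the first pattern is the commonly understood one: a chain compounds per hop rather than counting as a single hop. A tiny illustration:

```python
def equity_after_hops(hops, per_hop_retention=0.9):
    """Illustrative only: assume each 301 hop retains ~90% of link equity.
    The true figure is not published by Google."""
    return per_hop_retention ** hops

# SiteA.com/10 -> /11 -> /13 -> /14 is three hops:
chain = [round(equity_after_hops(h) * 100, 1) for h in range(4)]
# i.e. the first pattern in the question: [100.0, 90.0, 81.0, 72.9]
```

The practical fix is to flatten the chain: point /10, /11, and /13 each directly at /14 so every path is a single hop. The decay is generally understood to apply to equity flowing through the redirect from external and internal links alike.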

  • My homepage (www.LeatherHideStore.com) will not rank for my keywords in Google, with Google mostly pulling product pages and some categories for SERP results. In contrast, my homepage consistently shows for Yahoo and Bing, with exceptions where a category is a better match for the keyword. In other words, it is working exactly as it should in Yahoo and Bing. After a year of this frustration I just upgraded to a new site on Magento Community and, surprise, the same problem! The SEOmoz analyzer has flagged significant duplicate content issues, which I think is at the heart of my problem. I have asked my developer to address these, but let's just say that customer service is not his forte. I am even starting to doubt he knows what to do, although the site appears to be well done. Given that it is a brand new site and duplicate content in Magento is to be expected (from what I have now read), I am deeply discouraged that my developer did not or could not plan for this, so here I am again! Can anyone give me guidance on what to do? I have read a lot about canonicalization and it seems complicated, especially if you have 1000 duplicate page titles. I have seen that there are some extensions (i.e. Ultimate SEO Suite by aheadWorks) for Magento that claim to be able to solve duplicate content problems, but I am really just grasping at straws and do not have the confidence or skills to implement this on my own. Can anyone please help? Thanks! Hunter

    | leatherhidestore
    0

  • SEOmoz gives me notices about rel canonical issues. How can I resolve them? Can anyone help me -- what is rel canonical, and how can I fix these notices?

    | learningall
    0

  • At the end of November a client site dropped significantly in the rankings. The drop affected almost all the keyphrases we monitor. Historically the homepage has always ranked higher than the sub-pages; however, now it seems Google is no longer ranking the home page, and is instead ranking the sub-pages, just far, far lower down. Any ideas what could cause this?

    | cottamg
    0

  • Webmaster Tools is flagging up duplicate descriptions for the page http://www.musicliveuk.com/live-acts. The page is one page in the WordPress page editor, and the web designer set it up so that I can add new live acts from a separate page editor on the left menu, which feeds into the page 'live-acts' (it says under template 'live-acts-feed'). The problem is, as I add more acts it creates new URLs, e.g. http://www.musicliveuk.com/live-acts/page/2 and http://www.musicliveuk.com/live-acts/page/3, etc. I use the All in One SEO pack, and Webmaster Tools tells me that pages 2/3/4 etc. all have the same description. How can I overcome this? I can't write new descriptions for each page, as the All in One SEO pack will only allow me to enter one for the page 'live-acts'.

    | SamCUK
    0
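A common fix for paginated archives like this is rel next/prev annotations, which declare the pages as one series rather than competing duplicates (URLs taken from the question):

```html
<!-- In the <head> of http://www.musicliveuk.com/live-acts/page/2 -->
<link rel="prev" href="http://www.musicliveuk.com/live-acts/">
<link rel="next" href="http://www.musicliveuk.com/live-acts/page/3">
```

Whether All in One SEO can emit these automatically is an assumption worth checking; failing that, a small theme/template tweak can output them, and duplicate descriptions on deep pages of a declared series matter far less.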

  • Edit: Here's a simplified version of this issue and how I have fixed it thus far. On AmbitionSnowskates.com, there is a video section. There is no content on this page; it 302 redirects to the newest video. If you access ambitionsnowskates.com/video/, you are redirected to ambitionsnowskates.com/video/safari-time/. The original post (OP) was about a site with recurring events. The subpage that the homepage should redirect to cycles over the year. For this reason, I was wondering if I should redirect mysite.com to mysite.com/active-subpage/ or the other way around (and have the content directly on the homepage). I was also wondering how this will affect the result in SERPs. It turns out Google shows the title and description of the destination page, but shows the URL of the original URL (the homepage). Knowing this, I can tailor my meta descriptions to be about both the company and the current event; a mix of the two means I won't have to switch or duplicate meta descriptions between active events. I do appreciate the real solution though: in my opinion there should be unique content on the homepage with an according CTA. I'm trying to push this as the best fix, with redirections being an alternative, albeit more complex, solution. Again, sorry for being so unclear. I wish I had had an example from the beginning. 🙂 I'm leaving this open in case someone wants to chime in. Ben Hey guys, I need a hand on this one 🙂 We have a website with 3 events and we want the homepage to show the upcoming event. Event 1 is in February, Event 2 is in April, Event 3 is in June. These events recur year after year. Currently the homepage shows the content of Event 1 at the root level (site.com/). The other events have a unique URL (site.com/event-2, site.com/event-3). Later in the year, after Event 1 is over, we change the homepage content to Event 2 and move Event 1 to its own URL. In other words...
Current structure today: Event 1: site.com/, Event 2: site.com/event-2, Event 3: site.com/event-3. In March: Event 1: site.com/event-1, Event 2: site.com/, Event 3: site.com/event-3. And so on. I want to make sure each event has its own URL and is properly indexed. Option A: I can redirect the homepage to the right event: site.com -> 302 -> site.com/event-1. If that's the way to go, what will be the SEO impact, i.e. what content will show up in SERPs -- the destination page's content, meta description and title? Option B: I could also keep the current structure (content moved to the root), but temporarily redirect the event's unique URL to the homepage. Today: Event 1: site.com/, with site.com/event-1 -> 302 -> site.com/, Event 2: site.com/event-2, Event 3: site.com/event-3. In March: Event 1: site.com/event-1, Event 2: site.com/, with site.com/event-2 -> 302 -> site.com/, Event 3: site.com/event-3. And so on. Again, if that's the way to go, how will this impact SERPs -- which title and description will I see for the homepage and individual events? If you have other options, I'm all ears! Thanks a lot! (I mean it) -Ben

    | BenoitQuimper
    0

  • I've got a domain which was registered in July 2010 and had a website on it. I believe the domain expired and it was dropped for a couple of months. I snapped it up after discovering it in November 2012. Subsequently, the whois records show the domain was created in Nov 2012. What exactly is the "real" age of this domain from Google's perspective? Or at least, as far as SEO is concerned? Cheers,
    Syed P.S - domain age does indeed warrant some merit in ranking factors!

    | syed002
    0

  • Hi. I manage the site http://physiowinnipeg.com, which has had some interesting yo-yo effects lately, culminating in a 40-50 page drop as of last Friday. The phrase in question: physiotherapy winnipeg Background: The site has ranked on the first page for this keyphrase since about May There was the occasional blip where the homepage would disappear from rankings altogether for a day or two, and then return back to its regular listing This would coincide with a sudden increase in organic places listings (our site would hit the top of the local map listings, but the homepage would vanish) Other pages would still appear for the target phrase, just not the homepage About a month ago, the organic places listing settled, and the homepage permanently vanished Other pages still ranked high, and we commanded many of the listings on the first 10 pages of Google, first page included The homepage would still appear for some other searches We had been affected by the Google Places --> Google+ Local transition, so I was of the opinion that we needed to wait it out a bit to see if it, like the other issues related to the transition, would work itself out This time around, it worked itself out again, just today -- but we are now ranked slightly lower on the first page and our Google local listing has disappeared (again) Our other pages are still (currently) absent from the first few pages, for this keyphrase The main differences I noticed here, aside from the much longer timeframe of our drop, was that other pages disappeared as well, and that the homepage was actually found, between page 38 and 50, depending on the day According to SEO Moz and other tools, the site is doing pretty much everything right. 
I should note, however, that there is a service we use for educational content called Patientsites that loads in the subdomain http://education.physiowinnipeg.com -- and this site has pulled many warnings in SEO Moz (around 10,000, actually, mostly long URL related), as well as a few errors. I'm not sure if this is a part of the problem, but I am considering having the content of the subdomain blocked via robots.txt and Webmaster Tools. Has anyone else experienced anything similar, or have any insight? This weirdness has gone on too long. Thanks! Bobby

    | PinnacleWpg
    0

  • We launched a new site, and we did a 301 redirect for every page. I now have over 5k duplicate meta descriptions and title tags: it shows the old page and the new page as having the same title tag and meta description. This isn't true -- we changed the titles and meta descriptions -- but it still shows up like that. What would cause that?

    | EcommerceSite
    0

  • What are everyone's thoughts on creating several websites for your business, one for each department? For example, say you owned a car dealership. You create a different site for: new cars for sale; used cars for sale; the service department (mechanical repairs); the parts & accessories department; the financing department. Positives: having separate sites for each department would probably make it easier to rank for the specific search terms, since a whole site on one topic, e.g. used cars, would rank over just a page with the same information on a dealership website. Negatives: you would have to maintain 5 sites (link building, social media, analytics, etc.). Since they are all new domains and sites, it will take longer for each site to rank, and Google will see them as small, lower-authority sites, since they are only a few pages each and not larger sites. What are everyone's thoughts on this? Would you create several small sites? Or would you continue working on one big main authority site and continue link earning to the specific department pages, blogging on the topics, etc.? Thanks for any help & opinions!

    | DCochrane
    0

  • We changed our website including urls. We setup 301 redirects for our pages. Some of the pages show up as the old url and some the new url. When does that change?

    | EcommerceSite
    0

  • Wow, I should know the answer to this question. Sitemap.xml files have to be accessible to the bots for indexing: they can't be disallowed in robots.txt, and you can't block the folder at the server level. So how can you allow the bots to crawl these XML files but have them not show up in Google's index when doing a site: command search -- or is that even possible? Hmmm

    | irvingw
    0
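It is possible, because crawling and indexing are controlled separately: an X-Robots-Tag response header applies noindex to non-HTML files, where a meta robots tag can't go, without blocking the fetch. A hedged sketch, assuming Apache with mod_headers enabled and sitemap files named sitemap*.xml:

```apache
<FilesMatch "sitemap.*\.xml$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Bots can still crawl the files (nothing in robots.txt blocks them), but the header asks engines not to show them in the index.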

  • Hey guys, We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me. I am setting up 1:1 301 redirects for the new navigation structure to handle getting incoming links where they need to go, but what I'm wondering is what is the best way to make sure the SERPs are updated quickly without trashing my domain quality, and ensuring my page and domain authority are maintained. The old links won't be anywhere on the new site. We're swapping the DNS record to the new site so the only way for the old URLs to be hit will be incoming links from other sites. I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one. Currently we don't have one (kind of starting from the bottom with SEO) Also, we could use the old URLs for a few weeks on the new site to ensure they all get updated as well. It'd be a bit of work, but may be worth it. I read this article and most of that seems to be covered, but just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us. http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well Am I getting into trouble if I do any of the above, or is this the way to go? PS: I should also add that we are not changing our domain. The site will remain on the same domain. Just with a completely new navigation structure.

    | CodyWheeler
    0

  • Hi, Does anyone know how to implement schema.org markup with YouTube embedded videos? Thanks Carlos

    | Carlos-R
    0
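One common approach is to wrap the embed in schema.org/VideoObject microdata, using the YouTube embed URL as the embedUrl property. A sketch with placeholder values:

```html
<div itemscope itemtype="http://schema.org/VideoObject">
  <meta itemprop="name" content="Video title">
  <meta itemprop="description" content="Short description of the video.">
  <meta itemprop="thumbnailUrl" content="http://example.com/thumb.jpg">
  <meta itemprop="embedUrl" content="http://www.youtube.com/embed/VIDEO_ID">
  <iframe src="http://www.youtube.com/embed/VIDEO_ID" width="560" height="315"></iframe>
</div>
```

A video XML sitemap entry is the other common route; either way, whether Google chooses to show a rich snippet for a third-party-hosted video is at its discretion.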

  • I have found that my site is getting a link from a good site, but my concern is that the link is in an H2 tag in the footer of the front page of the site. Would getting a link from a site wrapped in H2 tags be safe? The anchor is my site's brand name.

    | JohnPeters
    0

  • Hi Mozzers, My site will be migrating to a new domain soon, and I am not sure how to spend my time. Should I be optimizing our content for keywords, improving internal linking, and writing new content - or should I be doing link building for our current domain (or the new one)? Is there a certain ratio that determines rankings which can help me prioritize these to-dos?, such as 70:30 in favor of link-building? Thanks for any help you can offer!

    | Travis-W
    0

  • I have category pages on an e-commerce site that are showing up as duplicate pages. On top of each page are register and login, and when selected they come up as category/login and category/register. I have 3 options to attempt to fix this and was wondering what you think is the best. 1. Use robots.txt to exclude. There are hundreds of categories so it could become large. 2. Use canonical tags. 3. Force Login and Register to go to their own page.

    | EcommerceSite
    0
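On option 1, the file doesn't have to grow with the category count: robots.txt wildcards keep it to a couple of lines (patterns assume the /category-name/login structure described):

```text
User-agent: *
Disallow: /*/login
Disallow: /*/register
```

Note the usual caveat that a disallow prevents crawling but not necessarily indexing of already-known URLs, which is one reason a canonical tag (option 2) or a single shared login/register page (option 3) is often preferred.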

  • Hello, I'm helping my client write a long, comprehensive, best-of-the-web piece of content. It's a boring e-commerce niche, but on the informational side the top 10 competitors for the most-linked-to topic are all big players with huge domain authority. There are not a lot of links in the industry. Should I try to top all the big competitors through better content (somehow), pictures, illustrations, slideshows with audio, and by being more thorough than these very good competitors? Or should I go for something that's less linked to (maybe 1/5 as many people linking to it) but easier? Or both? We're on a short timeline of 3 1/2 months until we need traffic, and our budget is not huge.

    | BobGW
    1

  • To "Guest Blog" or "Ghost Blog"? I've been wondering which would be better given G's "authorship" tracking program. "Onreact.Com" indirectly raised this issue in a recent blog post "Google Authorship Markup Disadvantages Everybody Ignores" as : "Google might dismiss your guest articles. Your great guest blogging campaign on dozens of other blogs might fail because Google will count the links all as one as the same author has written all the posts and linked to himself. So maybe the links won't count at all." Assuming all other things are equal, would you use "Guest Author" with G Authorship attribution (if allowed) or just ghost the article and include an in-text link without attribution to you as the author?

    | JustDucky
    1

  • Ok... So I add tracking parameters to some of my social media campaigns but block those parameters via robots.txt. This helps avoid duplicate content issues (Yes, I do also have correct canonical tags added)... but my question is -- Does this cause me to miss out on any backlink magic coming my way from these articles, posts or links? Example url: www.mysite.com/subject/?tracking-info-goes-here-1234 Canonical tag is: www.mysite.com/subject/ I'm blocking anything with "?tracking-info-goes-here" via robots.txt The url with the tracking info of course IS NOT indexed in Google but IT IS indexed without the tracking parameters. What are your thoughts? Should I nix the robots.txt stuff since I already have the canonical tag in place? Do you think I'm getting the backlink "juice" from all the links with the tracking parameter? What would you do? Why? Are you sure? 🙂

    | AubieJon
    0
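    To make the setup in this post concrete, the two mechanisms in play look roughly like this (parameter name taken from the example URL; the canonical target is as stated):

    ```text
    # robots.txt — blocks crawling of any URL carrying the tracking parameter
    User-agent: *
    Disallow: /*?tracking-info-goes-here

    # canonical tag in the <head> of the page — crawlers can only read this
    # tag on URLs they are actually allowed to fetch
    <link rel="canonical" href="http://www.mysite.com/subject/" />
    ```

    The tension worth noticing: a URL blocked by robots.txt is never fetched, so any canonical tag on it goes unseen by the crawler.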

  • I have two sites that have no duplicate content (yet). One ranks better than the other but has a crappy hyphenated domain name (Domain A), and the other one is the "brand site" with a better domain name (Domain B). I'm creating a blog with technical articles and corresponding videos. I want the videos to refer to the better domain name (Domain B) because I can't see referring people to a hyphenated domain (it would sound horrible). But, the hyphenated domain has a better chance of improving its rankings (long story why). Can I duplicate the content and just use a canonical tag on Domain B to give the credit to Domain A? If I do that, is it done on each post? Or the blog's main page? What I think would happen is any links to Domain B would pass the juice to Domain A. Is that correct? I know canonicals are tricky and I don't want to screw this up, so I'd greatly appreciate some advice from the experienced people on here. Thank you.

    | PhoenixDev
    0
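    On the mechanics asked about here: a cross-domain canonical goes in the `<head>` of each individual post on Domain B, not on the blog's main page. A sketch with placeholder domains and a hypothetical post slug:

    ```html
    <!-- On http://www.domain-b.com/blog/some-post/ -->
    <link rel="canonical" href="http://www.domain-a.com/blog/some-post/" />
    ```

    Each duplicated post gets its own tag pointing at its exact counterpart; a single site-wide canonical would funnel every page to one URL.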

  • Our website is AluminumEyewear.com and we're considering launching a specific version for Australia. Naturally, I want to avoid any dupe content issues, but the content would largely remain the same. I have read through this post and wondered if the options given there are still relevant. I'm currently leaning towards using a sub-domain, i.e. au.aluminumeyewear.com, or should I go for aluminumeyewear.com.au? Will there be dupe content issues if I do that? Confused and hoping for help!

    | smckenzie75
    0
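    Whichever structure is chosen, hreflang annotations are the standard way to tell Google that near-identical pages target different regions, which is the usual answer to the duplicate-content concern. A sketch assuming the subdomain option and a hypothetical /frames/ page:

    ```html
    <!-- Placed in the <head> of both the US and AU versions of the page -->
    <link rel="alternate" hreflang="en-us" href="http://www.aluminumeyewear.com/frames/" />
    <link rel="alternate" hreflang="en-au" href="http://au.aluminumeyewear.com/frames/" />
    ```

    The annotations must be reciprocal: each version lists both itself and its counterpart, on every localized page.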

  • Our client used to have a listing in each city, but after updating the addresses they were forever under review.  Google said that businesses serving customers at their locations can only list their primary office. Back when this client had multiple city listings, all addresses but one were UPS boxes.  If they are to change back to "No, all customers come to the business location," can they once again submit a listing for each city using these addresses? Yes, I realize they are UPS boxes, but they insist on being listed for each city.

    | elcrazyhorse
    0

  • Hi forum, SEOMoz's crawler identifies duplicate content within your own site, which is great. How can I compare my site to another site to see if they share "duplicate content?" Thanks!

    | Travis-W
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.

