
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, the development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site. Let's say we are creating an Airbnb clone, and we want to be found when people search for "apartments new york". Since we have both houses and apartments in every US city, it makes sense for our URLs to at least include those, e.g. clone.com/Apartments/New-York, but users can also filter on price and size. Those filters aren't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be the canonical for all apartment/New York searches. But what should the URL look like for people filtering on a max price of $300 and 100 sqft? clone.com/Apartments/New-York?price=300&size=100, or (we are using Node.js, so no problem) clone.com/Apartments/New-York/Price/300/Size/100? The developers hate URL parameters with a vengeance and think the second version is preferable and more user-readable, and say that as long as we canonicalize everything to clone.com/Apartments/New-York, it won't matter to good old Google. I think URL parameters are the way to go, for two reasons: Google may figure out on its own that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and Webmaster Tools actually lets you tell Google not to worry about a parameter. We have agreed to disagree on this point and to let the wisdom of Moz decide what we ought to do. What do you all think?

    | Peekabo
    0

  • I'm stumped as to why some pages on my website return no data from Google's Structured Data Testing Tool while other pages work fine and return the appropriate data. My home page http://www.parkseo.net returns no data, while many inner pages do. http://www.parkseo.net returns no data; http://www.parkseo.net/citation-submission.html does return data. I have racked my brains trying to figure out why some pages return data and others don't. Any help on this issue would be greatly appreciated. Cheers!
    Gary Downey

    | YMD
    0
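One quick check for the question above is whether any markup is actually present in the HTML served for the home page. For comparison, a minimal JSON-LD block the testing tool can read looks like this (the business details here are placeholders, not taken from the site):

```html
<!-- Illustrative only: a minimal JSON-LD snippet placed in the <head>.
     If the home page's served HTML contains nothing like this (or
     microdata/RDFa equivalents), the tool has nothing to report. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Park SEO",
  "url": "http://www.parkseo.net/"
}
</script>
```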

  • My intuition tells me that posting an article as a guest writer on a site with a PageRank of 1 isn't going to provide any SEO link benefit. It's a fairly new fashion content site. The article would be informative, not a sales pitch. The only link would be to our home page URL in the bio. I've been told that links from low-PageRank sites don't help anymore. But maybe over time the site will achieve higher PR status, and then the link in it would help us? Any thoughts? Thanks! ron

    | yatesandcojewelers
    0

  • Our website has recently been updated, and now it seems that all of our product pages look like this: cdnorigin.companyname.com/category/product. Google is showing these pages in search results rather than companyname.com/category/product. Each product page has a canonical tag that points to the cdnorigin page. Is this best practice? I don't think cdnorigin.companyname.com etc. looks very good in the search results. Is there any reason why my designer would set the canonical tags up this way?

    | Alexogilvie
    0
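A hedged illustration of the usual arrangement, which is the reverse of what the question describes: every copy of a product page (the CDN origin included) points its canonical at the public domain, not at the origin (paths below are the question's placeholders):

```html
<!-- Served identically on companyname.com and on
     cdnorigin.companyname.com, so the public URL is the one
     consolidated in the index: -->
<link rel="canonical" href="http://companyname.com/category/product" />
```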

  • Hi guys, a question from a newcomer to SEO. Summary of the situation: potential customers search for a generic product category ("buy mountain bike") more often than for a brand in that category ("Specialized MTB"), and the latter is searched more often than a specific product from the Specialized brand. Neither the brand pages nor the product pages are ranking well. Would it then be a good idea to have the category pages link only to the brand pages? They could still show the products, but those links wouldn't pass link juice. I'm not even sure that is technically possible, but I wanted to figure out the merit first. I'm hoping this would help the brand pages rank better, as they take in more search volume. Please do feel free to teach me!

    | Peter85
    0

  • Article: http://searchengineland.com/homepage-sliders-are-bad-for-seo-usability-163496 I came across the above article and somewhat agree with the author's summary:
    I find sliders a distraction for B2B users, and overall they offer no SEO benefit. Scenario:
    as a service provider, I have worked over time with many high-profile blue-chip companies. As part of my site redesign, I'm looking to show users my client achievements. My initial thought is to do the following: on the home page, incorporate some high-profile company logos (similar to http://www.semrush.com) with a "more customers" hyperlink to the right of the logo caption. The link will take the user to a dedicated page (www.mydomain.co.uk/customer) showing a comprehensive list of company logos. Questions:
    #1 Is the above practice good or bad?
    #2 Is there a better way to achieve the above? Any other practical advice on user experience, social engagement, website speed, etc. would be much appreciated. Thanks Mark

    | Mark_Ch
    0

  • Ok, I have been doing a lot of work over the past 6 months disavowing low-quality links to our company website from spammy directories, etc. However, my efforts seem to have had a negative rather than positive effect. This has brought me back to reconsidering what we are doing, as we have lost a good amount of traction in the nationwide Google rankings specifically. Consider our company blog - platinumcctv(dot)net - we have used this blog for a long time to inform customers of new products and software developments, and then to provide them links to purchase those components. Last week, I swapped the nearly default WordPress theme for another on a piece of advice. However, someone told me that all of our links should be nofollow, even though it is a company blog, because we have many links coming from this domain and it could be seen as spammy. Potato/potato - but before I start the tedious task of changing every link to nofollow on a whim, I searched a lot and have found no CLEAR substantiation of this. Any ideas? Other recommendations appreciated as well! Platinum-CCTV(dot)com

    | PTCCTV
    0

  • I have a customer who had prior help on his website, and I noticed a 301 redirect in his .htaccess - a rule for duplicate content removal, www.domain.com vs domain.com:
    RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
    RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]
    The result of this rule is that if I type MY-CUSTOMER-SITE.com in the browser, it redirects to www.MY-CUSTOMER-SITE.com. I wonder if this is causing issues in the SERPs. If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as Googlebot should be smart enough to know these aren't two different sites.
    - Can you comment on whether this is a best practice for all domains?
    - I've run a report for backlinks. If my thought is true that some point to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?

    | EnvoyWeb
    0

  • I have a site, let's call it ExcellentFreeWidgets.com. There is a page on the site that is very popular; we'll call the page title "Big Blue Widget." That page is currently #1 for the search "big blue widget." This week, I was able to buy the exact-match domain for that page; we'll call it BigBlueWidget.com. I want to build a site on BigBlueWidget.com to better capitalize on the search "big blue widget," which is huge. The content would not use the same wording at all, but it would cover the same subject. It would probably be a five-page or so website, all about Big Blue Widgets: what they are, where to get them, etc. The sites will not reciprocally link to each other. The new site, BigBlueWidget.com, would link to the existing site, ExcellentFreeWidgets.com. The new site and the current page will compete for position in the SERPs. Here are my questions to you experts: 1. Will Google care at all that the same entity owns both sites, or will they just rank for the term as they normally would? 2. I am not sure whether I'll run AdSense on the new site or not. I will be pointing a link back to my ExcellentFreeWidgets.com site from a button that says "Get an Excellent Widget." But if I do run AdSense on it, does Google AdSense care that the same entity has a site and another site's page competing for the same term, both running AdSense? Note: I do not want to start a new entity for the new site (I'm in CA and LLCs are $800/year), as it's probably not worth all that hassle and money. Thank you so much. I hope that obfuscating the real domain names did not confuse the issue too much.

    | bizzer
    0

  • Hello Ladies and Gentlemen. I 100% agree with redirecting the non-www domain name. After all, we see so many times, especially in Moz, how the two different hostnames accumulate different links, different DA, and of course different PA. So I posed these questions to our IT company: "How would we go about redirecting our non-www domain to the www version?", "Where would we do that?", "We can't do the redirect on our web server because the website is listed as an IP address, not a domain name, so would we do the redirect somewhere at GoDaddy?" (who currently maintains our DNS records). So here is the response from IT: "I would set up a CNAME record in DNS (GoDaddy), such that no matter if you go to the bare domain or the www, you end up in the same place. As for SEO, having a 301 redirect for your bare domain isn't necessary, because both the bare domain and the www are the same domain. 301 is a redirect for 'permanently moved' and is common when you change domain names. Using the bare domain or the www are NOT DIFFERENT DOMAINS, so the 301 would not be accurate, and you'd be telling engines you've moved when you haven't - which may negatively impact your rank." It sounds to me that IT is NOT recommending the redirect. How can this be? Or are we talking about two different things? Will the redirect cause the meltdown the IT company suggests? Or do they not understand SEO?

    | Davenport-Tractor
    0
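For reference on the question above: a CNAME only makes both names resolve; it does not consolidate them at the HTTP level the way a 301 does. The redirect is usually done at the web server rather than in DNS. A common .htaccess sketch (example.com is a placeholder; requires Apache's mod_rewrite):

```apache
# 301-redirect the bare domain to the www version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```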

  • We are a US-based ecommerce company that recently switched hosting to a Canadian-owned company. I was told we would have a US-based IP address, but noticed yesterday that the MOZ bar is listing my website, 1800doorbell.com, as a Canadian company. I've researched this online, and what's typically stated is that your IP location needs to be in the geographic area you serve. When I brought this up with my host, they stated: "The location being reported by many of these tools will be the one from the WHOIS. Since our corporation is registered in Canada, it will return a matching result. You can verify the location of the address by issuing a traceroute and examining the location codes at the end of the traceroute. For example, on: 96.125.180.207" So now I am really confused. What matters to me is how the search engines see my IP address. Will/do they see it as a US IP address? Below is the output I received back from DNSstuff, and thanks for any help:
    | ASN | 12179 |
    | Name | INTERNAP-2BLK |
    | Description | Internap Network Services Corporation |
    | # Peers | 11 |
    | # IPv4 Origin Ranges | 32 |
    | # IPv6 Origin Ranges | 2 |
    | Registrar | ARIN |
    | Allocation date | Apr 13, 1999 |
    | Country Code | US |
    | Reverse | unknown.static.dal01.cologlobal.com. |
    | Reverse-verified | No |
    | Origin AS | Internap Network S... |
    | Country Code | CA |
    | Country | Canada |
    | Region | North America |
    | Population | 31592805 |
    | Top-level Domain | CA |
    | IPv4 Ranges | 5944 |
    | IPv6 Ranges | 336 |
    | Currency | Canadian Dollar |
    | Currency Code | CAD |
    | IP Range - Start | 96.125.176.0 |
    | IP Range - End | 96.125.191.255 |
    | Registrar | ARIN |
    | Allocation date | May 10, 2011 |

    | jake372
    0

  • The crawl report for a site indicates the existence of both www and non-www content, which I am aware is duplicate. However, only the www pages are indexed**, which is throwing me off. There are no 'noindex' tags on the non-www pages, nothing in robots.txt, and I can't find a sitemap. I believe a 301 redirect from the non-www pages is what is in order. Is this accurate? I believe the site is built with ASP.NET on IIS, as the pages end in .asp (not very familiar to me). There are multiple versions of the homepage, including 'index.html' and 'default.asp', and meta refresh tags are being used to point to 'default.asp'. What has been done: 1. I set the preferred domain to 'www' in Google Webmaster Tools, as most links already point to www. 2. The WordPress blog, which sits in a /blog subdirectory, has been set with rel="canonical" pointing to the www version. What I have asked the programmer to do: 1. Add 301 redirects from the non-www pages to the www pages. 2. Set all versions of the homepage to redirect to www.site.org using 301 redirects rather than meta refresh tags. Have all bases been covered correctly? One more concern: I notice the canonical tags in the source code of the blog use a trailing slash - will this create a problem of inconsistency? (And why is rel="canonical" the standard for WordPress SEO plugins, while 301 redirects are preferred for SEO?) Thanks a million! **To clarify regarding the indexation of non-www pages: a search for 'site:site.org -inurl:www' returns only 7 pages without www, all of which are blog pages without content (code 200, not 404 - maybe deleted or moved - which is perhaps another 301 redirect issue).

    | kimmiedawn
    0
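Since the site above runs on IIS, the non-www to www 301 is typically done with the URL Rewrite module rather than .htaccess. A sketch only ("site.org" is the question's placeholder; the block goes inside `<system.webServer>` in web.config and assumes the URL Rewrite module is installed):

```xml
<rewrite>
  <rules>
    <!-- Permanently redirect any host matching the bare domain
         to the www hostname, preserving the requested path. -->
    <rule name="Non-www to www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^site\.org$" />
      </conditions>
      <action type="Redirect" url="http://www.site.org/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```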

  • Hi, we changed our domain from coedmagazine.com to coed.com in April '13. We applied 301 redirects on all pages and submitted a 'change of address' to Google, but we still see site:coedmagazine.com fetching 130K results on Google, whereas site:coed.com fetches 40K results. Can anybody here throw some light on what might be going wrong? [Site runs on WordPress, hosted with WordPress as well.] Thank you

    | COEDMediaGroup
    0

  • Hi, I have a client who has a solid, high-ranking content-based site (site A). They have now created an ecommerce site in addition (site B). To give site B a boost in search engine visibility at launch, they now wish to redirect approximately 90% of site A's pages to site B. What would be the implications of this? Apart from customers being automatically redirected away from the page they thought they were landing on, how would Google now view site A? What are your thoughts on their idea? I am trying to talk them out of it, as I think it's a poor one.

    | Webrevolve
    0

  • Hello everyone, and thanks in advance for your time. I have a good understanding of SEO, backlinks, etc., but nowhere near professional! A good friend of mine has an online store built with the OpenCart ecommerce platform. He would like to have a category view shown when his company name is searched on Google. Does anyone have any idea how this can be achieved?

    | superofelia
    0

  • One of my sites that has always ranked on the 1st page for its main keyphrases is now between pages 4 and 10 for them; I'm guessing it's been hit by Penguin 2.1. Can anyone offer advice? Here's the link profile: http://www.opensiteexplorer.org/links?site=www.hgoodwin.com Any advice greatly appreciated. UPDATE - Open Site Explorer only shows 95 links. I've checked my links in Google Webmaster Tools, and there are a lot more - 717! The vast majority I don't recognise, and they look dodgy. How could that happen, and what's the best course of action - is disavow the way to go? I don't even want to click some of those links. Best Regards, Stephen

    | stephenshone
    0

  • Note: all product pages are set to INDEX, FOLLOW. Right now this is what happens with the deleted product pages: 1. When a product is removed from the new datafeed, the page stays online showing similar products for 3 months. These product pages are removed from the category pages but not from the sitemap! 2. Pages receiving more than 3 hits after the first 3 months keep existing and stay in the sitemaps; these pages are not shown in the categories. 3. Pages from deleted datafeeds that receive 2 hits or fewer get a 301 redirect to the parent category for another 3 months. 4. After the last 3 months, all 301 redirects become a customized 404 page with similar products. Any suggestions or comments about this structure? 🙂 Issues to think about:
    - The number of 404 pages Google is warning about in GWT
    - Right now all product pages are indexed
    - Using as much value as possible, in the right way, from all pages
    - Usability for the visitor Extra info about the near future: because of the duplicate content issue with datafeeds, we are going to put all product pages on NOINDEX, FOLLOW and focus only on category and subcategory pages.

    | Zanox
    0
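For reference, the NOINDEX, FOLLOW switch planned in the question above is a single tag in each product page's `<head>`:

```html
<!-- Keeps the page out of the index while still letting crawlers
     follow its links to the category and subcategory pages: -->
<meta name="robots" content="noindex, follow" />
```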

  • Or do we have to do this step by step, i.e.:
    13,000 pages on noindex
    13,000 pages on noindex
    13,000 pages on noindex
    13,000 pages on noindex
    13,000 pages on noindex Together that makes 65,000 pages.

    | Zanox
    0

  • Hello experts! I'm going through Google Webmaster Tools > HTML Improvements looking for pages with duplicate meta descriptions/titles that I can fix, and I noticed about 60 odd-looking page titles with duplicate meta descriptions, which are also noted as: "301 Moved Permanently. The document has moved here. Apache Server at sports". When I click the link to see the page names, all of them are pages we never created; they are all sports-blog related. Here are a few examples: http://www.titanium-jewelry.com/justin-tuck-blog.html http://www.titanium-jewelry.com/unlimited-potential-project-blog.html http://www.titanium-jewelry.com/left-handed-baseball-glove-blog.html http://www.titanium-jewelry.com/adjustable-basketball-hoops-blog.html 1) How did they get on our site? Is this some sort of malicious attack? Most of them are sports-related, blog-looking names; I just don't know how these pages could have been created. 2) Is this hurting us with Google? 3) Can you tell when the pages were created? Thanks, ron

    | yatesandcojewelers
    0

  • If Google has already discounted the value of the links, and my rankings dropped because these links passed value in the past and now they don't, is there any reason to remove them? If I do remove them, is there a chance of "recovery", or should I just move forward with my 8-month-old blogging/content-marketing campaign?

    | Beastrip
    0

  • I'm trying to determine the best way to handle my mobile commerce site. I have a desktop version and a mobile version using a 3rd-party product called CS-Cart. Let's say I have a product page. The URLs are... mobile:
    store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857 desktop:
    store.domain.com/two-toned-tee.html I've been trying to find information on how to handle mobile sites with different URLs with regard to duplicate content. However, most of the results assume that the different URL means m.domain.com, rather than the same subdomain with a different address. I am leaning towards using a canonical URL, if possible, on the mobile store pages. I see quite a few people suggesting not to do this, but again, I believe it's because they assume we are just talking about m.domain.com vs www.domain.com. Any additional thoughts on this would be great!

    | grayloon
    0
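For separate mobile and desktop URLs, Google's documented annotation pair works regardless of whether the mobile version lives on m.domain.com or, as in the question above, on the same subdomain. A sketch using the question's own URLs:

```html
<!-- On the desktop page (store.domain.com/two-toned-tee.html),
     point mobile user agents at the mobile URL: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://store.domain.com/index.php?dispatch=categories.catalog#products.view&amp;product_id=857" />

<!-- On the mobile page, canonical back to the desktop URL: -->
<link rel="canonical" href="http://store.domain.com/two-toned-tee.html" />
```

This tells search engines the two URLs are one document with two presentations, rather than duplicates.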

  • Hi, we are searching for a new SEO provider for www.compoundsecurity.co.uk, and I notice that some SEO providers are now billing against results rather than days spent doing the work. Considering the high prices and the lack of work done for those fees by our current provider, this is of interest to me. Does anyone have experience of working this way, and/or any advice, please? Thank you

    | DaddySmurf
    0

  • Let's say I have two duplicate pages, A and B. However, A has 5 external links and B has 3 different external links. If I add the rel canonical tag to B, so that A is the "master page", do I also lose whatever link juice was going to B from those 3 external links?

    | KempRugeLawGroup
    0

  • I want to orphan a home page on a site that I own so that the start page becomes site.com/home (or whatever) as opposed to site.com/. I need to accomplish this without associating the former with the latter...meaning no 301. Since this will not be a temporary move, 302 does not seem to work either. And even if I could use it, I don't want to credit / with anything from /home. Is there any way to default the Apache handler to /home without rewriting the URL? Or is there any other solution? The bottom line is, at the end of the day, I need Google to forget about / and anything associated with it, without interrupting the user experience when they request /. Thanks in advance.

    | NTGproducts
    0

  • We operate 2 ecommerce sites. The About Us page of our main site links to the homepage of our second site. It's been this way since the second site launched about 5 years ago. The sites sell completely different products and aren't related beyond both being owned by us. In Webmaster Tools for site 2, it's picking up ~4,100 links to the home page from site 1. But we only link to the home page once in the entire site, and that's from the About Us page. I've used Screaming Frog, IT has looked at the source, JavaScript, etc., and we're stumped. WMT doesn't seem to have a function to show you on which pages of a domain it finds the links, and we're not seeing anything by checking the site itself. Does anyone have experience with a situation like this? Does anyone know an easy way to find exactly where Google sees these links coming from?

    | Kingof5
    0

  • Hi, I'm altering title tags on a well-established site (many of which are duplicates) and was wondering whether there's a risk associated with adjusting them all in one go. Should I just make gradual changes instead, in case I trip any flags?

    | McTaggart
    0

  • Hi, we have a website, www.advanced-tuning.co.uk, which has been suffering since Penguin 2 in terms of SERP drops and indexing of pages. There are several issues which I believe are impacting rankings and indexing/pages being removed from the index. 1) Unnatural links. I've been through Cemper: 2% of backlinks are toxic and 71% suspicious. Some automated link building was undertaken by previous SEOers, which resulted in a lot of very poor-quality backlinks (.pl forum member links, etc.). I put together a link disavow doc and have now seen the Average Link Detox Risk drop from high to moderate. There are still a few dodgy links, but I'm working my way through them. 2) "Thin" content - the site has a lot of auto-generated manufacturer/model web pages, e.g. http://advanced-tuning.co.uk/model/chevrolet-captiva-2-0-d-vcdi-150/ http://advanced-tuning.co.uk/model/bmw-116i-115/ These pages are internally linked to each other. In addition, there is a series of geographically targeted web pages which, to be honest, don't seem to have been hit (yet), e.g. http://advanced-tuning.co.uk/location/engine-remapping-huddersfield/ My question is, should I: a) look to remove these manufacturer/model pages completely, b) invest time in generating suitable content for the service/location pages, or c) remove both types of content and concentrate on creating suitable content and links for the top-level manufacturer web pages? Also, if I do remove the manufacturer/model pages, is it worth 301'ing the ones that are still indexed? Thanks in advance, Ade

    | Door4seo
    0

  • I've noticed on my Moz report an alert about having too many links on my RSS page: http://disneyticketsfree.com/rss/news-updates.html Is using Google's pagination the way to go? http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html If you look at the entries, they are all about Orlando travel-related topics. Thanks to the community in advance.

    | touristips
    0
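If the page above were split into a paginated series, the annotations from the linked Google article would look like this on, say, page 2 (the `?page=` parameter is an assumption for illustration; the site may paginate differently):

```html
<!-- In the <head> of page 2 of the series, declaring its neighbours: -->
<link rel="prev" href="http://disneyticketsfree.com/rss/news-updates.html?page=1" />
<link rel="next" href="http://disneyticketsfree.com/rss/news-updates.html?page=3" />
```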

  • Hello guys, can WordPress plugins like WPLeadMagnet and PopUp Domination damage SEO? http://wpleadmagnet.com/ http://www.popupdomination.com/ I don't know what Google thinks about using, for example, wpleadmagnet.com to catch exit and bounce traffic. What do you guys think? If I use wpleadmagnet.com, can I expect to maintain my good positions in the SERPs or not?

    | EestiKonto
    0

  • I have been carrying out on-page optimisation only for a client, www.shade7.co.nz. After three months or so I had been getting some great results, improving to the top three positions for at least 30 of the 45 targeted keywords. A couple more tweaks and I would have been a very happy camper. Disaster overnight! Rankings CRASHED! Unbeknown to me, the client decided a month or so back to link just about every product/link on a micro site he owns (www.shademakers.com/), plus one other site he owns. Open Site Explorer, I think, discovered over 350 backlinks (follow) from these sites! As these are sites he owns, targeting the same keywords, I presume this falls into the EVIL bucket of SEO. A two-part question: do you believe I am correct that this is the reason for the rankings crash, and what would be the best way to resolve it? A server-side 301 redirect for the micro site? Deleting the micro site (a drastic measure)? Removing all the links other than maybe one on the contact page saying "visit our other site shade7"? Other options? Neither the client nor I have received any bad-link emails from Google.

    | Moving-Web-SEO-Auckland
    0

  • We have an ecommerce website with 700 pages. Due to the implementation of filters, we are seeing up to 11,000 pages being indexed where the filter tag is appended to the URL. This is causing duplicate content issues across the site. We tried adding "nofollow" to all the filters, and we have also tried adding canonical tags, which seem to be ignored. So how can we fix this? We are now toying with 2 other ideas: adding "noindex" to all filtered pages, or making the filters uncrawlable using JavaScript. Has anyone else encountered this issue? If so, what did you do to combat it, and was it successful?

    | Silkstream
    0
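A third option for the faceted-navigation problem above is to keep crawlers out of the filtered URLs at the robots.txt level. A hedged sketch, assuming the filter is appended as a query parameter named "filter" (the actual parameter name on the site may differ):

```
User-agent: *
Disallow: /*?filter=
```

One caveat worth weighing: robots.txt blocks crawling but does not remove already-indexed URLs, and a meta "noindex" only works if the page remains crawlable, so the two approaches shouldn't be combined on the same URLs.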

  • Hello... I have to tell you that I feel like a newbie (I am, but now I really feel like it)... We were working with a client until January this year; they kept going on their own until September, when they contacted us again. Someone on the team that handled things while we were gone updated the robots.txt file to Disallow everything... for maybe 3 weeks before we were back in. Additionally, they were working on a different subdomain - the new version of the site - and of course they didn't block the robots on that one. So now the whole site has been duplicated, even its content; the exact same pages exist on the subdomain, which was public during the same period the main site was blocked. We came in, changed the robots.txt file on both servers, resent all the sitemaps, submitted our URL on Google+... everything the book says... but the site isn't getting indexed. It's been 5 weeks now and no response whatsoever. We were highly positioned for several important keywords, and now that's gone. I know you guys can help; any advice will be highly appreciated. Thanks, Dan

    | daniel.alvarez
    0

  • So the changes in the Google algo over the last 2 months have ruined most real estate-oriented SERPs. I used to rank very well for all of my chosen local real estate keywords. Google now pretty much serves up Zillow, Realtor.com, Trulia, Remax (corporate), Century21 (corporate), ERA (corporate), and Exit (corporate) for every high-traffic real estate SERP. It's pathetic - literally useless to searchers. Zillow, Trulia, and Realtor.com have always been contenders; now they just dominate. They were always on the front page, and now they are top three in every high-volume real estate SERP nationwide. The truly disheartening thing about the update is that it's putting franchise websites in the top rankings that in many cases don't even have a local feed of real estate, so users can't even search properties, yet Google's brand preference puts them above all the local franchise offices. So if you are a Remax office, Remax's official site is beating you on your local real estate keywords, even though the corporate site doesn't relate at all to the local search. I've described the issue. I could use the next five sentences to cry about my lack of faith in G anymore, but I won't. How does one compete with brands like Zillow for local keywords? If anyone has a local realtor client that competes against Zillow, Trulia, and Realtor.com, I am willing to pay you for a consultation.

    | joseph1179
    0

  • Hello, this is the first time I've asked a question here, but I would really appreciate the advice of the community - thank you, thank you! Scenario: internal linking points to two different versions of a URL, one with brackets [] and the other with the brackets encoded as %5B%5D.
    Version 1: http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all
    Version 2: http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all Question: will search engines view these as duplicate content? Technically there is a difference in characters, but only because one version encodes the brackets and the other does not (see: http://www.w3schools.com/tags/ref_urlencode.asp). We are asking the developer to encode ALL URLs because this seems cleaner, but they are telling us that Google will see zero difference. We aren't sure if this is true, since engines can get so hung up on even a single character of difference. We don't want to unnecessarily fracture the internal link structure of the site, so again - any feedback is welcome, thank you. 🙂

    | mirabile
    0
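The two spellings in the question above are byte-for-byte different URLs, but %5B%5D is simply the percent-encoding of [], so they decode to the same query string. A quick Node check of that equivalence:

```javascript
// The two spellings of the same query string from the question:
const raw     = 'hello[]=all&howdy[]=all&ciao[]=all';
const encoded = 'hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all';

// Decoding the encoded form recovers the bracketed form exactly:
console.log(decodeURIComponent(encoded) === raw); // true

// And standard encoding of a bare bracket pair yields %5B%5D:
console.log(encodeURIComponent('[]')); // '%5B%5D'
```

Since crawlers may still treat distinct byte strings as distinct URLs before any normalization, the safe play is the one the question leans toward: pick one spelling, use it consistently in internal links, and add a canonical tag as a backstop.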

  • In local SEO practices, is it best to geotag all images or only specific ones? For example, if we have images of our retail store on our G+ page (or on our About Us page) it seems like common sense to geotag those images. However, if you're a local photographer do you want to geotag all of your images or only images shot in locations where you'd like to rank?

    | AWCthreads
    0

  • We have a client who has 3 separate websites targeting the US, Australia, and the UK, each with the relevant ccTLD: .com, .com.au, and .co.uk. Our client wants to use the Magento multi-site function, which would combine all the stores (which carry the exact same products) and merge them into one through Magento. Will this affect his Domain Authority? Or would the sites be treated individually when receiving link value, trust, and authority? There doesn't seem to be a lot of information out there about this - can anyone help? Thanks, Matt

    | HigherthanSEO
    0

  • Hiya Mozzers. I often work for hotels. A common scenario is that the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites - and with the inventory go duplicate page descriptions sent to these "partner" websites. I was just checking duplication on one room description: 20 copies of duplicate descriptions for that page alone. There are 200 rooms, so I'm probably looking at 4,000 instances of duplicate content that need rewriting to prevent duplicate-content penalties, which will cost a huge amount of money. Is there any other solution? Perhaps asking the booking sites to block the relevant pages from search engines?

    | McTaggart
    0

  • Each store would have its own subfolder (in my mind - this hasn't actually happened yet 😉) on the main head-office domain, i.e. maindomain.com/localstore1, maindomain.com/localstore2, etc. I am happy that this is the best structure for SEO purposes. I like the local SEO advantages of it, as each store can have its own NAP and show its own inventory. However, I am worried that each store having its own ecommerce site will lead to duplicate content issues. So I am having a rabid debate with myself as to whether each store should:
    a) have its own ecommerce site, i.e. maindomain.com/localstore1/ecommercestore
    b) have its own ecommerce site, i.e. maindomain.com/localstore1/ecommercestore, with each product and category page having a canonical link to the corresponding page on the main ecommerce site, i.e. maindomain.com/ecommercestore
    c) just have one ecommerce site with local stock shown, e.g. maindomain.com/ecommercestore/productpage shows the inventory in a line (below the price or suchlike): "localstore1 (3 items) localstore2 (0 items)"
    d) just chill - inventory stock-outs happen, so don't worry about showing local stock. And it's not good to have internal rabid debates, so I'd like to ask the wider Moz community: for bricks-and-mortar stores (branches or franchises), how would you set up ecommerce stores? Thanks.

    | BruceMcG
    0
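If option (b) were chosen, each local store's product page would carry a canonical tag pointing at the matching page on the main store. A minimal sketch using the hypothetical URLs from the question:

```html
<!-- On maindomain.com/localstore1/ecommercestore/some-product: -->
<link rel="canonical" href="https://maindomain.com/ecommercestore/some-product" />
```

This consolidates ranking signals on the main store's page, but the trade-off is that the local pages themselves will then rarely rank in their own right.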

  • So let's say we have a site that has 0 PageRank (it's kind of new), with few incoming links and nothing significant compared to the other sites. Now, from what I understand, link juice flows throughout the site. This site is a news site that writes sports previews, predictions, and whatnot. After a while, a game from 2 months ago gets 0 hits and 0 search queries; nobody cares. Wouldn't it make sense to take that type of expired content and 301 it to a different page? That way the more relevant content gets the juice, thus giving it a better ranking. Just wondering what everybody's thoughts are on this link juice thing, and what am I missing?

    | ravashjalil
    0
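If the expired articles were redirected, a server-side 301 is the usual mechanism; a sketch in Apache .htaccess syntax, with hypothetical paths:

```apache
# Permanently redirect an expired game preview to a longer-lived hub page
Redirect 301 /previews/2013-05-12-game-recap /teams/home-team
```

One caveat: a 301 removes the old article from the index entirely and is generally understood to pass most, though perhaps not all, of its link equity, so it is usually reserved for expired pages that actually have inbound links.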

  • Situation: My company has 8 subsidiaries. They each have their own niche (IT, Electrical, Roofing, etc...). We also have offices in multiple countries (If that's even a factor). Questions: 1. Should I establish a web presence for each one? (www.SubsidiaryOne.com)  I would then link to these sites from www.ParentCompany.com. The other options are to do something like www.ParentCompany.com/SubsidiaryOne or SubsidiaryOne.ParentCompany.com. We are trying to build the brand of the parent company so I figured that housing everything inside of the parent company domain would help me meet my goal. Each company will have its own unique content, products, blogs, etc... 2. Should each subsidiary have its own social media presence (Its own Google+, Twitter, FB, etc...) or should I house them all under the umbrella of the parent? Thanks, Alex

    | MeasureEverything
    0

  • I have had a Google manual action (Unnatural links to your site; affects: all) that was spurred on by a PRWeb press release where publishers took it upon themselves to remove the embedded "nofollow" tags on links. I have been spending the past few weeks cleaning things up and have submitted a second pass at a reconsideration request. In the meantime, I have been creating new content, boosting social activity, guest blogging and working with other publishers to generate more natural inbound links. My question is this: knowing that this manual action affects "all," are the new links that I am building being negatively tainted as well? When the penalty is lifted, will they regain their strength? Is there any hope of my rankings improving while the penalty is in effect?

    | barberm
    1

  • So I've read the posts here: http://moz.com/community/q/subdomain-blog-vs-subfolder-blog-in-2013 and many others, the Matt Cutts video, etc. Does anyone have direct experience showing that it's still best practice to use the subfolder? (Hopefully a Moz employee can chime in?) I have a client looking to use HubSpot. They are preaching the Matt Cutts video. I'm in charge of SEO / marketing and am at odds with them now. I'd like to present the client with more info than "in my experience I've seen subdirectories work in the past." Any help? Articles? etc.?

    | no6thgear
    0

  • We have implemented Schema markup for Reviews on product pages.  All these pages check out in the Markup Checker Tool in Webmaster Tools, but out in the wild only about 50% of them are actually showing the Review markup in SERPs. Example of a page showing Review markup successfully: http://www.cloud9living.com/los-angeles/drive-a-stock-car And example of a page not showing markup in the SERPs: http://www.cloud9living.com/las-vegas/race-a-ferrari Any ideas on why some SERPs show markup and others do not? Thanks!

    | GManSEO
    0
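For reference, a minimal microdata pattern for review stars that validates in the testing tool looks roughly like this (the names and numbers are placeholders, not the markup actually used on the pages above):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Drive a Stock Car</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.8</span> / 5
    based on <span itemprop="reviewCount">129</span> reviews
  </div>
</div>
```

Valid markup is necessary but not sufficient for rich snippets: Google decides per page and per query whether to display them, so identical markup can show stars on one URL and not another.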

  • Basically, my current web developer is not providing me with what a modern website needs to fully utilize online marketing and SEO, in terms of blogging, social media widgets, e-commerce, and so on. Because of this I have thought of moving to a wordpress.org website run and built by myself. Is this a good idea? What is the best way to migrate and preserve existing authority (redirects etc.)? Are there any potential risks or problems that I could encounter that aren't immediately obvious? Many thanks! Tom

    | CoGri
    0

  • My site is very new (~1 year old), but due to good PR we have gotten some decent links and are already ranking for a key term. This may be why someone decided to start a negative SEO attack on us. We had fewer than 200 linking domains up until 2 weeks ago, but since then we have been getting 100+ new domains/day with anchor texts that are either targeted to that key term or are from porn websites. I've gone through the links to get ready to submit a disavow... but should I do it? My rankings/site traffic have not been affected yet. Reasons for my hesitation: 1. Google always warns against using the disavow tool, and says "you shouldn't have to use it if you are a normal website" (sensing 'guilty-until-proven'). 2. Some say Google is only trying to get the data to see if there are any patterns within the linking sites. I don't want the site owners to get hurt, since the villain is someone else using XRumer to put spammy comments on their sites. What would you do?

    | ALLee
    0
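Should the poster decide to file it, the disavow file itself is just a plain-text list; a sketch with made-up domains:

```text
# disavow.txt: one directive per line; comments start with "#"
# Disavow every link from an entire domain:
domain:spammy-links-example.com
# Or disavow a single page:
http://innocent-blog-example.com/forum/spam-thread.html
```

The domain: form avoids having to enumerate every URL, and Google has indicated the file is used to discount links to the disavowing site, not to penalize the sites listed in it.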

  • Hello everyone. I have searched these forums for an answer to my concerns, and although I found many discussions and questions about whether to apply "nofollow" to internal links, I couldn't find an answer specific to my particular scenarios. Here is my first scenario: I have an e-commerce site selling digital sheet music, and on my category pages our products are typically shown in the following format:
    PRODUCT TITLE (link that takes you to the product page)
    Short description text
    "more info" (link that takes you to the same product page again)
    As you may notice, the "more info" link points to the very same page as the PRODUCT TITLE link. So, my question is: is there any benefit to adding "nofollow" to the "more info" link to tell SEs to ignore it? Or should I leave it the way it is and let the SEs figure it out? My biggest concern with leaving the "nofollow" out is that the generic, repetitive "more info" anchor text could dilute or compete with the keyword content of the PRODUCT TITLE anchor text... but maybe that doesn't really matter! Here is a typical category page from my site: http://www.virtualsheetmusic.com/downloads/Indici/Guitar.html My second scenario: on our product pages, we have several different links that all take you to the very same "preview page" of the product we sell. Each link has a different anchor text, and some of the links are just images, all pointing to the same page. Here are the anchor texts or ALT texts of these links:
    "Download Free Sample" (text link)
    "Cover of the [product title]" (ALT image text)
    "Look inside this title" (ALT image text)
    "[product title] PDF file" (ALT image text)
    "This item contains one high quality PDF sheet music file ready to download and print." (ALT image text)
    "PDF" (text link)
    "[product title] PDF file" (ALT image text)
    So, I have 7 links on the same product page taking the user to the same "product preview page", which is, by the way, canonicalized to the "main" product page we are talking about. Here is an example of a product page on my site: http://www.virtualsheetmusic.com/score/Moonlight.html My instinct is to tell SEs to take into account just the links with the "[product title] PDF file" anchor text, and then add a "nofollow" to the other links... but might that hurt in some way? Is it irrelevant? Does it matter? How should I proceed? Just ignore this issue and let the SEs figure it out? Any thoughts are very welcome! Thank you in advance.

    | fablau
    0
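Mechanically, the nofollow in the first scenario would be a single attribute; a sketch using a hypothetical product title:

```html
<!-- Primary, keyword-rich link: left followed -->
<a href="/score/Moonlight.html">Moonlight Sonata (Piano Solo)</a>
<!-- Secondary link to the same page, hypothetically nofollowed -->
<a href="/score/Moonlight.html" rel="nofollow">more info</a>
```

Worth noting: since Google's 2009 change to how nofollow is handled, the PageRank assigned to a nofollowed internal link simply evaporates rather than flowing to the remaining links, so this kind of sculpting generally offers no benefit. For anchor-text purposes, Google is also widely believed to count primarily the first link to a given URL on a page, which would favor the PRODUCT TITLE link anyway.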

  • I just want to ask if it is necessary to run 302 redirects first before redirecting old URLs to new URLs permanently. I heard that we should run temporary redirects first so we can check afterwards, and to avoid passing the link juice prematurely, but I want to hear thoughts from experts. Do I need to test 302s for old pages that are still live, or should we redirect the old URLs once the pages have already been removed from the site?

    | esiow2013
    0
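Mechanically, the two redirect types differ only by status code; a sketch in Apache .htaccess syntax with hypothetical paths:

```apache
# Step 1 (optional): a temporary redirect while verifying the URL mapping;
# easy to undo, and signals nothing permanent to search engines.
# Redirect 302 /old-page.html /new-page.html

# Step 2: once verified, switch to a permanent redirect,
# which is the version that consolidates link equity on the new URL.
Redirect 301 /old-page.html /new-page.html
```

A brief 302 testing phase is harmless, but there is no requirement to run one; if the old pages are being removed anyway, going straight to 301s is the common practice.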

  • We have a couple of pages on our site that we want indexed; however, we don't want the links on those pages to be followed. For example: http://www.printez.com/animal-personal-checks.html. We have added in our code: . Bing Webmaster Tools is telling us the following: "The page uses a meta robots tag. Review the value of the tag to see if you are not unintentionally blocking the page from being indexed (NOINDEX)." The question is: is the page using the right code as of now, or do we need to make changes so that the page is indexed but the links on it are not followed? Please advise, Morris

    | PrintEZ
    0
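Since the forum stripped the markup out of the question, here is a hypothetical reconstruction of the tag that matches the stated goal (index the page, but do not follow its links):

```html
<!-- In the page's <head>; "index" may also be omitted, as it is the default -->
<meta name="robots" content="index, nofollow">
```

Bing's message reads like a generic notice that a robots meta tag was detected; as long as the content attribute does not include "noindex", the page remains indexable.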

  • I have a situation where there are 12 stores in separate suburbs across two cities. Currently the chain has one eCommerce website. So I could keep the one website, with all the attendant link building benefits of a single domain, and keep a separate webpage for each store with address details to assist with some local SEO. But (1) each store has slightly different inventory, and (2) I would like to garner the (local) SEO benefits of appearing in a searcher's suburb. So I'm wondering if I should go down the subfolder route, with each store having its own eCommerce store and blog, e.g. example.com/suburb? This is sort of what Apple does (albeit with countries) and is used as a best practice for international SEO (according to a Moz seminar I watched a while back). Or I could go down the separate eCommerce domain track? However, I feel that is too much effort for not much extra return. Any thoughts? Thanks, Bruce.

    | BruceMcG
    0

  • I've recently taken over a client who uses the Magento platform, and there was definitely a duplicate content issue with his homepage. It redirected www to non-www; however, the canonical tag was set up wrong, pointing to the www version. When I looked at OSE for both versions, the non-www has only 7 linking domains and a page authority of 32, while the www version has 24 linking domains and a page authority of 39. As the domain is fairly new, I decided to redirect non-www to www and keep the canonical the same (I changed the internal linking structure etc.). When I run both URLs through this tool: http://www.ragepank.com/redirect-... it's returning a whole bunch of 302s rather than 301 redirects. What's the deal with that? Is that a Magento setting that I can fix, or something a little harder? I'm not sure if it's proper etiquette to post the URL of a client, so if that would help and is OK, please let me know. Thanks

    | bradkrussell
    1
