Posts made by LesleyPaone
-
RE: "No Index, No Follow" or "No Index, Follow" for URLs with Thin Content?
I personally would follow them. There is no issue in having a page with thin content followed; it will not hurt anything.
-
RE: How does the grader tool treat keyword "stuffing" in ecommerce
Could you link the site? I do SEO specifically for e-commerce sites.
-
RE: Any recommendations for specialist Magento dedicated hosting in UK/Ireland?
I do not, but I work with another package, and one thing I would suggest: if you are targeting Ireland and getting enough traffic to justify a dedicated server, you might want to look into running on AWS. They have a datacenter in Ireland, and I am running some pretty large e-commerce sites on them, +4M visitors a week in size.
-
RE: Google search results
One thing you might be facing is Google tailoring the search results to what it thinks that you want to see. I would use Moz's rank checker tool to verify the results first, then act from there.
-
RE: Moz crawl issues: All pages keep resolving to our "cookies not enabled" page
Not an answer, but I am curious as to which framework you are using.
-
RE: Blocking entire country?
I am guessing the machine has some type of management software on it. If you are on a static IP address, I would have the sysadmin set the port to only accept connections from you, them, and the local connection from the sysadmin in the data center.
-
RE: Are Duplicate Bio's Duplicate Content?
I personally would not worry about it. The content on the other sites will dilute the bio content enough that everything should be fine. Plus with a bio you want consistency.
-
RE: Anyone using CloudFlare on multiple sites?
Since this seems like a pretty active site, what is the GA page speed tab showing since the switch?
As far as the threats blocked go, they just blacklist IP addresses; they are not usually actual threats. Think about when you run a virus scan and it says it protected you from 125 threats, and then you see it is talking about tracking cookies. Most of the "threats" are known TOR exit nodes. Some people use those for malicious things, but others use them because they can access content that is blocked in their country.
As far as the file requests, it does help shed file load, which is good, but that is what a CDN would do as well. Neither of them caches page requests, so they do not change the actual processing and page generation time of the site; they just change the asset loading time.
-
RE: Anyone using CloudFlare on multiple sites?
I personally do not like Cloudflare. All is fine and well until a site on your shared DNS starts getting DDoSed; then it will affect your site as well. I tend to optimize a site how I want it and then use MaxCDN as a CDN network. Then you do not have to worry about compatibility issues either, like mentioned above. I do keep a Cloudflare account, however; I use it to move in front of sites that are actively being attacked.
One thing I don't care for is that Cloudflare tracks your users and is basically building up a tracking database. If you look at your served resources, there are a couple of Cloudflare cookies for this purpose.
-
RE: What could have caused Total links to drop from 860,000 to 93,127 in Open Site Explorer?
Do you remember any of the sites / pages that were being linked to? Are they still there? It could be a natural thing, or it could be an index issue with Moz. Once I posted a comment on a site and it was considered a top comment, so they kept it in the sidebar of their site for like 2 months; their site had like 30k pages. You might be having an issue like that. Is the number of linking domains also down by a lot, or is it just the number of links? A site could have refactored how it handles duplicate content, so that when Moz crawled it, it did not pick up all of the links from that site.
-
RE: Why can I only add 3 competitors?
I would be careful with that; I would pick good-quality links over anything else, personally.
-
RE: Why can I only add 3 competitors?
One thing I tell my clients is "Don't fish too big." The way that I do things is I shoot for the competitors that are close by, pass them up, then make a new list. I do ecommerce, and a lot of my clients think that Amazon and Etsy are competitors. I have to tell them that they are not; they are market giants that have thousands, if not tens of thousands, of backlinks built daily.
-
RE: Why can I only add 3 competitors?
That is just the way it is; that is how Moz works. If you are only monitoring one domain in your account, I think you should be OK setting up multiple campaigns with the same domain and monitoring more competitors that way. I haven't tried this, but it seems like it would work.
-
RE: We sold our site's domain and have a new one. Where do we go from here?
If you were not able to set up any redirects on the old domain, it won't be so much a decline as it will be starting over from scratch. Were you able to put any 301 redirects on the old domain that the new owners will leave in place?
-
RE: Can someone interpret this entry in my htaccess file into english so that I can understand?
The first two lines are the condition. They say that if anyone comes to legacytravel.com or www.legacytravel.com, then it looks for the string carrollton-travel-agent; if that string is found, it rewrites the URL to http://www.legacytravel.com/carrollton-travel-agent. I don't really know why that rule is in place, as I am not familiar with the site.
You can learn what the symbols in the htaccess file mean here: http://perishablepress.com/stupid-htaccess-tricks/
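For reference, a rule of the kind described above might look like the following. This is a hypothetical reconstruction; the actual file on the site may differ in its exact patterns and flags.

```apache
RewriteEngine On
# If the request came in on the bare (non-www) host...
RewriteCond %{HTTP_HOST} ^legacytravel\.com$ [NC]
# ...301 this page to its canonical www URL
RewriteRule ^carrollton-travel-agent/?$ http://www.legacytravel.com/carrollton-travel-agent [R=301,L]
```

The condition on the host keeps the rule from looping, since the rewritten URL already lives on the www hostname.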
-
RE: Blocking entire country?
If you do not block Googlebot IP addresses in the process, you will not get penalized. But remember, Google does not post a list of their bot IP addresses, so be careful and monitor your GWT for error messages.
On a side note, what kind of attacks are you getting? There might be better ways to block them.
-
RE: Ticket Industry E-commerce Duplicate Content Question
Different title tags do not keep pages from registering as duplicate content. They help, but duplicate content is figured by how much of the on-page text is the same from page to page.
-
RE: How to avoid duplicate content on internal search results page?
It really depends on your developers and your budget. I do development and SEO, so this is how I would handle it. For searches that return just one result, I would put something in place to check how many results came back; if it is only one, then in the head of the page I would set the canonical URL of the search page to the actual page being returned as the result.
If more than one result is returned, you can handle that in many different ways. One way would be to create a pseudo-category out of the results page. I would use this sparingly and only for popular search terms. You could have an extension written for your site that gives you some on-page control of the text, the URL, the meta areas, and things like that. I wrote a module for a platform I use a couple of years ago that does something like this: http://blog.dh42.com/search-pages-landing-pages/ You can get the gist of the idea by reading about it there; it is one good way to handle a limited number of them so they rank better. I would not do it with every search result, though; you might get a penalty.
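The single-result idea can be sketched like this. It is a minimal, framework-agnostic sketch; the function and dictionary key names are hypothetical, and you would wire it into whatever renders your page head.

```python
# Sketch of the single-result canonical rule (names are hypothetical).
def canonical_for_search(results, search_url):
    """Return the canonical URL a search results page should declare.

    Exactly one match: canonicalize to the matched page itself.
    Anything else: the search page canonicalizes to itself.
    """
    if len(results) == 1:
        return results[0]["url"]
    return search_url

def canonical_tag(results, search_url):
    # Emit the <link> element to place in the page <head>.
    return '<link rel="canonical" href="%s">' % canonical_for_search(results, search_url)
```

With one result for `/search?q=red`, the tag points at the product page instead of the search URL, so the search page itself never competes in the index.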
-
RE: How to avoid duplicate content on internal search results page?
Sorry, I misread it. I think either the robots.txt or the on-page approach is applicable. I think the on-page route would make them fall out of the index faster, though.
-
RE: Cloaking/Malicious Code
Great, good luck with things. You might be able to use the timestamps on the files, in conjunction with the server logs, to determine when and how the modifications were made.
-
RE: How to avoid duplicate content on internal search results page?
I would add it to the robots.txt file. Depending on how your CMS is set up, you can grab the search string from the current URL and use its presence to fire a noindex as well. I wouldn't do a nofollow, however; there is nothing bad about following them, it is just the indexing of the search pages you want to stop.
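As a sketch, assuming the search pages live under a /search path (adjust to however your CMS builds search URLs), the robots.txt entry would be:

```
User-agent: *
Disallow: /search
```

One caveat worth knowing: if a URL is disallowed in robots.txt, crawlers cannot fetch it to see an on-page noindex tag, so in practice you generally pick one approach or the other rather than relying on both for the same URLs.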
-
RE: How to avoid duplicate content on internal search results page?
No-index your search results. Most platforms do it by default to eliminate that error.
-
RE: Cloaking/Malicious Code
If you think your site has been compromised, the tool I always use to check a site is Sucuri: https://sucuri.net/ I would advise you to change all logins and passwords, as well as update any CMS you are using to the latest stable version.
-
RE: I have a blog on a sub domain, would you move it to the root domain in a directory?
Thanks guys, you have pretty much confirmed what I thought. It looks like I have a fun weekend ahead of me, redirecting and testing things out. But it will be a good notch on the belt, and we will see how the traffic goes.
One thing I wanted to ask about: this is not the case in my instance, but would the recommendation be the same if the subdomain had a higher PR than the naked domain?
-
I have a blog on a sub domain, would you move it to the root domain in a directory?
I have a blog that performs fairly well on a subdomain, but after reading a post that Rand made in the Q&A, I am thinking about moving it to the main domain in a subdirectory. What are your thoughts on this? Here are some stats: the blog currently gets about 5x the traffic of the main domain; the domain is older, with a 2008 creation date; and they pretty much rank for the same keywords.
-
RE: Should publish as page or blog posts on Wordpress ?
I guess it depends on how it is going to fit into your navigation. If you are going to have a section of your site dedicated to it, I would publish it as a page. If it is not going to be accessible from a menu, I would make it a blog post.
-
RE: Newsletter Optimization Help - Anyone know someone?
If I were to make a suggestion, you might mention what industry they are in. I know a lot of us SEO / marketing people have industries that we are stronger in than others.
-
RE: How does user behaviour signalled at Google affect rankings?
I have my own views, but I would say it is very important. From what I have seen, a site running GA that starts getting more traffic from other sources will rise in its Google rankings.
-
RE: URL parameters affecting link juice
In that case, use a canonical URL and you should be fine. Have the canonical point to the page with no query string on it. You had me worried: I thought you might have a platform that needs the parameters to show products, and then you mentioned you took them out in GWT. I have seen clients do that on platforms that need the query strings and de-index all of their product pages.
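Deriving the canonical target is just a matter of dropping the query string. A minimal sketch, assuming the parameters are purely tracking/sorting noise and the bare path serves the product on its own:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_without_query(url):
    """Strip the query string (and fragment) so every parameterized
    variant canonicalizes to the clean product URL."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))
```

That clean URL is what goes into the `<link rel="canonical">` tag on every variant of the page.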
-
RE: Site Speed, is it worth it from a SEO point?
In short, yes. I cannot remember which article or video I saw it in, but one of the guys from Google was interviewed, and someone said the term "search engine" to him. He replied that Google likes to think of itself as an "experience engine." That being said, they strive to surface the sites that provide the best user experience, and a slow site does not provide that. I would be willing to bet, 10 times out of 10, that of two sites equal in every other aspect, the faster one would rank higher.
-
RE: URL parameters affecting link juice
Can you be more specific? With all of the different ecommerce systems out there, most of them handle URLs differently. Are you saying that your ecommerce platform produces URLs like site.com/product.html that take you to a product page, or does the platform need the URL parameters to figure out which product to display?
-
RE: Duplicate content, which seems not to be duplicate :S
You are going to have a hard time getting your pages indexed with a feed; there is no telling how many other companies use the same feed with the same descriptions. The best solution is to write your own custom content, or at least try to add some to it.
-
RE: Duplicate content, which seems not to be duplicate :S
I would shoot for under 80% of the same content. Yes, it is something that a lot of ecommerce sites deal with, but there are ways around it. Are you using a feed by chance?
-
RE: Duplicate content, which seems not to be duplicate :S
It is seen as duplicate content because the ratio of shared text between the two pages is very high. You only have 10-20 words for each product description, but you have a couple hundred words of on-page text that is the same on every page.
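You can get a rough feel for that ratio yourself. This is an illustrative sketch, not how any search engine actually scores pages; the sample texts are made up:

```python
from difflib import SequenceMatcher

def shared_text_ratio(page_a_text, page_b_text):
    """Rough 0-1 ratio of how much visible text two pages share."""
    return SequenceMatcher(None, page_a_text, page_b_text).ratio()

# Long identical boilerplate swamps two short unique descriptions
boilerplate = "Welcome to our store. Free shipping on every order. " * 5
page_a = boilerplate + "Small blue widget, 3 inches."
page_b = boilerplate + "Large red widget, 5 inches."
```

With only a sentence of unique copy against paragraphs of shared template text, the ratio comes out well above 0.8, which is the situation being described above.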
-
RE: Date of page first indexed or age of a page?
If the page has not changed since it was indexed, you can do an advanced search in Google and select a large date range; it will then show the index date next to the result. Be aware, though, that if the page has changed since it was published, it shows the last update date instead. So you can also narrow it down that way, by adjusting the years.
-
RE: Best way to handle blogs for a customer with a chain business?
A situation like this is very difficult, in my opinion. One thing I was unclear on: does the client own the brand, or are they just a franchisee? That makes a big difference in what I would suggest.
-
RE: One Location - Multiple Businesses
I would test the mail system and see if it works by subdividing the suite. Say the address is 299 East Main St, Suite 301; I would try something like 299 East Main St, Suite 301-A and see if the mail still makes it. If it does, I would use that to delineate the different businesses.
-
RE: Best way to handle blogs for a customer with a chain business?
Interesting situation; let me ask a couple of questions to get a better idea of it. Are all of the domains area-specific, like gymvirginiabeach.com, gymwashingtondc.com, etc.? Or is there one main name, like gymsite.com, with variants like gymsitevirginiabeach.com? Also, is this a franchise, or do they own the gym brand?
-
RE: Anyone know of any forums for agencies or those individuals engaged in Internet Marketing, SEO, Integrated Marketing, etc.?
I personally think there is a lack of them, and I would be interested. I do not know if you are familiar with eCommerceFuel, but they have a forum where everyone is screened and pays $25 a month to be a member. I would be interested in something like that, because it keeps the knowledge level high, in my opinion.
-
RE: Site Disappeared For Exact Match Search?
It is number 7 on the first page for me, http://screencast.com/t/AUqwSdKvCO2
-
RE: Magento OR OpenCart OR osCommerce OR Zen Cart OR WP e-Commerce OR WooCommerce
I am sorry you feel that way, but you are wrong. It looks like Woo only supports around 125 payment methods (gateways and offline-type methods such as P.O.s), with only 14 free ones, which include just one top-tier US payment company (Amazon).
Check out something like PrestaShop. Between the main site and all of the 3rd-party merchant sites, they support around 300 different gateways and methods, with most top-tier US gateways being free, such as Auth.net, BluePay, First Data, PayPal Business, PayPal Advanced, etc. So while Woo does have good payment support, it is costly and does not have nearly the best coverage.
-
RE: I am managing an existing ecommerce website and just subscribed to the Moz tools - what is the best route to learning how best to leverage all the tools to optimize my site?
Shameless plug, but I did an interview with Rand Fishkin from Moz about SEO for ecommerce sites; it was published last week. You might find some tips here: http://www.prestashop.com/blog/en/seo-expert-series-rand-fishkin-of-moz/
-
RE: SEO: open source e-commerece vs. off the shelf
SEO would not be my main concern if it were me. I would choose the platform that suits you best on other factors. Honestly, best SEO practices can be followed on just about any platform; you just have to find the one that fits your needs. It is like car shopping: they will all take you to the places you want to go, but you still have requirements outside of that.
-
RE: Is this the right RSS Feed address for our blog?
It is; that is how an RSS feed looks.
-
RE: Getting Different PA/DA for 'www' and 'non-www'?
From what I have personally seen, Moz calculates DA based on the whole domain and passes the same DA to all of the subdomains of the site. That would explain why the DA is the same for both URLs, since www is considered a subdomain.
For the PA, what it means is that, more than likely, your site is accessible from both the www and non-www URLs. This creates duplicate content, and at the same time it splits the page's authority, because Google will see them as two different pages with the same content. Some users might enter with the www and some without, and when those people link to your site, some will use the www and others will not. So you are basically creating two different link profiles for the site.
I would add a canonical tag to the site if you are going to keep it this way; then search engines will know which version is preferred. Ideally, you should put a redirect in place and choose to go with the www or the bare domain, but not both at the same time, and then add a canonical tag too for good measure.
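The redirect half could go in .htaccess like the following sketch, which forces the www hostname. example.com stands in for the real domain, and you would flip the logic if you prefer the bare domain:

```apache
RewriteEngine On
# 301 bare-domain requests to the www hostname
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With that in place, only one hostname ever serves content, so links and authority consolidate on a single version.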
-
RE: Can Image File Names be Masked?
By masking, are you talking about rewriting the URL? If so, they will stay in the index. You can also send a canonical header with your image files; Google will respect that as well.
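On Apache, the canonical header for a file can be sent with mod_headers. This is a minimal sketch with made-up file and URL names:

```apache
# Requires mod_headers; file name and URL are illustrative only
<Files "masked-name.jpg">
    Header set Link '<http://www.example.com/images/original-name.jpg>; rel="canonical"'
</Files>
```

That tells Google which URL is the preferred one for the image, even when the file is served under a rewritten name.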
-
RE: Can I use a fake email address for Moz Local submission? This is for a medical practice, that has been advised not to publish an email address due to HIPAA concerns.
I think, if it were me and my client, what I would recommend is getting a HIPAA-compliant email hosting account and creating a submission email with it, then setting an autoresponder on that account telling people it is not checked. I might also try to put a disclaimer somewhere in the listing asking people not to email medical issues.
You can only hurt your business so much by being compliant with people's stupidities.
-
RE: My website is not available, will I lose ranking?
This is straight from the horse's mouth, so to speak: https://www.youtube.com/watch?v=4eYJuT0yGrI
-
RE: Are Subdomains better or SubDirectories better for an international website ?
When you use a subdomain, search engines see it as a separate domain, so it is harder to build up authority across multiple subdomains than it is with subdirectories. I would go with subdirectories if I were you. But depending on how big the brand is, I would look at ccTLDs too.
-
RE: What's the best way to eliminate "429 : Received HTTP status 429" errors?
Contact your host and let them know about the errors. More than likely they have mod_security enabled to limit request rates. Ask them to raise the limit, explaining that you are getting 429 errors from crawlers and you do not want them.