Latest posts made by eyepaq
-
RE: Huge increase in links to your site when moving to SSL
Most likely a reporting issue. But it's a good reason to dive into those links (in bulk, as a link profile audit), see when they were added, and check whether you can improve some of them.
-
RE: Bit.ly Links & Google
Does Google treat bit.ly links differently?
** Yes, but the difference is small.
Bit.ly links are fine because they use 301 redirects, and although the official line is that a 301 doesn't lose any PageRank, in practice a little is lost - whether from the 301 or something else doesn't really matter, but some is lost. If you pick up some of these links, that's fine - don't worry about it. Building a link building strategy on them is another story - it can work, but it's a very grey area.
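If you want to check one yourself, a quick header request shows the 301 - a minimal sketch, where the short code is a made-up placeholder:
    curl -I https://bit.ly/example123
    # typical response headers:
    # HTTP/1.1 301 Moved Permanently
    # Location: https://www.example.com/your-page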
-
RE: How do the powerful links of an old domain help my new domain?
Those things still work - to some extent. Google will however drop some or even most of the links, but some do stick. If you can diversify the targets it will help even more.
If there is no relevancy though (between the links, the old domain and the new domain/pages), you won't see any effect.
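If this is being done via a 301 of the old domain (which is the usual setup), a rough sketch assuming Apache, with made-up domain names:
    # .htaccess on the old domain (olddomain.com / newdomain.com are placeholders)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]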
-
RE: Should I noindex?
There is no penalty for duplicate content (content taken from another site and posted on yours). Google has gone on record several times on this subject. But you won't rank with that content anyway.
Now, if your site has 10-100 pages and you bring 15k in, you will change the topicality of the site a lot with thin content that doesn't rank, and dilute everything you have - and that is not good. If your site already has 200k pages indexed and you add 15k, I wouldn't really worry about it.
To play it "safe" you could indeed noindex them. They will still be crawled if they are linked, though. If you want to cut them out and also save some crawl budget (if the site is new and doesn't have that many pages), I would push all of those 15k pages into a separate folder in the structure and disallow that folder in robots.txt, as sketched below.
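A rough sketch of both options, assuming the imported pages sit under a folder named /imported/ (the folder name is just an example):
    <!-- option 1: noindex on each imported page -->
    <meta name="robots" content="noindex, follow">

    # option 2: robots.txt at the root of the site, blocking the whole folder
    User-agent: *
    Disallow: /imported/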
Take into account that linking to those pages and having them indexed is also a signal for the pages that link to them (good or bad). If they are on the same topic, it will help the pages linking to them.
If you can find the time and energy to improve this content, even if it's the same as on other sites - find a way to add some value (in how you present it, with resources, stats, add-on content etc.) - those pages can turn into a good set of landing pages.
my 2c. Hope it helps.
-
RE: My competitor pushed me down the SERP out of nowhere. How do I fix this?
Can you share the query?
-
RE: Do I need a separate robots.txt file for my shop subdomain?
You will be fine without one. You only need one if you want to manage that subdomain: adding specific XML sitemap links in robots.txt, or cutting access to specific folders for that subdomain.
If you don't need any of that, just move forward without one.
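For reference, if you do end up needing one later, it's just a plain text file at the root of the shop subdomain - a minimal sketch with placeholder paths and sitemap URL:
    # https://shop.example.com/robots.txt
    User-agent: *
    Disallow: /checkout/
    Sitemap: https://shop.example.com/sitemap.xml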
-
RE: Subdomain replaced domain in Google SERP
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't get to dev anymore, you should be safe.
Adding both noindex and password protection is not needed. Since it's password protected, Google won't get to see the noindex on the pages, so you only need one of the two. No need to change anything now - the password protection is enough.
As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right?
** Correct, that's not possible - so you are good.
Only 301 redirects are "mandatory" for Google to pass equity, so all good.
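For reference, the password protection mentioned above is typically just basic auth on the dev host - a rough sketch assuming Apache, with a placeholder path to the password file:
    # .htaccess on the dev subdomain
    AuthType Basic
    AuthName "Dev - restricted"
    AuthUserFile /path/to/.htpasswd
    Require valid-user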
-
RE: Subdomain replaced domain in Google SERP
noindex would be the easiest way.
I've seen some people with the same issue fix it by adding a rel=canonical on the dev pages pointing to the live site, and the main site came back step by step with no interruptions...
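Something along these lines in the <head> of each dev page, pointing to its live counterpart (the URL is a placeholder):
    <link rel="canonical" href="https://www.example.com/same-page/" />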
Cheers.
-
RE: SEO / Word Press feature
Thanks for the answer.
It does indeed, but it removes the post if it is scheduled for a future date. And if that date is 6 months from now, we lose it until it gets reposted.
We are looking for a solution where a post can be scheduled for some time in the future but stays online until that date comes.
-
RE: SEO / Word Press feature
Just to be clearer: we want to edit and improve older articles with fresh new content and updates, not just change the date, but we also want to keep the old URLs, as we have a lot of social exposure and direct links on those. (And yes, we did consider a 301 as a solution, but we would lose the social shares.)
So does anyone know a solution or a plugin that can do that? (If we take a post from 2007 and schedule it for Dec 2015, we lose the old one until it gets published in Dec 2015.)
Thanks !
Best posts made by eyepaq
-
RE: Link building strategy - black hat or white hat?
I am doing the same thing for several projects. If done with extra care, there is nothing wrong with it.
There is no black and white hat SEO - there are no clear rules - we're all cowboys.
There is nothing truly white about anything we do - [natural links] are the only ... white thing, and nobody sustains those; they are just natural, they happen. If you had the best site ever you wouldn't need SEO - with SEO you improve the traffic and make it look natural while you are in fact forcing it.
If you have good domains you can build quality content sites and use solid, non-aggressive link building techniques towards your main site. You must however host those satellite domains on different hosting accounts (different IPs).
In my opinion this strategy works, and works great. It will give you more options in the search results - options that you can control - and it will lift your main domain because of the links that will point to it.
However, it is time consuming, as you need to do SEO sessions for each satellite, but it is worth it, especially if you have great domain names for those satellite websites.
There are a few things you should consider (again, this is just a suggestion):
1. Don't host the satellites on the same IP class.
2. Don't use a single page / landing page that just links to your main domain.
3. Treat them as independent projects, with separate accounts on social networks and blogs, and submit articles to Digg or wherever under each site's name.
4. Build links to those satellites - you can risk some shadier techniques, as the risk is not as high as with the main brand.
5. Don't place junk or duplicate content - but if you do, make sure it's from your competitor, as maybe you will outrank them in the future. Just kidding on the last part... or not?
I don't think this technique can be called spam if you do build good content sites on those satellites and use the juice to boost your main brand.
-
RE: Spammy page titles and the consequences
It makes no sense to go with your friend's approach.
You won't get a penalty for it, but it certainly won't help you either.
Brand at the end? It's a choice - personally I always place it at the end, as I think it makes sense.
Having the duplication and the small variation between the two phrases? That doesn't help.
I would go with "Bars in New York | Brand" if that makes sense with the page and all.
What you could mix up is having a slightly different h1 if possible, as Google doesn't like to be told what to rank and based on what.
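For example, something along these lines (the wording is only an illustration):
    <title>Bars in New York | Brand</title>
    <h1>The Best Bars in New York</h1>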
Hope it makes sense. Hope it helps.
-
RE: Use of subdomains, subdirectories or both?
Hi,
Since it said that search engines treat subdomains as different stand-alone sites,
** Search engines could treat subdomains as different stand-alone sites. Usually they don't. A good example of when they do is Blogspot subdomains - and that is because those are in fact separate stand-alone sites.
In your case however, since it's about user profiles - somewhat similar to Blogspot - Google could at some point treat them as separate websites.
What's best for the main site? To show multiple search results with profiles in subdomains or in subdirectories?
** My personal opinion is that from a user point of view you should go with the subdomains as you do right now - it makes sense, and it's easy for users to use those URLs, link to them and so on.
You could lose some link equity for the main domain if some or all subdomains are at some point treated as separate domains, but if you put everything on a scale, it still balances in favour of the subdomain approach.
What if I use both? Meaning in the search results I use the directory URL for each profile, while at the same time each profile has a subdomain as well? And if so, which one should be the canonical?
** To be honest I don't see the point / advantages in doing that.
One other advantage is that if you go with subdomains and Google counts them as separate websites, then if one of your users does something stupid (trying to rank with it and building gambling and porn links to that subdomain), you will be safe with the root domain and the other users won't be affected.
Hope it helps.
Edited to underline the word could
-
RE: Single-words high keyword density. How many is too many.
In my opinion you should ditch the keyword density measurement in your SEO approach - it doesn't make sense to focus on it when you can focus on more important factors that will actually help you rank.
Now, the keyword must be in the content - that's for sure - but you just need to have the keyword present and maybe use it in the URL, in the title and in the h1 if it makes sense. Never go overboard and don't stuff the page with the keyword; you are going way too far in trying to hit a magical % density number for a keyword in order to rank.
Have some internal links with the keyword in different forms pointing to the page you want to rank. Have some good, non-spammy external links to that page as well.
Don't over-optimize, as this can harm you - over-optimization is worse than no optimization at all.
The bottom line is that, in my opinion, your focus is on an unimportant element - and hitting the magic number won't help you rank anyway.
Hope it helps.
-
RE: Competitor scraped ecommerce product overview
Hi,
A lot of higher authority sites (I am not talking about the NY Times, just sites that have higher authority than others in a particular space) copy lower authority sites and get credit for it from Google.
In your case - does your competitor have more visibility and authority in general? Was the copying done soon after your client released that particular page? If both answers are yes, then there is a danger there. If not, there is nothing to be concerned about.
Most of the time, like you said, Google's algorithm is intuitive enough to figure out where the original content came from, but it's not perfect.
-
RE: Article linking to my post kicks it from the search results.
Give it some time. It's common for a high domain authority link to "sink" the target for a short period of time. Google even has a patent related to this. It should pop back up - it can take 30 days or so, so you still have some time to ... just wait.
That's just one explanation for this - it could be something else too, though ...
-
RE: Why does my competitor rank so well with so many paid/traded links?
I understand the frustration, and in most cases it is the way you are painting it, but you are also subjective, and with that amount of links it is very hard to fully understand your competitors' link profiles and your own.
What I think is important to understand, in general, is that quality matters over quantity. Maybe 80% of your competitor's links are not even counted, and there is a small set of powerful links in their profile that is actually making the difference - but again, it's hard to really assess the situation.
You can however go their route and hunt the same links, but if you do so and pick up low quality links, you might end up in a hard corner in the future and have to start removing them - especially with Google's crusade on links. You might get a short or even medium term advantage and go head to head with your competitor, but does it pay off in the long run? I think not.
I would rather go for only a few really powerful, smart links to increase your visibility than go fishing with TNT.
As for going on a crusade to hunt down your low quality nofollow links - in my opinion, that is a waste of time.
Those links, being nofollow, won't affect your link profile and equity in general, positively or negatively. The anchor text of those links is indeed taken into consideration by Google to better understand your website, product and brand, but you will need to weigh very wisely (if those anchor texts are bad) whether hunting them down is going to help you. (To be honest, I think even if the anchors are bad it doesn't make sense to waste time on them.)
Just a thought ... or several.
Hope it helps.
-
RE: Is there anything wrong with having duplicate description tags if they are relevant to their pages?
No - if those make sense, there is no issue with the duplication. (Actually, even if they don't make sense, there is no issue with the duplication.)
There is no penalty or filter for duplicate description tags - they are not part of any ranking factors anyway; they are just for CTR, just for the user.
Hope it helps.
-
RE: Google Instant Search? REALLY?! Why is this the result???
Come on! That's not offensive. It's based on what people search for. Do you want fake suggestions?
Try searching for:
americans are ...
germans are ...
austrians are ...
Then you will see some offensive suggestions
-
RE: Google Instant Search? REALLY?! Why is this the result???
Same from here in Austria.
Same from Germany (just pinged a friend from Germany to see his results).
It's world wide.
kicking since 1979.