Adding more internal links just so the link juice isn't diluted over a single link would be bordering on black hat SEO... I'm sure it would be seen as spam. A nofollow is enough. Still, a directory of only 7 sites without any inner pages is useless.
Posts made by FedeEinhorn
-
RE: Will adding 1000's of outbound links to just a few website impact rankings?
Couple of questions:
The Website is a directory and yet it points to only 7 outbound Websites?
What about using nofollow for all those links?
On the content side, you are about to lose much of the site's content, so you should expect a massive traffic drop. What's the point of a directory if it only links to 7 Websites without offering any extra valuable content?
-
RE: Significant drop in traffic
Ufff, took me a while to find this one, and I'm sure I saw another one talking about nameservers, too (back when I was researching the SEO impact of using Cloudflare, you know, hundreds of sites using the same nameservers).
-
RE: Significant drop in traffic
Whois data, nameservers, IPs, etc., are no longer used to rank Websites, according to Google (you can't penalize a site just because another one on the same infrastructure stepped into a dark area; lots of sites use private whois, reverse proxies, etc.).
If you had a penalty on one of your sites, are you sure you didn't make the same mistake on the other sites?
-
RE: Twitter stats
Those metrics actually show the number of times that exact URL was tweeted. If you publish a tweet with yoursite.com/targetpage, that tweet will count towards that page but not towards yoursite.com.
In any case, no one knows the extent to which social signals affect ranking algorithms... I don't think you are going to get a higher position just because you have more tweets... (all those links are nofollow anyway).
FYI: If you put a tweet button on a page, you can "tell" the button which URL you want to tweet, or it will use the page where it is placed (the same applies to the count it shows).
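For reference, a minimal sketch of a share button that is "told" which URL to count (the URL is just a placeholder; data-url and data-count are the attributes Twitter's widget uses for this):
<a href="https://twitter.com/share" class="twitter-share-button" data-url="http://yoursite.com/targetpage" data-count="horizontal">Tweet</a>
<script src="https://platform.twitter.com/widgets.js" async></script>
Leave data-url out and the button (and its counter) falls back to the page it is placed on.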
-
RE: How to handle a pure spam penalty (from GWT) as a blogging platform
404s. Remove them from existence.
Why would you have content that is pure spam on the site? If it is spam, delete it.
-
RE: How to handle a pure spam penalty (from GWT) as a blogging platform
Steps you should follow:
- Clean ALL the blogs, remove any trace of spam (document everything in the process)
- Go back to the first point and make sure you have NO SPAM left (again, if anything comes up, document the changes you make)
- Once you are completely certain that there's no spam left, you can send another reconsideration request; make sure you show them the work you have done to clean the site.
- Wait for their response, and if you still get a rejection, repeat the process, as most likely you still have spam on your site.
Hope that helps!
-
RE: Spam Penalty
Is there anything on your site that isn't available on other sites? You mentioned that you developed some of the games, but they are also listed and available on other sites, so there's no benefit to you there, either. You are listing the same games as, most likely, hundreds of other Websites.
You need to use another approach, what makes your site unique? Why would Google list your site instead of others?
-
RE: Pages with Temporary Redirect (CTA)
Why is it that those pages have redirects? I'm guessing you are pointing to them (with a 302 redirect) from another page instead of linking to them directly.
To solve that, you can simply identify those redirects and change them to 301s, or, even simpler, link to the appropriate page directly instead.
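If the site runs on Apache, a 301 in .htaccess can be as simple as this sketch (both paths are placeholders):
Redirect 301 /old-page http://www.yoursite.com/new-page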
Hope that helps!
-
RE: Penalty? NoFollowed Link Profile
The way you are managing the widgets is just fine; those links should be nofollow, as they are now.
If you haven't received a manual penalty notification, it could be just as you said: a new site with few links. Keep working on it, post content, etc., so you can earn links the right way.
Hope that helps!
-
RE: Schema Training
Perhaps this can help: http://www.searchenginejournal.com/schema-org-eight-tips-incorporate-rich-snippets-conform-google-hummingbird/75435/
You can also read all about schema here: http://schema.org/docs/gs.html
-
RE: What is the best way to do a one time rankings check of 10000+ keywords
There are several other sites that are entirely dedicated to checking keyword rankings. But no matter which one you choose, checking rankings for 10,000+ keywords will be expensive as hell.
Examples: serps.com, trackpal, positionly, etc.
-
RE: Moving image directory location on redesign.
If the images are being indexed just fine, even though they are "four folders deep", why would you change that? It would be like trying to fix something that is working perfectly, and you may end up with something you didn't expect... (re-indexing all the images could remove them from the index for a while... it won't do any harm, but as you said, they are just fine where they are now).
Hope that helps!
-
RE: How to find keywords in super-small niche?
Have you tried Ubersuggest? http://ubersuggest.org/
-
RE: Need help with Title tags
It seems that you have lots of plugins that could interfere with the real results. Google is picking up your title and description tags accurately. What you are seeing is a local listing, and you can edit that listing to help Google better describe the business. But to answer your question: Google is showing the correct title/description tags.
-
RE: 301 redirects - an ongoing argument in our agency
It's the same thing! Both result in a 301 redirect...
The only difference is that it's probably a little faster at the registrar level (unnoticeable), plus the .htaccess option requires a server. Besides that, there's no SEO benefit to either.
-
RE: Responsive websites rank better?
It isn't true; responsive sites don't rank better per se. People tend to say that because search engines like to show useful results to their users; therefore, when a user searches from a mobile device, the engine will most likely rank the site that is better suited for a mobile device first (regardless of whether it is a responsive design or a separate mobile site).
Suppose a mobile search returns 2 sites that are equally worth reading, but only one has a mobile version (whether responsive or a dedicated mobile site). Then the engine may rank the mobile-friendly one higher, as it is better suited for the device the user is using.
More info in the following Google Webmasters Help video: http://www.youtube.com/watch?v=D03wRb4s7MU
-
RE: Https://www.mywebsite.com/blog/tag/wolf/ setting tag pages as blog corner stone article?
The idea of linking words to the tag page helps engines and people find related content. However, it is generally recommended to noindex tag pages, which means search engines won't index those pages, giving more exposure to the pages that really matter, the ones with the content.
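A minimal way to do that, assuming you can edit the tag page template, is a robots meta tag in the <head> of each tag page:
<meta name="robots" content="noindex, follow">
The "follow" part keeps the links on the tag page crawlable while the page itself stays out of the index.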
-
RE: How many articles are ok to publish a day on my website blog?
There's no such thing as a maximum or minimum. All that matters is WHAT you publish. It is better to publish one excellent post a week than 10 posts of useless content a day.
-
RE: How well do G.A. utm Campaigns play together?
I wouldn't use a utm_campaign within banners on my own site. Even with paid advertising, sometimes you don't need to use those variables, as GA is intelligent enough to recognize the source as paid/organic.
Let's say you advertise on Twitter, AdWords and Bing. GA automatically recognizes AdWords traffic, so there's no need for utm_campaign there; on the other hand, you do need to set those variables to differentiate paid from organic traffic coming from Twitter and Bing.
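A tagged landing-page URL for, say, a Bing or Twitter ad could look roughly like this (the domain and values are placeholders):
http://www.yoursite.com/landing-page?utm_source=bing&utm_medium=cpc&utm_campaign=spring-promo
utm_source and utm_medium are what GA uses to separate that paid traffic from the regular organic/referral visits; utm_campaign just groups the ads for reporting.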
Once you have a Goal created in GA, you can check the Goal Flow and see where conversions are coming from. You can even set up funnel steps inside GA and track the pages that users browse before converting.
Also, if you send ecommerce data to GA, you will be able to track which sources produce the most revenue by using the ecommerce reports (since you book doctor appointments, you can treat each appointment requested as a "sale") and then take advantage of all the ecommerce data.
-
RE: Video Sitemap Error
No, it shouldn't mess with the snippet. Snippets aren't created from the content of the XML sitemap, but from the video markup on the site. The sitemap only allows Google to easily identify the videos on your site.
As for the robots.txt question, there's no need to specify the video sitemap location; you can if you want, but that won't give you any "boost", regardless of whether the videos are YouTube-hosted or not.
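If you do want to list it there, it's just one extra line in robots.txt (the URL is a placeholder):
Sitemap: http://www.yoursite.com/video-sitemap.xml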
-
RE: If you have a product on your site that's only available in the US, is there a way to avoid it leading to a 404 error if a user in Canada accesses it?
Take the example of Amazon: there are several products that they do not ship outside the US, and just adding a notice on the product page should be enough. Try geolocating the visitor's IP and then adding a message on the product page specifying that the product is only available in XX.
You can even go a little farther by showing related products that ARE available to that user, similar to the one he's looking at.
-
RE: Twitter account that copies all your tweets
I can bet there are several scripts around capable of doing that. I wouldn't mind so much though... you could try reporting or blocking the user to see if that prevents them from reading your feed...
-
RE: Video Sitemap Error
The value of the duration tag must be in seconds. In your example it should be "98". The duration tag accepts values from 0 to 28800 (8 hours).
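Within the sitemap entry that would look roughly like this (trimmed to the relevant tag; the rest of your <video:video> entry stays as it is):
<video:duration>98</video:duration>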
-
RE: What is a good tool for managing backlinks?
There are several tools that can help you, such as LinkDetox, Remove'em, etc. However, it is always recommended to go over the links manually, as only you know the real value of an earned link.
-
RE: What does Appropriate Use of Rel Canonical mean?
The canonical tag is used to tell search engine crawlers (Googlebot, Bingbot, etc.) that the page they are viewing should be indexed under the URL given as the value of the canonical tag.
Say the bot is crawling http://www.domain.com/page.html?var=1, a canonical there could have "http://www.domain.com/page.html" as the value to tell search engines that http://www.domain.com/page.html is the page they should be indexing.
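In the <head> of the parameterized page, that tag would look something like this (using the URLs from the example above):
<link rel="canonical" href="http://www.domain.com/page.html" />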
It does not guarantee that the engine will in fact index that URL, but it helps avoid duplicate content issues.
More here: https://support.google.com/webmasters/answer/139394?hl=en
-
RE: Bot or Virus Creating Bad Links?
Seems like your client's fault. Even though you said they swear they did not create the links, no negative SEO campaign would waste time or effort building links to a site that was just launched and isn't even ranking well.
Perhaps your client did buy the links without even knowing it; there are places where links are sold as "Be on the first page of Google" packages, so regular people do not associate that with spam links...
-
RE: Large number of Temporary Redirects
You are welcome! Glad I was of help!
-
RE: Large number of Temporary Redirects
If you don't have the site available in French yet, then you should find where the site is linking to the French version and remove those links; that will solve it without the 302 redirects.
If Mozbot is able to find those pages, it's because somewhere on your site you are linking to that nonexistent version.
-
RE: Duplicate Content
Have you created a Moz campaign for the site? Mozbot crawls your site and tells you about all the duplicate content issues that you may have.
To solve them, instead of checking and changing code all over the place, make the changes on the pages that you already know have duplicate content issues (like in the example you gave), then let Mozbot re-crawl the site so you can see which pages still have issues and fix those.
The rel canonical should point to the one page that has the most info (as you said the list view has less, so the grid view will be better as the canonical).
If your site uses several categories and subcategories, you should also have a look at the noindex tag, as that sometimes creates duplicate content issues too (subcategory products listed in the root category). The same applies to any kind of listing, such as search results (which should be noindexed).
Hope this helps!
-
RE: Competitive metrics
You can read about DA here: http://moz.com/learn/seo/domain-authority
-
RE: Where Google+ Local Gets Listings?
I'm almost sure Google does not buy databases; probably someone who knows the place added it, as you do on Foursquare (you search for a place while checking in and, if it isn't listed, you can add it yourself).
-
RE: CHange insite Urls structure
It will bring lots of issues. That's not a solution but a problem. Your IT team should work on another approach to solve the server load instead of that URL restructure.
Subdomains are considered different sites, so hundreds of links from several subdomains could well bring a penalty, and the way to avoid that would be to use a nofollow attribute on those links, which would hurt your SEO dramatically.
I am sure your IT team can solve the server load distribution issue using another approach (DB servers, file servers, a server cluster, etc.); if not, you'd better look for another IT team.
-
RE: Re-classifying a Traffic Source in Google Analytics
Are you sure the Analytics code is installed correctly? I've seen this issue in the past, and every time it was a bad installation of the code.
Example: https://productforums.google.com/forum/#!topic/analytics/DHACtWJE1_I
-
RE: How much domain authority is passed on through a link from a page with low authority?
It is, but no one outside Moz can answer that; there are several metrics that affect how authority is passed, and I guess most of those metrics and algorithms are Moz's property and most likely kept secret to prevent abuse.
-
RE: Un Natural Links Removal Strategy
I don't think you are taking a good approach...
Instead of noindexing the pages, or blocking those that have bad backlinks, you should fix the backlinks.
Run a complete analysis on the entire backlink profile, go over each and every link and try to get it removed by contacting the webmaster, and document everything you do in the process.
Once you are left with the links that you don't want and the webmasters didn't remove, create and upload a disavow file listing all of them.
Then you can proceed to send a reconsideration request, explaining to Google what happened, what you did to fix it, and attaching the proof of your work.
If you continue with your current strategy, you basically need to keep creating new pages and blocking every one that picks up bad links, and then wait for the penalty to expire (nobody knows how long that takes). Then, if and only if you have removed all the pages that have bad backlinks, you won't be penalized again, so this could take months or even years... it isn't a good strategy at all...
-
RE: Google adding strings to meta title in SERPs => Driving my client crazy!
Are you taking advantage of rel="alternate" hreflang="x"?
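For reference, a sketch of what that looks like in the <head> of each language version (the domain and language codes are placeholders):
<link rel="alternate" hreflang="en" href="http://www.yoursite.com/en/page.html" />
<link rel="alternate" hreflang="es" href="http://www.yoursite.com/es/page.html" />
Each version should list itself and all its alternates, so Google knows which title/URL to show to which audience.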
-
RE: Creative Commons Images Good for SEO?
It won't make your site look spammy if the content you are publishing isn't spam. CC images require you to link back to the original source, you can even use a nofollow attribute on those links.
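A typical credit line under the image could look like this (the author name, URL and license here are just placeholders):
Photo by <a href="https://www.flickr.com/photos/exampleuser/12345" rel="nofollow">Example User</a> on Flickr, CC BY 2.0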
But still, as the images are not yours, you won't benefit from image search, as Google will list the original image posted by the author instead of yours.
There are royalty-free stock photos that you can use, and they aren't that expensive if you are on a subscription. For example, Fotolia offers a subscription of 5 images for $25 per month, but you can download lower-resolution versions, which only deduct half a credit each, so you effectively get 10 images. Most likely you don't need the full-credit one, as the half-credit one is large enough.
PS: Here's a post from Ann Smarty about how to use CC images from flickr: http://www.seosmarty.com/flickr-creative-commons/
Hope that helps!
-
RE: Suggestions on Website Recovery
It is widely known that manual penalties do expire. If you have fixed the issues, cleaned up the backlink profile, and disavowed the links that were impossible to remove, then perhaps the penalty simply expired, and as you are no longer in violation of Google's quality guidelines, you haven't received the penalty again.
If, in another scenario, the penalty expired and you didn't do the cleanup, then most likely the penalty will be back to bite you in the a***.
I always heard that penalties do expire, but no one was able to tell how long it took; I think you are the first one who can verify that penalties expire and don't come back as long as the issue has been fixed.
Anyway, after a penalty is revoked or expires, it will take some time, probably a month or so, to see the changes.
From my point of view, you are headed in the right direction: building content to earn backlinks is the way to go.
Hope that helps!
-
RE: Best way to noindex long dynamic urls?
If you have a page that lists all the villas outside the search results, then you don't lose anything by blocking that folder in the robots.txt.
But still, somebody (the guy who wrote the custom theme) knows how to make the changes needed.
If you want, I can help you with it, for free. Just PM me (I'll need FTP access).
-
RE: Best way to noindex long dynamic urls?
If you have an /all-villas/ page, then you should go ahead and noindex the search results, as Google's guidelines suggest. You can either do it on the /property-search-page/ pages themselves or via the robots.txt file.
In the robots.txt, add:
User-agent: *
Disallow: /property-search-page/
The robots.txt method stops anything inside that folder from even being crawled (including /property-search-page/?whatever).
Or on the page /property-search-page/ you can add the meta noindex as such:
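<meta name="robots" content="noindex, follow">
(That's the standard robots noindex meta tag; it goes in the <head> of the search results template, and the "follow" keeps the villa links on those pages crawlable.)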
Then check that the meta tag shows up on all the search result pages (just view the source of a couple of them).
Hope that works!
-
RE: SEMRush Ads Traffic Price VS their PDF report
No problem. Keep checking SEMRush over the next few days to see if those numbers change.
But for now, you can report to your customer that Nest wasn't doing any noticeable advertising before the acquisition. They apparently are now, but those stats from SEMRush are just estimates based on public metrics; no one outside Google or Nest can actually give you an exact figure.
-
RE: Best way to noindex long dynamic urls?
Well, that makes it a little easier on one side and harder on the other.
You can try installing SEO by Yoast; that will add all the canonical tags for you. However, I think it won't point the search result pages to the canonical page that lists them all.
That might require a little coding.
If there's another page, outside the /property-search-page/ folder, that lists all the villas, then you can disallow that folder in the robots.txt file, and that should fix it. If there isn't, then you will need to edit the /property-search-page/ page to use a static canonical tag that points to the page listing all the villas, without any kind of filtering.
Hope that helps!
-
RE: SEMRush Ads Traffic Price VS their PDF report
Well, I checked SEMRush and they do in fact report $2K on ads, but it seems they have no historical data, so either they are just catching up with the data or Nest just started with their online ads. In any case, as the stats are just pouring in, I would wait a few more weeks before "trusting" what SEMRush reports. I'm also sure (like you) that after that acquisition they are spending much more than 11K a month on ads.
Hold on for a week or two and see if there are any changes.
-
RE: SEMRush Ads Traffic Price VS their PDF report
I guess you are talking about Nest?
-
RE: What do you think about this links? Toxic or don't? disavow?
Hey,
I tested LinkDetox myself while trying to get our own manual penalty removed. I paid for several credits, as every time I used them I received a rejection from Google after sending the reconsideration request (identifying the links, sending emails to have them removed, re-checking to see if they were still there, disavowing). All the links LinkDetox flagged were removed; still no removal of the penalty.
Then we took another approach: we downloaded our linking domains from GWT and, using a text editor (Dreamweaver in our case), added a "domain:" prefix in front of ALL the domains. Then we went over them one by one to identify the good ones and removed those from the list; we ended up disavowing about 80% of the domains. We uploaded the disavow file, sent a reconsideration request a few minutes later, and a week later the penalty was revoked.
Now that we finally have the penalty revoked, we can still go over the links in the disavow file one by one, and if we find that a link was in fact worth having, or isn't there anymore, we remove that domain from the list. We also check for subdomains: in our case we had about 1,000 links from wordpress.com blogs, spread across about 5 subdomains. In the first disavow (the one that worked) we disavowed the entire wordpress.com domain; then we went over all the blogs to see which ones were worth keeping, removed the root domain (wordpress.com), and instead added each subdomain (each blog) that we did want disavowed.
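For reference, a disavow file is just a plain text file with one entry per line; these entries are placeholders, not the actual domains we disavowed:
# low quality directories we couldn't get removed
domain:spammy-directory-example.com
domain:site-stats-example.net
# a single URL instead of a whole domain
http://exampleblog.wordpress.com/some-post-linking-to-us/
A "domain:" line disavows every link from that domain; a bare URL disavows just that page.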
Conclusion: nobody can know the real value of a link just by looking at a few metrics, the way LinkDetox does. You know the value, so go over the links yourself and disavow all those that you think have no value; for example, those stats or "website worth" links are useless, so go ahead and disavow them all.
Hope that helps!
-
RE: Best way to noindex long dynamic urls?
I wouldn't put a noindex meta tag on them; instead, I would consider using a canonical tag pointing to the page that lists all the villas.
Anyway, what programming language are you using?