So what's up with UpDowner.com?
-
I've noticed these guys in the link profiles of several sites I manage. They'll usually show up around 1,000-10,000 times in the backlink profile. From what I can tell, they index websites, build up keyword relationships, and then when you search for something on their site (e.g. poker) they present a list of related sites with stats about them. The stats seem to be yanked straight from Alexa. The backlinks come from the fact that every time 'your' site shows up in one of their search results, they embed a little iframe containing your site. So if your site's name/keywords are pretty broad, you could show up thousands or tens of thousands of times as being linked from the pages of theirs that Google indexes.
And Google indexes, boy do they ever. At their height, they had over 53 million pages indexed; that has apparently shrunk to around 25 million. I believe their strategy is to generate a crap-load of automated content in the hope of cashing in on obscure long tails.
So my questions for you guys are:
- Are you seeing them in your backlinks too?
- Should I block their spider/referrers?
- What is their deal man?
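On the blocking question: since the backlinks apparently come from your pages being loaded in an iframe, one option (independent of blocking their crawler) is to refuse framing at the HTTP level. A minimal Apache sketch, assuming mod_headers is enabled - this stops compliant browsers from rendering your site in their frames, though it won't remove pages they've already had indexed:

```apache
# Refuse to be embedded in third-party iframes (mod_headers required)
Header always set X-Frame-Options "SAMEORIGIN"
```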
-
http://www.seroundtable.com/google-updowner-ignored-15369.html
Looks like Google is on top of these guys so at least their links won't get you Penguinized.
-
I checked out their site. You can register for it and then get your site verified, after which they claim you can get your listing removed. I haven't tried that yet because I'm trying to find out more about them first. I'm curious why they just showed up in my Google Webmaster Tools listing, supposedly with backlinks from a bunch of my pages - but if you enter one of those pages into their lookup form, they don't find it.
Does anyone else have any experience with them?
-
Yes, I noticed this too. We got their IP address and blocked it through .htaccess.
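For anyone wanting to do the same, here's a sketch of what that .htaccess block might look like - the IP is a documentation placeholder and the user-agent substring is an assumption, not UpDowner's actual values, so check your own logs first:

```apache
# Block by IP (Apache 2.4 syntax; replace the placeholder address)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.42
</RequireAll>

# Belt and braces: also reject by user-agent substring (name assumed)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} updowner [NC]
RewriteRule .* - [F,L]
```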
Related Questions
-
What does Google's Spammy Structured Markup Penalty consist of?
Hey everybody,
I'm confused about the Spammy Structured Markup Penalty: "This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines." Does this mean the rich elements are simply removed from the snippets? Or will there be an actual drop in rankings? Can someone here tell from experience? Thanks for your help!
White Hat / Black Hat SEO | klaver
-
.com geotagging redirect to subdomains - will it affect SEO?
Hi guys, We have a .com domain and we've got geoIP on it, so the UK goes to .co.uk and the USA goes to .com/us. We're just migrating over to another platform, so we're thinking of keeping a "dummy" server just to do this geoIP pointing for us. Essentially .com will just point over to the right place and hold a specific .com/abc (which is generic for everyone worldwide).
Current scenario:
.com (Magento + geoIP)
.com/us (US Magento)
.co.uk (UK - geoIP redirect to Shopify)
.com/abc (sits on Magento server)
Wanted scenario:
.com - used for geoIP and a specific .com/abc (for all users)
.co.uk (UK) - Shopify eCom
.com/us -> migration to us.xx.com (USA) - Shopify eCom
I just wanted to know if this will affect our rankings on Google? Also, any advice as to the best practices here would be great. Thanks! Nitesh
White Hat / Black Hat SEO | Infruition
-
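For what it's worth, a sketch of how the "dummy" server could do the geoIP pointing on Apache - the module choice (mod_geoip), country code, and domain names here are all assumptions for illustration, and the 302 is deliberate so the geo redirect isn't treated as permanent:

```apache
# Hypothetical mod_geoip setup -- module and domains are assumptions
GeoIPEnable On
RewriteEngine On

# UK visitors hitting the bare .com go to the .co.uk storefront,
# except the shared /abc section, which stays the same worldwide
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^GB$
RewriteCond %{REQUEST_URI} !^/abc
RewriteRule ^(.*)$ https://example.co.uk/$1 [R=302,L]
```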
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance.
The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So: no solution to all three of my problems.
Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.
The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times do in fact hurt rankings, which is even worse than indexing latency. I'm curious about the experts' opinions...
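That load-based throttling can be sketched in a few lines - the bot list, thresholds, and linear ramp below are illustrative assumptions, not a drop-in implementation:

```python
import random

# Bots to throttle; the substrings are assumptions -- check your logs.
BOT_AGENTS = ("googlebot", "bingbot", "ahrefsbot")


def throttle_fraction(load_avg: float, cores: int = 4) -> float:
    """Fraction of bot requests to reject, rising linearly with load.

    Below 50% utilisation nothing is throttled; at or above 100%
    utilisation every bot request gets a 503.
    """
    utilisation = load_avg / cores
    if utilisation < 0.5:
        return 0.0
    return min(1.0, (utilisation - 0.5) / 0.5)


def respond(user_agent: str, load_avg: float, cores: int = 4) -> int:
    """Return the HTTP status to serve for this request.

    Human traffic always gets a 200; known bots get a 503 with
    probability throttle_fraction(), so throttling scales with load.
    """
    agent = user_agent.lower()
    if not any(bot in agent for bot in BOT_AGENTS):
        return 200  # user traffic is never throttled
    if random.random() < throttle_fraction(load_avg, cores):
        return 503  # tell the bot to come back later
    return 200
```

When you do serve the 503, it's worth sending a Retry-After header as well, so well-behaved crawlers know when to come back.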
White Hat / Black Hat SEO | internetwerkNU
-
Removing/redirecting bad URLs from main domain
Our users create content which we host at a separate URL for a web version. Originally this was hosted on our main domain. This was causing problems because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation. About a month ago, we added a robots.txt rule to block the URLs in that particular folder, so that Google doesn't crawl those pages and ignores them in the SERPs. We have now gone a step further and are redirecting (301 redirect) all those user-created URLs to a totally brand new domain (not affiliated with our brand or main domain). This should have been done from the beginning, but it wasn't. Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with the main domain? Or should we just give it the good ol' time recipe and let it fix itself?
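One wrinkle worth checking: a robots.txt Disallow stops Googlebot from fetching those URLs at all, which also stops it from ever seeing the new 301s, so the old URLs can linger in the index. A sketch of the two pieces, with /user-content/ and the target domain as placeholder names:

```apache
# robots.txt on the main domain -- remove the old block once the
# redirects are live, so Google can crawl them and drop the URLs:
#   Disallow: /user-content/   <- delete this line

# .htaccess -- 301 the whole folder to the new domain (names assumed)
RewriteEngine On
RewriteRule ^user-content/(.*)$ https://new-domain.example/$1 [R=301,L]
```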
White Hat / Black Hat SEO | redcappi
-
Why is this site performing so well in the SERPs and getting high traffic volume for no apparent reason?
The site is https://virtualaccountant.ie/
- It's a really small site
- They have only about 7 backlinks
- They don't blog
- They don't have a PPC campaign
- They don't stand out from the crowd in terms of product or services offered
So why are they succeeding in topping the SERPs for difficult-to-rank-for accounting keywords such as "accountant" and "online accounts"? What are they doing better than everyone else, or have they discovered a way to cheat Google, and worse still - ME!
White Hat / Black Hat SEO | PeterConnor
-
I'm worried my client is asking me to post duplicate content, am I just being paranoid?
Hi SEOMozzers,
I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more text content, so he has been sending me content that tourism organizations provide for free (they often hand out free one-pagers about their destination for media use).
My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web, and I'm worried Google could penalize us for posting content that already exists. I know there are conventionally ways around this - you can tell crawlers that certain content shouldn't be crawled - but in my case, we are specifically trying to produce crawlable content.
Do you think I should advise my client to hire some bloggers to produce original content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins
-
Why doesn't Google find different domains - same content?
I have been slowly working to remove near-duplicate content from my own website for different locales. Google seems to be doing nothing to combat the duplicate content of one of my competitors showing up all over southern California. For example:
Your Local #1 Rancho Bernardo Pest Control Experts | 858-352 ...
www.pestcontrolranchobernardo.com/
Pest Control Rancho Bernardo Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 858-352-7728.
Your Local #1 Oceanside Pest Control Experts | 760-486-2807 ...
www.pestcontrol-oceanside.info/
Pest Control Oceanside Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 760-486-2807.
The competitor is getting high page 1 listings for massively duplicated content across web domains. Will Google ever catch this black-hat workmanship? Meanwhile, he's sucking up my business. Do the competitor's results also speak to the possibility that Google does in fact rank based on the name of the URL - something that gets debated all the time? Thanks for your insights. Gerry
White Hat / Black Hat SEO | GerryWeitz
-
NYT article on JC Penney's black hat campaign
Saw this article on JC Penney receiving a 'manual adjustment' to drop their rankings by 50+ spots: http://www.nytimes.com/2011/02/13/business/13search.html Curious what you guys think they did wrong, and whether or not you are aware of their SEO firm, SearchDex? I mean, was it a simple case of low-quality spam links, or was there more to it? Has anyone studied them in Open Site Explorer?
White Hat / Black Hat SEO | scanlin