User Agent - teracent-feed-processing
-
Does anyone know anything about the "teracent-feed-processing" user agent?
IPs from which this user agent's requests originate: 74.125.113.145, 74.125.113.148, 74.125.187.84 ....
In our logs, 2 out of 3 requests are made by it, and it is causing our server to crash.
-
It seems that the sudden drop in indexed pages reported in WMT might relate to a reporting issue on Google's side - https://productforums.google.com/forum/#!topic/webmasters/qkvudy6VqnM;context-place=topicsearchin/webmasters/sitemap|sort:date
-
Since "teracent-feed-processing" didn't follow the rules in robots.txt, we had to hard-block it. If the server detects the user agent being "teracent-feed-processing", it drops the connection: _(104) Connection reset by peer_
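For anyone wanting to do the same, a hard block like this can be set up at the web-server level. A minimal sketch for nginx (illustrative only - this assumes nginx and is not the poster's actual config; nginx's non-standard status 444 closes the socket without a response, which the client sees as "(104) Connection reset by peer"):

```nginx
# Inside the relevant server {} block:
# drop the connection for the offending user agent without replying.
if ($http_user_agent ~* "teracent-feed-processing") {
    return 444;
}
```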
-
Well, it isn't Googlebot and it isn't one I have come across before. Don't forget that any user agent can be spoofed very easily, so I wouldn't worry about blocking it.
**Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?**
I really don't think that this is Google. The only user agent they have is Googlebot, and they tell you that this is the one to add to robots.txt if you wish to block them.
Just a thought, can you share your robots.txt file just to make sure pages aren't being unintentionally blocked?
-Andy
-
It seems that the "teracent-feed-processing" user agent is somehow linked to Google. If you analyse the IPs, you'll notice that they are Google-owned. Teracent was acquired by Google in 2009.
btw - we've already blocked it, but I'm trying to figure out what role this user agent plays. We've also noticed a drastic decline in the number of pages reported in Google Webmaster Tools (half of what we used to have). Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?
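One way to check the Google connection is forward-confirmed reverse DNS, the method Google itself recommends for verifying its crawlers. A sketch in Python (the suffix list reflects Google's documented crawler hostnames; the function names are my own, not something from this thread):

```python
import socket

# Hostname suffixes Google documents for its crawler infrastructure.
GOOGLE_SUFFIXES = (".google.com", ".googlebot.com")

def is_google_hostname(hostname: str) -> bool:
    """True if a reverse-DNS hostname falls under Google's crawler domains."""
    return hostname.rstrip(".").lower().endswith(GOOGLE_SUFFIXES)

def verify_google_ip(ip: str) -> bool:
    """Forward-confirmed reverse DNS: reverse-resolve the IP, check the
    hostname suffix, then confirm the hostname resolves back to the same
    IP. Requires network access."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    return ip in forward_ips
```

e.g. `verify_google_ip("74.125.113.145")` - if it returns True, the requests really do come from Google-owned machines, whatever the user agent string claims.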
-
It sounds like your typical spammy crawler, so I would suggest just blocking it. Add the following to the top of your robots.txt file:

```
User-agent: teracent-feed-processing
Disallow: /
```

However, before you go live with this, use the Webmaster Tools robots.txt tester to make sure everything else still gets crawled. -Andy
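If it helps, those rules can also be sanity-checked locally with Python's standard-library `urllib.robotparser` before deploying (the URLs are hypothetical, just to illustrate the check):

```python
from urllib import robotparser

# Proposed robots.txt: block teracent-feed-processing, allow everyone else.
rules = """\
User-agent: teracent-feed-processing
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The blocked agent may not fetch anything...
print(rp.can_fetch("teracent-feed-processing", "https://example.com/page"))  # False
# ...while Googlebot is unaffected.
print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True
```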