User Agent - teracent-feed-processing
-
Does anyone know anything about the "teracent-feed-processing" user agent?
The IPs this user agent requests from include: 74.125.113.145, 74.125.113.148, 74.125.187.84 ....
In our logs, 2 out of 3 requests are made by it, and the load is crashing our server.
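The "2 out of 3 requests" figure is easy to verify from an access log. A minimal sketch, assuming combined-format log lines (the sample lines below are made up for illustration):

```python
import re
from collections import Counter

# In the combined log format, the user agent is the last quoted field on the line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def count_user_agents(log_lines):
    """Tally requests per user-agent string from combined-format log lines."""
    counts = Counter()
    for line in log_lines:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Hypothetical sample lines standing in for a real access log.
sample = [
    '74.125.113.145 - - [10/Oct/2012:13:55:36 +0000] "GET /feed HTTP/1.1" 200 2326 "-" "teracent-feed-processing"',
    '74.125.113.148 - - [10/Oct/2012:13:55:37 +0000] "GET /feed HTTP/1.1" 200 2326 "-" "teracent-feed-processing"',
    '203.0.113.7 - - [10/Oct/2012:13:55:38 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
counts = count_user_agents(sample)
```

On a real server you would feed it the open log file instead of the sample list.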
-
It seems that the sudden drop in indexed pages reported in WMT might be related to a reporting issue on Google's side - https://productforums.google.com/forum/#!topic/webmasters/qkvudy6VqnM;context-place=topicsearchin/webmasters/sitemap|sort:date
-
Since "teracent-feed-processing" didn't follow the rules in robots.txt, we had to hard-block it. If the server detects the user agent being "teracent-feed-processing", it drops the connection: _(104) Connection reset by peer_
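The poster doesn't say which web server they use; as one hedged example, this kind of hard block (closing the connection without a response, which clients see as a connection reset) can be done in nginx with its non-standard 444 status inside the relevant `server` block:

```nginx
# Drop the connection outright for this user agent.
# 444 is nginx-specific: it closes the connection without sending any response.
if ($http_user_agent ~* "teracent-feed-processing") {
    return 444;
}
```

Apache can achieve the same effect with mod_rewrite and the `F` flag, though that sends a 403 rather than resetting the connection.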
-
Well, it isn't Googlebot, and it isn't one I have come across before. Don't forget that any user agent can be spoofed very easily, so I wouldn't worry about blocking it.
**Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?**
I really don't think this is Google. The only crawler they have is Googlebot, and that's the user agent they tell you to add if you wish to block them.
Just a thought, can you share your robots.txt file just to make sure pages aren't being unintentionally blocked?
-Andy
-
It seems that the "teracent-feed-processing" user agent is somehow linked to Google. If you analyse the IPs, you'll notice they are Google-owned. Teracent was bought by Google in 2009.
BTW, we've already blocked it, but I'm trying to figure out what role this user agent plays. We've also noticed a drastic decline in the number of pages reported in Google Webmaster Tools (half of what we used to have). Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?
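One way to check the ownership claim is a reverse-DNS (PTR) lookup: Google-owned IPs typically resolve to hostnames under google.com, googlebot.com, or 1e100.net (a domain Google registered for its infrastructure). A minimal sketch, with the network-dependent lookup kept separate from the pure suffix check:

```python
import socket

# Hostname suffixes Google commonly uses for its own IP space (an assumption
# based on public observation, not an exhaustive or official list).
GOOGLE_SUFFIXES = (".google.com", ".googlebot.com", ".1e100.net")

def looks_google_owned(hostname: str) -> bool:
    """Heuristic: does this PTR hostname sit under a known Google domain?"""
    return hostname.endswith(GOOGLE_SUFFIXES)

def reverse_dns(ip: str) -> str:
    """Look up the PTR record for an IP (requires network access)."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:
        return ""
```

For a robust check you would also forward-resolve the returned hostname and confirm it maps back to the same IP, since PTR records alone can be spoofed.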
-
It sounds like your typical spammy crawler, so I would suggest just blocking it. Add the following to the top of your robots.txt file:
**User-agent: teracent-feed-processing**
**Disallow: /**
However, before you go live with this, use the Webmaster Tools robots.txt tester to make sure everything else still gets crawled.
-Andy
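Besides the Webmaster Tools tester, the rules can be sanity-checked locally with Python's standard-library robots.txt parser (example.com is a placeholder URL):

```python
from urllib import robotparser

# Parse the proposed block alongside a catch-all group that allows everyone else.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: teracent-feed-processing",
    "Disallow: /",
    "",
    "User-agent: *",
    "Disallow:",
])

# The targeted agent should be shut out; Googlebot should be unaffected.
teracent_allowed = rp.can_fetch("teracent-feed-processing", "https://example.com/page")
googlebot_allowed = rp.can_fetch("Googlebot", "https://example.com/page")
```

Note this only tells you what a *compliant* crawler would do; as reported above, this particular agent ignores robots.txt, which is why a server-level block is needed at all.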