Site is too slow? Seeing new code.
-
My site is too slow, and I am seeing this new code more than 1,000 times on my home page.
What should I do now?
My site- http://a1stlucie.com/
-
Hi Beachflower,
Did you ever get a resolution to this? I'm curious to see what the outcome and solution were. If this is a malicious attack then you'll need to consult someone who specializes in net sec, but I've dealt with a few different kinds of attacks before, so I can make a couple of recommendations.
1. Change all of your logins. Make them unique and difficult for a bot to guess. Then configure the site to lock accounts after five incorrect guesses. This blunts brute-force attacks.
2. Add a honeypot to your login forms. A honeypot is a hidden field that bots will try to fill out on a form. Users can't see it, so they never fill it out. If it gets filled out, the program knows it's a bot and invalidates the login attempt.
3. Use Screaming Frog to find every URL where the js was maliciously inserted and create a "cleanup" list. A developer should be able to write a simple "find and replace" program that just deletes it.
4. Consider migrating to HTTPS if you haven't already. This can prevent man-in-the-middle (MITM) attacks on your site, and it also confers several SEO benefits: improved user experience, a slight ranking boost, and faster site speed via HTTP/2.
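The honeypot check from step 2 only takes a few lines server-side. Here's a minimal sketch in Python; the field name "website" and the form dicts are hypothetical, and your framework's form-handling will differ:

```python
def is_bot_submission(form_data, honeypot_field="website"):
    """Return True if the hidden honeypot field was filled in.

    Real users never see the hidden field, so any non-empty value
    indicates an automated (bot) submission.
    """
    return bool(form_data.get(honeypot_field, "").strip())


# A human leaves the hidden field empty; a bot auto-fills every input.
human = {"username": "beachflower", "password": "s3cret!", "website": ""}
bot = {"username": "admin", "password": "admin", "website": "http://spam.example"}

print(is_bot_submission(human))  # False -> allow the login attempt
print(is_bot_submission(bot))    # True  -> silently reject it
```

The key detail is to reject silently (e.g. return a generic error) rather than tell the bot why it failed.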
These are just a few first steps to take and a Net Sec professional will have much more to add. Hope that helps!
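For step 3, the "find and replace" cleanup program can be as simple as the sketch below (Python). The bit.ly pattern is just a stand-in for a shortener domain; swap the regex for the exact tag Screaming Frog reports on your pages, and back up the files before running anything like this:

```python
import re
from pathlib import Path

# Hypothetical pattern: matches an injected <script src="..."> tag that
# points at a link shortener. Replace with the exact tag found on your site.
INJECTED = re.compile(
    r'<script[^>]+src="https?://[^"]*bit\.ly[^"]*"[^>]*>\s*</script>',
    re.IGNORECASE,
)


def clean_file(path):
    """Strip injected script tags from one file; return how many were removed."""
    html = path.read_text(encoding="utf-8", errors="ignore")
    cleaned, count = INJECTED.subn("", html)
    if count:
        path.write_text(cleaned, encoding="utf-8")
    return count


def clean_site(root):
    """Walk every .html file under root and return the total tags removed."""
    return sum(clean_file(p) for p in Path(root).rglob("*.html"))
```

If the injection lives in the database (common with hacked CMS installs) rather than in static files, you'd run the equivalent replace against the relevant tables instead.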
-
Yes, I think so. They are using it to slow the site down, and it's working.
-
Well, it could be a few things: either it's always been there and you never noticed it before; your company/client added it, but in a way that mistakenly duplicated it over 1,000 times; or the site was hacked and the js was inserted by a malicious actor. It looks like they're using a link shortener in the src attribute of that js, which makes me nervous.
-
I don't think so. But why over 1000 times?
-
Just to verify - this is js that your company/client did not insert themselves?