Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What Tools Should I Use To Investigate Damage to My Website?
-
I would like to know what tools I should use, and how, to investigate damage to my website, in2town.co.uk. I hired someone on a freelance platform to do some work on my website, but they damaged it. That person has since been removed from the platform because of all the complaints made about them. They also put backdoors on websites, including mine, and added content.
I also had a second problem where my content was being stolen.
My site always did well and had lots of keywords in the top five and top ten, but now they are not even in the top 200. This happened in January and February.
When I write unique articles, they are not showing in Google. I need to find out what the problem is and how to fix it. Can anyone please help?
-
Repairing website damage requires a structured approach. Start by assessing any issues caused by the freelancer, using a tool like Wordfence (for WordPress) to detect backdoors or malicious changes. It's also possible Google penalized the site for whatever work the freelancer did: you might need to disavow toxic links they built, for example, or address other issues they introduced. For the stolen content, a tool such as Google Alerts can help you spot duplicates of your articles so you can take action.
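If toxic links do turn out to be part of the problem, the disavow file you upload in Search Console is just a plain text list. A minimal illustrative example, with placeholder domains rather than anything from your actual link profile, looks roughly like this:

```text
# Lines starting with "#" are comments and are ignored
# Disavow every link from an entire domain
domain:spammy-directory-example.com
domain:link-network-example.net
# Or disavow an individual URL
http://another-example.com/paid-links-page.html
```

Only use this once you are confident the links are harmful, since disavowing good links can hurt rather than help.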
For future reference, regular monitoring helps you catch unauthorized changes early and stay informed about industry trends. If you monitor a web page for changes, you will be alerted as soon as an unexpected adjustment appears on your site. There are several ways to approach this, and several tools you can use; some will notify you whenever a page changes. Monitoring competitors' pages for content trends can also guide your strategy and reveal potential areas for improvement.
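To illustrate the basic idea behind those change-monitoring tools, here is a minimal sketch, assuming Python with the requests library installed; the URL and check interval are placeholders:

```python
import hashlib
import time

import requests

URL = "https://www.example.com/"   # placeholder: the page to watch
CHECK_INTERVAL = 3600              # placeholder: check once an hour

def page_fingerprint(url: str) -> str:
    """Fetch the page and return a SHA-256 hash of its body."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return hashlib.sha256(response.content).hexdigest()

def monitor(url: str) -> None:
    """Print a message whenever the fetched page differs from the last fetch."""
    last_hash = page_fingerprint(url)
    while True:
        time.sleep(CHECK_INTERVAL)
        current_hash = page_fingerprint(url)
        if current_hash != last_hash:
            print(f"Change detected on {url}")
            last_hash = current_hash

if __name__ == "__main__":
    monitor(URL)
```

Pages that embed rotating ads, timestamps, or nonces will change on every fetch, which is why dedicated monitoring services diff the meaningful content rather than the raw HTML.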
-
Use Google Search Console and Screaming Frog.
-
The best tool I have used to diagnose problems on my office interior design services website is Google Search Console (GSC). You can also use Screaming Frog or Moz to analyze and resolve issues.
-
Pagefreezer: instantly preserve web pages and social media profiles to capture evidence of website damage.
-
To investigate damage to your website, consider using Google Search Console to monitor search performance and detect issues, a security scanner such as Wordfence to check for malware or vulnerabilities, and an auditing tool such as SEMrush or Screaming Frog for a comprehensive analysis of technical SEO issues and overall site health.
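If you also have file-level access to the server, one quick supplementary check alongside a scanner like Wordfence is to list recently modified PHP files, since injected backdoors often show up as files changed outside your normal update schedule. A minimal sketch, assuming Python and that the hypothetical SITE_ROOT path points at your WordPress install:

```python
import os
import time

SITE_ROOT = "/var/www/html"   # placeholder: path to the WordPress install
DAYS_BACK = 30                # look at files modified in the last 30 days

cutoff = time.time() - DAYS_BACK * 24 * 60 * 60

for dirpath, _dirnames, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        if not name.endswith(".php"):
            continue
        path = os.path.join(dirpath, name)
        mtime = os.path.getmtime(path)
        if mtime >= cutoff:
            # Recently changed PHP files deserve a manual review
            print(time.strftime("%Y-%m-%d %H:%M", time.localtime(mtime)), path)
```

Attackers can fake file timestamps, so treat this as a first pass rather than proof that nothing was planted.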
Related Questions
-
Why are product pages throwing Missing field "image" and Missing field "price" in WordPress WooCommerce?
I have a WordPress WooCommerce website where I have uploaded hundreds of products, but it's giving me errors in GSC under the merchant listings tab. When I tested it, it shows missing field "image" and missing field "price". I have done everything according to https://developers.google.com/search/docs/appearance/structured-data/product#merchant-listing-experiences and applied fixes, i.e. images are 800x800 and the price range is also there. What else can be done here?
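For comparison, a minimal illustrative Product JSON-LD snippet that includes the image and offer price fields the merchant listing report checks for looks roughly like the sketch below; all values are placeholders, and in WooCommerce this markup is normally generated by the theme or an SEO plugin rather than written by hand:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": ["https://www.example.com/images/example-product-800x800.jpg"],
  "description": "Placeholder product description.",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/product/example-product"
  }
}
```

Comparing the JSON-LD your plugin actually outputs (visible in the page source or the Rich Results Test) against the fields Google reports as missing usually narrows down whether the data is absent or just placed outside the Product node.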
Technical SEO | Ravi_Rana
-
Is page speed important to improve SEO ranking?
I saw on an SEO agency's site (https://burstdgtl.com/search-engine-optimization/) that page speed apparently affects Google rankings. Is this true? And if it is, how do I improve it? Do I need an agency?
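Page speed is a confirmed, though relatively lightweight, Google ranking signal, and you can measure it yourself before deciding whether to hire anyone. A minimal sketch, assuming Python with requests installed and using the public PageSpeed Insights v5 endpoint (an API key is optional for occasional checks; the URL is a placeholder):

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0.0-1.0) for a page."""
    response = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": strategy},
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    print(performance_score("https://www.example.com/"))
```

The same report lists the specific opportunities (image compression, render-blocking scripts, caching), so you can judge whether the fixes are within reach of your own developer.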
On-Page Optimization | jasparcj
-
US and UK Websites of Same Business with Same Content
Hello Community, I need your help to understand whether I can use the US website's content on my UK website or not. US website's domain: https://www.fortresssecuritystore.com UK website's domain: https://www.fortresssecuritystore.co.uk Both websites have the same content on all pages, including testimonials/reviews. I am trying to gain business from AdWords and organic SEO marketing. Thanks.
Technical SEO | CommercePundit
-
Canonical homepage link uses trailing slash while default homepage uses no trailing slash, will this be an issue?
Hello, First off, let me explain that my client in this case uses BigCommerce, and I don't have access to the backend like in most other situations, so I have to rely on them to handle certain issues. I'm curious if there is much of a difference between using domain.com/ as the canonical URL while BigCommerce is currently redirecting our domain to domain.com. I've been using domain.com/ consistently for the last 6 months, and since we switched stores on Friday this issue has popped up and has me a bit worried that we'll somehow lose link juice or overall indexing, since this could confuse crawlers. Now, some say that the domain URL is fine with or without the trailing slash, as per https://moz.com/community/q/trailing-slash-and-rel-canonical, but I also wanted to see what you all felt about this. What say you?
Technical SEO | Deacyde
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain, and the paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can: (1) crawl and list all the indexed URLs on a domain, including .pdf and .doc files (ideally exported to a .xls or .txt file), and (2) crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them). Seems pretty simple, but we haven't been able to find something that isn't tailored toward managing a single domain or that can crawl a huge volume of content.
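If nothing off the shelf fits, a small custom crawler is also an option. A minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed: it follows internal links from a placeholder start URL, records every URL it sees (including linked .pdf and .doc files, which it does not parse), and writes the list to a text file. A production version would also need robots.txt handling, rate limiting, and better error handling:

```python
from collections import deque
from urllib.parse import urldefrag, urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # placeholder: site to crawl
OUTPUT_FILE = "urls.txt"

def crawl(start_url: str) -> set:
    """Breadth-first crawl of internal links, returning every URL discovered."""
    domain = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=30)
        except requests.RequestException:
            continue
        if "text/html" not in response.headers.get("Content-Type", ""):
            continue  # non-HTML URLs (PDF, DOC) are recorded but not parsed
        soup = BeautifulSoup(response.text, "html.parser")
        for link in soup.find_all("a", href=True):
            absolute = urldefrag(urljoin(url, link["href"]))[0]
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    with open(OUTPUT_FILE, "w") as handle:
        handle.write("\n".join(sorted(crawl(START_URL))))
```

Note that a crawler of this kind lists linkable URLs, not necessarily what Google has indexed; for indexed URLs, Search Console's page indexing report is the more direct source.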
Technical SEO | timfrick
-
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters tool, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool:
https://www.mydomain.com/blah.html?search_garbage_url_addition
On the confirmation page, the URL actually shows as:
http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition
I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look? AND PART 2 OF MY QUESTION: If you see the search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request?
www.domain.com/url.html?xsearch_...
A description for this result is not available because of this site's robots.txt – learn more.
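Since the question mentions robots.txt alongside the URL Parameters tool, it is worth noting that robots.txt blocks crawling rather than indexing, which is exactly why the "description is not available" snippet appears for those results. For reference, a pattern-based rule for blocking parameterised URLs (Google honours the * wildcard in robots.txt) looks roughly like this sketch, where the parameter name is a placeholder rather than the real one:

```text
User-agent: *
# Placeholder parameter name; blocks crawling of any URL whose query string contains it
Disallow: /*?*unwanted_search_param
```

Keeping such URLs blocked only hides their descriptions; removal from the index still depends on the removal tool or on a crawlable noindex directive.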
Technical SEO | sparrowdog
-
Why is my website banned?
My website is Costume Machine at www.costumemachine.com. My site has been banned for 1 year now. I have requested that Google reconsider my site 3 times without luck. The site is dynamic and basically pulls in feeds from affiliate sites. We have added over 1,500 pages of original content. The site had been running great since 2008 without any penalties. I don't think I got hit with any linking penalty; I cleaned up all questionable links last November when the penalty hit. Am I being hit with a "thin" site penalty? If that is the issue, what is the best way to fix the problem?
Technical SEO | tadden