Code Monitor Recommendations
-
Hi all,
I was wondering if you have any recommendations for a code monitor? We'd like to keep track of any code and content changes on a couple of websites. We've taken a look at Page Monitor (https://chrome.google.com/webstore/detail/page-monitor/pemhgklkefakciniebenbfclihhmmfcd?hl=en), but I'm not sure whether it tracks code changes. Any suggestions for free or paid tools would be appreciated.
Edit: We'd also like to avoid any tool that requires adding tracking code to the site, or anything that involves a database/FTP connection.
-
Not really, no. I only use it on a per-instance basis rather than as a constant change-monitoring tool. If you have a developer in-house, though, they could probably set up something that runs on a virtual machine and sends reports on changes. Or you could run it after getting a change report from one of the other, more standard page-change tools to see whether the site is now making different HTTP requests as well.
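For anyone wondering what that kind of in-house script might look like, here is a minimal sketch, assuming Python 3 with the `requests` library; the URL and state-file path are placeholders you'd swap for your own site.

```python
# Minimal sketch of a scheduled page-source monitor (e.g. run via cron on a VM).
# Assumes Python 3 with the `requests` library installed; URL is a placeholder.
import hashlib
import pathlib

import requests

URL = "https://example.com/"           # placeholder: the page to watch
STATE = pathlib.Path("last_hash.txt")  # stores the hash from the previous run


def check_for_changes() -> bool:
    """Fetch the raw HTML and report whether it differs from the last run."""
    html = requests.get(URL, timeout=30).text
    current = hashlib.sha256(html.encode("utf-8")).hexdigest()
    previous = STATE.read_text().strip() if STATE.exists() else None
    STATE.write_text(current)
    return previous is not None and previous != current


if __name__ == "__main__":
    if check_for_changes():
        # Hook an email or Slack notification in here for the change report.
        print(f"Source of {URL} changed since the last check.")
    else:
        print("No change detected.")
```

Scheduled hourly or daily, something like this catches raw-source changes without adding any tracking code to the site or needing a database/FTP connection.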
-
Hi Ryan,
Thanks for the great resources. I was just wondering if you had any tips on how best to set up Fiddler to monitor for code changes?
Thanks,
Holly
-
Some great suggestions above. https://www.codeguard.com/ is also worth considering; it's relatively cheap and effective.
-
I've used this one. It checks the code once a day, and there's no need to install anything.
-
For something code-based, you might have to set up an instance of Fiddler, as most page-change monitoring tools only look at the readable text rather than the code. Plus, server-side code changes that are executed before the page is displayed will be missed by either type. Here's some more that might suit your needs...
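To illustrate the text-versus-code distinction, here's a rough sketch, assuming Python 3 with `requests` and `beautifulsoup4`; the URL is a placeholder. It hashes the visible text and the raw source separately, so an edit confined to a script tag changes one hash but not the other.

```python
# Sketch of why text-only monitors miss code changes: hash the readable text
# and the raw source separately, then compare each between runs.
# Assumes Python 3 with `requests` and `beautifulsoup4`; URL is a placeholder.
import hashlib

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder: the page to watch

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")
for tag in soup(["script", "style"]):
    tag.decompose()  # drop non-visible code before extracting the readable text

text_hash = hashlib.sha256(soup.get_text(" ", strip=True).encode("utf-8")).hexdigest()
code_hash = hashlib.sha256(html.encode("utf-8")).hexdigest()

# A text-based page monitor effectively compares something like `text_hash`
# between runs, so a change confined to markup or scripts shifts `code_hash`
# while `text_hash` stays the same and the change goes unreported.
print("visible-text hash:", text_hash)
print("raw-source hash:  ", code_hash)
```

As noted above, neither hash will surface server-side code changes that execute before the page is delivered, so this only covers what actually reaches the browser.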