A demonstration and explanation of how we manage and monitor over fifty Drupal websites used throughout the College of Liberal Arts at Georgia Tech. I'll show how we use automated scanning to pick up important details about each site (Drupal version, modules and their versions, users with administrative access, etc.) and store that information in a central database alongside manually collected details on all of our websites (nearly 180 in total). I'll share some of the many reports we're able to run about our sites and how we use those reports to help with update/patch management. I'll also show how we do the same kind of automated scanning for WordPress sites and for the operating system of each of our web server VMs.
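The scan-and-store pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the actual system: the JSON shape is an assumed example of what a scanner (e.g. one wrapping `drush`) might emit, the table schema and the site name are hypothetical, and SQLite stands in for whatever central database is used.

```python
import json
import sqlite3

# Hypothetical scanner output for one site; the field names here are
# illustrative assumptions, not an exact drush or scanner schema.
scan_result = json.loads("""
{
  "site": "example.gatech.edu",
  "drupal_version": "7.59",
  "modules": {"views": "7.x-3.20", "ctools": "7.x-1.14"}
}
""")

# Central inventory database (SQLite in-memory for this sketch).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE site_modules (
    site TEXT, drupal_version TEXT, module TEXT, module_version TEXT)""")

# Store one row per (site, module) so reports can slice either way.
for module, version in scan_result["modules"].items():
    db.execute(
        "INSERT INTO site_modules VALUES (?, ?, ?, ?)",
        (scan_result["site"], scan_result["drupal_version"], module, version),
    )

# Example report for patch management: which sites run a given module,
# and at what version.
rows = db.execute(
    "SELECT site, module_version FROM site_modules WHERE module = ?",
    ("views",),
).fetchall()
print(rows)  # → [('example.gatech.edu', '7.x-3.20')]
```

With real data collected nightly from every site, the same kind of query answers questions like "which sites still run an outdated module release" across the whole fleet.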
This is less a how-to session than a "here's what we're doing - we hope it inspires you to do something similar" session, since all of this infrastructure is custom-built. That said, I will offer ideas on how the same infrastructure could be built on top of Drupal 8 today to create a Drupal-powered site monitoring and management system.