Things were humming along nicely, and then everything slowed to a crawl. I checked, and I wasn't running out of memory (with 64 gigabytes in the server, I hadn't thought that was likely) and there wasn't a disk problem.
There's a handy utility called "iftop" that tells you where your bandwidth is going, and investigation revealed that one user was hogging about half the bandwidth. So, I thought, let's throttle him.
My first thought was to do it at the firewall, but web searches revealed that this wasn't going to be easy, and possibly not even possible. Then I thought about Apache's mod_bandwidth, but that doesn't work on a per-user basis. Which left only one possibility - doing it at the Linux level.
More googling turned up this. I tried it out and it worked. So that became the basis for my solution.
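The recipe I found isn't reproduced here, but the usual way to throttle a single IP at the Linux level is with tc. A minimal sketch, assuming an HTB qdisc on the outgoing interface; the device name, rate, and address are all placeholders, and this needs root:

```shell
#!/bin/sh
# Cap traffic sent to one IP at ~2mbit on eth0 (placeholder values).
DEV=eth0
BADIP=192.0.2.10        # example address, RFC 5737 documentation range

# Root HTB qdisc with one rate-limited class
tc qdisc add dev "$DEV" root handle 1: htb
tc class add dev "$DEV" parent 1: classid 1:10 htb rate 2mbit ceil 2mbit

# Steer packets destined for the offending IP into that class
tc filter add dev "$DEV" parent 1: protocol ip prio 1 \
   u32 match ip dst "$BADIP"/32 flowid 1:10
```

Matching on `ip dst` shapes what the server sends to that user, which is where the bandwidth was going.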
My first attempt was to use iftop to monitor the situation, and cut and paste the offending IP into a bash script. OK, that put a temporary plaster on the situation, but I needed something ongoing and automated. I couldn't see a way to make iftop output a single snapshot to a file, so I searched the web for a tool that might do it. I found a couple of dozen things that might have been the answer ... but weren't.
Then I thought - OK, I have to write my own. And that was surprisingly easy. I used tcpdump to capture 10000 packets to a file, then a small Perl script to read that file, accumulate the packet lengths for each IP, and print out the addresses of any IPs that went over 5 megabits. I tested that, and it worked.
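The Perl script itself isn't shown, so here is a rough shell-and-awk sketch of the same idea. It assumes tcpdump's default text output, where TCP lines end in "length <bytes>", and treats "5 megabits" as a plain byte threshold over the capture; all names and numbers are placeholders:

```shell
#!/bin/sh
# Sum per-source-IP bytes from tcpdump text output and print heavy talkers.
THRESHOLD_BYTES=625000   # 5 megabits = 625,000 bytes (assumed interpretation)

account() {
  awk -v limit="$THRESHOLD_BYTES" '
    $2 == "IP" && /length [0-9]+$/ {
      ip = $3
      sub(/\.[0-9]+$/, "", ip)          # strip the port from host.port
      for (i = 1; i <= NF; i++)
        if ($i == "length") bytes[ip] += $(i + 1)
    }
    END {
      for (ip in bytes)
        if (bytes[ip] > limit) print ip
    }'
}

# Capture a burst, then report the heavy sources:
#   tcpdump -nn -c 10000 -w capture.pcap
#   tcpdump -nn -r capture.pcap | account
```

Reading the capture back through tcpdump's own text decoder avoids having to parse the pcap format by hand.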
So then I blended the shell script with my Perl program, tested it, and put it in a cron job to run every five minutes.
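The five-minute schedule is a one-line crontab entry; the script path here is a placeholder for the blended script:

```shell
# crontab -e: run the monitor-and-throttle script every five minutes
*/5 * * * * /usr/local/sbin/throttle-hogs.sh
```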