Don’t Aggregate, Consolidate!

In previous posts, I’ve emphasized that averages are particularly bad at characterizing most of the things you’re actually looking for. But the problem is broader: storing aggregated data of any kind can limit your ability to analyze that data later. Once raw samples have been rolled up into, say, hourly averages, the percentiles, outliers, and distributions are gone for good.
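As a quick, made-up illustration in R (toy numbers, not from any real system): once only the hourly mean has been stored, the tail of the distribution is unrecoverable.

    set.seed(42)
    # Toy data: per-request latencies for one hour (hypothetical)
    raw <- c(rexp(999, rate = 1/10), 2000)   # mostly fast, one awful request

    mean(raw)            # the one number an hourly-average rollup would keep
    quantile(raw, 0.99)  # easy to get from the raw data, lost after aggregation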

Bitrot, Part 2

This post links to a simple script I’ve used for over a decade to detect corrupted files. It detects and reports on files that have changed, been added or deleted, or possibly been moved within the same directory structure.
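The script itself is linked from the post, but the checksum-manifest idea behind it is simple enough to sketch. Roughly, in R (an illustrative sketch with hypothetical names and paths, not the actual script):

    library(tools)

    # Hash every file under a directory tree into a manifest:
    # a named vector mapping path -> MD5 hash.
    scan_tree <- function(root) {
      files <- list.files(root, recursive = TRUE, full.names = TRUE)
      md5sum(files)
    }

    # Compare an old manifest against a fresh scan and classify the differences.
    compare_manifests <- function(old, new) {
      added   <- setdiff(names(new), names(old))
      deleted <- setdiff(names(old), names(new))
      common  <- intersect(names(old), names(new))
      changed <- common[old[common] != new[common]]
      # The same hash reappearing under a new path was probably moved, not corrupted.
      moved   <- added[new[added] %in% old[deleted]]
      list(added = added, deleted = deleted, changed = changed, moved = moved)
    }

    # First run: save a manifest. Later runs: compare against it.
    # saveRDS(scan_tree("/data"), "manifest.rds")
    # report <- compare_manifests(readRDS("manifest.rds"), scan_tree("/data"))

Hashes alone can’t tell silent corruption from a legitimate edit; in practice you’d also want to look at modification times before crying bitrot.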

Bitrot, Part 1

Your systems have drives in RAID configurations, and besides, you have data copied to redundant systems and backups, right? Safe? Maybe not. I recently found corruption, undetected for years, in a quarter of a million files!

(Ab)use of the R Language

For years I’ve done most of my log scraping and analysis with the usual suspects: bash, sed, awk, even perl. The log scraping still uses those tools, but lately I’ve been toying with “R” for the analysis.
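For instance (hypothetical file and column names), getting a scraped log into R for a first look takes only a couple of lines:

    # Hypothetical input: a log already scraped down to two whitespace-
    # delimited columns, timestamp and response time in milliseconds.
    log <- read.table("latency.log", col.names = c("ts", "ms"))

    summary(log$ms)            # five-number summary beats a lone average
    hist(log$ms, breaks = 50)  # the shape of the distribution, at a glance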

Averages Mostly Suck at Almost Everything…

…unless you’re dealing with baseball. When it comes to systems, many of us treat “Average” as a measure of “Typical” or “Normal”. Many systems people also use averages to hunt for “Abnormal”. But the average (or mean) represents neither “normal” nor “abnormal” very well…
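A made-up example shows why:

    # Hypothetical response times: seven quick requests and one stuck one.
    latency_ms <- c(12, 11, 13, 12, 14, 11, 12, 900)

    mean(latency_ms)            # ~123 ms: matches no request that actually happened
    median(latency_ms)          # 12 ms: what a typical request really saw
    quantile(latency_ms, 0.95)  # the tail, where the real pain lives

The mean is dragged far above anything typical, yet still falls far short of the one genuinely abnormal value.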