id,summary,reporter,owner,description,type,status,priority,milestone,component,version,resolution,keywords,cc
247,"serialize logs with JSON, not pickle",Brian Warner,,"split out of #244

There are several places which might serialize log events:

 * via `LogFileObserver`, when `$FLOGFILE=` is set at process startup
 * in a locally-written Incident file, via `foolscap.logging.log.setLogDir()`
 * the log gatherer and incident gatherer
 * via `flogtool tail --save-to=`
 * in the output of `flogtool filter`

All of these currently use the stdlib `pickle` module to read and write serialized log events. The files (which may be BZ2-compressed, or uncompressed) contain concatenated pickle bytestrings, so they can be read by calling `pickle.load(f)` repeatedly until it throws an error.

This ticket is about replacing these pickles with something safer, namely JSON. Pickles are unsafe, and obviously I should never have used them in the first place.

My plan is to delete all the pickle-handling code and replace it with JSON-handling code (newline-separated JSON records, read with `f.readlines()` and `json.loads()`); see the sketch below. The new version won't `import pickle` at all, and the only nod to backwards compatibility will be a check that looks at the first few bytes of the logfile to see if it might be a pickle. If we see one, the tool will print an apology message which explains the unsafety and mentions that installing an old version of Foolscap is the only way to read these logfiles.
",task,closed,major,0.13.0,logging,0.9.1,fixed,,
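
A minimal sketch of the newline-separated JSON approach described in the ticket, not Foolscap's actual implementation. The helper names (`append_event`, `iter_events`, `looks_like_pickle`) are hypothetical, and the first-bytes heuristic is an assumption about how the pickle check might work rather than the real logic:

```python
# Sketch only: newline-delimited JSON log records, plus a first-bytes
# check that refuses old pickle-based logfiles with an apology.
import json

def append_event(f, event):
    """Write one log event as a single JSON line (hypothetical helper)."""
    f.write(json.dumps(event) + "\n")

def looks_like_pickle(first_bytes):
    """Heuristic guess, not Foolscap's real check: JSON records start with
    '{', while binary pickles start with the PROTO opcode b'\\x80' and old
    text pickles with opcodes like '(' or 'c'."""
    stripped = first_bytes.lstrip()
    return bool(stripped) and not stripped.startswith(b"{")

def iter_events(path):
    """Yield log events from a newline-separated JSON logfile."""
    with open(path, "rb") as f:
        head = f.read(64)
        if looks_like_pickle(head):
            raise ValueError(
                "this looks like an old pickle-based logfile; installing an "
                "older Foolscap release is the only way to read it")
        f.seek(0)
        for line in f.readlines():
            line = line.strip()
            if line:
                yield json.loads(line)
```

Since the ticket notes the files may be BZ2-compressed, a reader along these lines could open such files with `bz2.open(path, "rb")` instead of `open()` and process the decompressed lines the same way.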