02:12:27 PaulWise edited Mailman2 (+28, started some jobs for syslinux lists): https://wiki.archiveteam.org/?diff=50882&oldid=50773
04:23:38 -+rss- Luiz André Barroso has died: https://www.wired.com/story/google-mourns-luiz-andre-barroso-veteran-engineer-invented-the-modern-data-center/ https://news.ycombinator.com/item?id=37606775
05:42:55 TIL that fully.com closed their online store https://fully.com/
06:36:18 twitter circles to be kill: https://x.com/support/status/1704996222619009211
06:36:18 nitter: https://nitter.x86-64-unknown-linux-gnu.zip/support/status/1704996222619009211
06:36:28 ⭕️🔪
10:03:59 JustAnotherArchivist edited PIXNET (+242, Add support confirmation that the deletion…): https://wiki.archiveteam.org/?diff=50883&oldid=50689
14:58:36 pabs: yeah, i thought about here, but it's a bit too late now
15:25:23 VoynichCr edited WikiIndex (+17): https://wiki.archiveteam.org/?diff=50884&oldid=42140
16:31:10 hi, why is the staging server not included in the warrior? i noticed some of my warriors were getting no work done because they had to wait to upload their stuff. would it make sense to get this (https://github.com/Fusl/ateam-airsync) running as well? Are there any special requirements to meet?
17:34:37 staging server?
17:47:53 Upload to your own worker...?
17:48:47 The targets can't be distributed like workers can.
17:49:02 Their entire purpose is to aggregate data from workers.
18:00:08 that makes perfect sense, thanks!
18:40:07 I didn't even know twitter had circles, I thought that was a google+ thing
19:29:10 yeee, it was like facebook groups: private (though they had bugs), invite-only or open-to-join Twitter timelines where your tweets stayed
19:29:19 like little private twitters, kinda
19:29:50 bugs included… tweets leaking outside circles
19:29:56 @_@
22:23:57 So the good news about FOIAonline is that there don't seem to be bad rate limits (~8 req/s is fine, didn't try more), and the response times are decent even from across the pond.
22:24:01 But ugh, this site...
22:25:09 I'm listing all requests now. No idea yet how long that'll take.
23:38:28 Ok, it already finished and found about 1.2 million requests.
23:40:18 Many requests seem to not have any files, and the ones that do are in the megabytes range, so I'd expect maybe a couple terabytes for everything. Will try to get a better estimate, though.
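
[editor's note] A toy sketch of the worker/target split discussed above (16:31–18:00). This is not ArchiveTeam code; the names and the queue-based "target" are illustrative stand-ins. The point it demonstrates is the one made at 17:48–17:49: workers can fan out freely, but every finished item funnels into a single aggregation point, which is why targets cannot be scaled out the way workers can.

    # Toy illustration of many distributed workers feeding one central target.
    import queue
    import threading

    target = queue.Queue()  # stands in for the central staging/target server

    def worker(worker_id, items):
        # Each worker grabs its items independently and in parallel...
        for item in items:
            data = f"warc-for-{item}"      # pretend this is the grabbed payload
            target.put((worker_id, data))  # ...but every upload hits the same target

    threads = [
        threading.Thread(target=worker, args=(i, range(i * 3, i * 3 + 3)))
        for i in range(4)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # The target's job is aggregation: collect small uploads into big batches.
    batch = []
    while not target.empty():
        batch.append(target.get())
    print(f"target aggregated {len(batch)} uploads from 4 workers")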
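
[editor's note] A minimal sketch of the kind of polite enumeration crawl described in the 22:23–23:40 messages. Only the ~8 req/s pacing comes from the log; the endpoint URL, paging parameter, and JSON shape are hypothetical placeholders, since the actual FOIAonline API is not shown in the chat.

    # Pace requests to roughly RATE per second while paging through a listing.
    import time
    import requests

    RATE = 8          # req/s the site was observed to tolerate (per the log)
    PERIOD = 1 / RATE

    def list_requests(base_url="https://example.invalid/api/requests"):
        """Yield request records page by page at ~RATE requests per second."""
        session = requests.Session()
        page = 0
        while True:
            start = time.monotonic()
            resp = session.get(base_url, params={"page": page}, timeout=30)
            resp.raise_for_status()
            items = resp.json().get("items", [])  # hypothetical response shape
            if not items:
                break
            yield from items
            page += 1
            # Sleep off whatever remains of this request's 1/RATE time slot.
            elapsed = time.monotonic() - start
            if elapsed < PERIOD:
                time.sleep(PERIOD - elapsed)

    if __name__ == "__main__":
        total = sum(1 for _ in list_requests())
        print(f"listed {total} requests")

For scale, the back-of-envelope at 23:40 checks out: 1.2 million requests with attachments averaging a few megabytes each (and many with none) lands in the low single-digit terabytes, e.g. 1.2e6 × 2 MB ≈ 2.4 TB.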