00:33:27 arkiver: re your AB job on https://www.poemlife.com/, when is it shutting down? It needs to be run slowly due to rate limiting on the site and I'm wondering if I should add an ignore for links to individual posts - as it stands, it'd take somewhere between 2 weeks and a month to finish the current queue
02:16:18 is https://tracker.archiveteam.org/ down
02:17:47 nvm
07:36:29 Bzc6p edited Network.hu (+17, If you look closely, there's already a pretty…): https://wiki.archiveteam.org/?diff=50280&oldid=50227
07:42:30 Bzc6p edited Network.hu (+280, More clarification and update): https://wiki.archiveteam.org/?diff=50281&oldid=50280
11:06:17 Exorcism uploaded File:Lineblog-screenshot.png: https://wiki.archiveteam.org/?title=File%3ALineblog-screenshot.png
11:06:18 Exorcism edited LINE BLOG (+23): https://wiki.archiveteam.org/?diff=50283&oldid=50060
12:41:39 PaulWise edited Google Drive (+0, fix link): https://wiki.archiveteam.org/?diff=50284&oldid=50278
17:56:45 https://twitter.com/elonmusk/status/1683171310388535296?s=12
17:56:51 hold on to your buttholes
18:23:30 Yep, that's gonna end badly
18:56:34 IDK: do you have a web page example on which those weheartit images are embedded?
19:05:11 all links of ddg/google I could find just redirect to the main page
19:05:41 might just be gone?
19:09:53 This URL has been excluded from the Wayback Machine. lovely
21:45:44 considering how increasingly desperate youtube is for money... i fear a 144p archive project should be planned
21:47:20 what do you mean
21:49:34 hmm?
21:50:15 basically, yt is cutting off ad blockers, trying to make youtubers pay for their own advertising, in general signs of "site out of money" desperation
21:50:49 And you're going to pay for it? Cool. :-)
21:51:34 i feel like 144p content is the least at risk, ive noticed they seem to automatically load some videos in lower resolution more often, probably to save on bandwidth
21:52:06 I think the point is more that saving everything at 144p is more feasible than saving everything in HD, not that 144p is going away
21:52:27 this, good luck having enough storage to store everything at the highest quality
21:52:52 but 144p should be enough to know something was there, and confirm that an external archive is (probably) correct
21:52:56 that makes more sense
21:53:03 still would be huge, though
21:53:18 *HUGE
21:55:00 pay for their own advertising?
21:59:35 Even at 144p, you're still looking at a couple dozen petabytes probably.
22:02:00 how about 72p
22:02:06 :d
22:02:25 i suspect diminishing returns would strike
22:03:10 even 144p might be well beyond diminishing returns over a similarly low-bitrate 240p...
22:03:37 which means we will need dozens of petabytes of storage. not happening anytime soon
22:04:06 Correct, and by the time we can easily get that, it'll have grown by an order of magnitude.
22:05:17 about that... sure, storing videos at native quality WOULD probably have grown by that point, but i doubt the number of videos will grow exponentially
22:05:23 not anymore, at least
22:05:34 arkiver: moving from #archiveteam
22:05:39 just need some billionaires to throw some of the money that's clearly burning a hole in their pocket at archive.org
22:05:43 fireonlive: just the audio :D Metadata would be cool
22:06:21 true :3
22:06:27 I'm curious how large the audio would be. Certainly still more than SoundCloud at a couple petabytes.
22:06:37 metadata + maybe hashsums? at the most i guess
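
(The arithmetic behind the petabyte estimates above is a simple bitrate-times-hours calculation. Below is a minimal sketch in Python; both inputs are placeholder assumptions rather than measured figures, so the output only illustrates the order of magnitude.)

    # Back-of-envelope storage estimate for a hypothetical 144p YouTube grab.
    # Both inputs are placeholder assumptions, not measured values.
    CATALOGUE_HOURS = 1e9      # assumed total hours of video on YouTube
    BITRATE_BPS = 0.1e6        # assumed combined 144p video + audio bitrate, bits/s

    total_bytes = CATALOGUE_HOURS * 3600 * BITRATE_BPS / 8
    print(f"~{total_bytes / 1e15:.0f} PB")  # ~45 PB with these placeholder inputs

(With these made-up inputs the result lands in the "dozens of petabytes" range discussed above; the real figure depends entirely on the true catalogue size and per-stream bitrate.)
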
22:07:20 Audio bitrates aren't much lower than 144p video, so I guess the estimate from above is still about right.
22:07:28 but.. i don't think those are natively provided
22:07:43 so that'd be a looooot of data to chew through to provide a sha256 hash or something lol
22:08:16 (then it's like, per every format? which formats?)
22:08:40 And then YouTube reencodes the video and the hashes get fucked.
22:08:48 that too :D
22:09:06 VP10 gets released in 20XX, everything gets re-transcoded...
22:09:56 isn't av1 coming #soon(tm)
22:10:05 we can hope!
22:11:19 They have AV1, but only on some videos.
22:11:36 here's all the formats for that recent Reddit video https://bpa.st/QRCWZZPFV3VU7UPPNA3IW7X2SE
22:11:38 ahh ok
22:11:45 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS
22:11:59 :)
22:13:58 let's see what -vF shows for the oliveer video
22:13:58 https://bpa.st/L7JIX7NU2TRCGYC4VE7H5NNXWM
22:13:58 ah ok
22:13:58 the storyboard .mhtml is interesting
22:14:23 ?! ping timeout on my end i guess?
22:14:41 https://bpa.st/L7JIX7NU2TRCGYC4VE7H5NNXWM < the storyboard .mhtml is interesting lol, never downloaded it before
22:15:27 i see av01 in there though
22:44:44 I'm guessing even a DPoS project couldn't keep up with downloading everything being uploaded to YT just to calculate the hashes of the videos and throw them away, even if there were no ratelimits (and it'd definitely not be worth it). All metadata sounds a lot less impossible, and once we have it, it would be possible to do more targeted video grabs
22:44:45 such as (for example) archiving videos uploaded before a specific year, when YouTube and file sizes were a lot smaller.
23:14:14 qwertyasdfuiopghjkl it would but where the fuck would we put it all.
23:20:03 I assume the 1st year of YT at 144p might be possible, but I haven't done any calculations so my guess might be off by a huge amount.
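
(As a concrete illustration of the "metadata first, targeted low-res grabs later" idea above, here is a minimal sketch using the yt-dlp Python API. The channel URL is a placeholder, and the 144p format selector is just one example of how a targeted grab could be capped; this is a sketch, not a description of any existing project's tooling.)

    # Minimal sketch, assuming yt-dlp is installed (pip install yt-dlp).
    # The URL below is a placeholder, not a real target.
    import yt_dlp

    metadata_opts = {
        "skip_download": True,   # fetch metadata only, no media
        "writeinfojson": True,   # write one .info.json per video
        "ignoreerrors": True,    # don't abort the whole run on one dead video
    }

    with yt_dlp.YoutubeDL(metadata_opts) as ydl:
        ydl.download(["https://www.youtube.com/@example/videos"])  # placeholder URL

    # A later, targeted grab could cap quality at roughly 144p with a selector like:
    lowres_opts = {"format": "worstvideo[height<=144]+worstaudio/worst[height<=144]"}
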