"Billy G. (go-while)" <no-reply@no.spam> posted:
Cool project idea, i already did the same.
Here is everything you can get from archive.org and probably everything
you can get from the biggest paid providers....
Impressive, some content goes back to 1983, before the "Great Renaming",
but checking comp.lang.tcl also shows a new message posted today.
There are nearly half a million groups listed, but many appear to be bogus with names which are typos and no or minimal content.
Could I add your server to this list?
On 6/13/25 1:36 PM, Billy G. (go-while) wrote:
> Cool project idea, I already did the same.
Can you provide a link to your archives, or are they only on your news server? What newsgroup list did you use for gathering groups? I used the group list from isc.org (https://ftp.isc.org/pub/usenet/CONFIG/newsgroups), supplemented with Eternal September's list.
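The merge itself is just a de-duplicated union of the group names; a minimal sketch in Python (the first whitespace-separated token on each line is the group name; I won't guess Eternal September's URL, so that source is left as a placeholder):

import urllib.request

SOURCES = [
    "https://ftp.isc.org/pub/usenet/CONFIG/newsgroups",
    # add a copy of Eternal September's list here (URL or local file)
]

groups = set()
for url in SOURCES:
    with urllib.request.urlopen(url) as resp:
        for line in resp.read().decode("utf-8", "replace").splitlines():
            fields = line.split()
            if fields:
                groups.add(fields[0])  # first token is the group name

with open("newsgroups.merged", "w") as out:
    out.write("\n".join(sorted(groups)) + "\n")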
Also, how did you get your archives? I developed a script to do this for me because I couldn't find a reliable way to do it otherwise. I can also download groups from a specific time frame, which I hope to use every year to archive groups year-by-year. Are you doing something like that too?
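The year-by-year selection amounts to filtering the overview by the Date header. A sketch of the idea, not my actual script (assumes the stdlib nntplib, which ships through Python 3.12, and a server with OVER support; the host is a placeholder):

from nntplib import NNTP
from email.utils import parsedate_to_datetime

def ids_for_year(server, group, year):
    resp, count, first, last, name = server.group(group)
    resp, overviews = server.over((first, last))  # requires OVER support
    wanted = []
    for artnum, over in overviews:
        try:
            dt = parsedate_to_datetime(over.get("date", ""))
        except (TypeError, ValueError):
            continue  # unparseable Date header; skip it
        if dt.year == year:
            wanted.append(over["message-id"])
    return wanted

with NNTP("news.example.org") as s:  # placeholder host
    ids = ids_for_year(s, "comp.lang.tcl", 2024)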
Jason
Cool project idea, I already did the same.
Here is everything you can get from archive.org and probably everything
you can get from the biggest paid providers....
10 TB of text, mostly unfiltered; maybe some Google Groups spam is missing.
The archive is live and connected via peering, so there is nothing else to do: it archives on its own.
The server is written by me and still lacks some commands.
Text Usenet Archive
Host: lux-feed1.newsdeef.eu
Port: 119 (plaintext) or 563 (SSL)
User: usenet
Pass: archive
Please don't hit it too hard, but connections are limited anyway.
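A minimal reader session against those details might look like this (stdlib nntplib, available through Python 3.12; since the server lacks some commands, expect some extensions to fail):

from nntplib import NNTP_SSL

with NNTP_SSL("lux-feed1.newsdeef.eu", 563, user="usenet", password="archive") as s:
    resp, count, first, last, name = s.group("comp.lang.tcl")
    print(f"{name}: {count} articles ({first}-{last})")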
You can get me on Discord: https://discord.gg/rECSbHHFzp
If anybody can take a full copy: I'm happy to share!!!
> If anybody can take a full copy: I'm happy to share!!!
Hey Billy, I've been meaning to reach out to you. Mind contacting me via e-mail?

I'd like to know whether there is a more efficient way than suck/pullnews to obtain the archive. I had been putting together an archive at news.blueworldhosting.com, but it has a number of holes, and I never got around to seriously importing the mbox files from archive.org.

I know you're in the early development stages, but if you'd like someone to test pushing/streaming articles via NNTP, I'm interested. I have a lot of bandwidth and performant hardware, always a good test case for NNTP streaming.
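When I get to the mbox files, the import would be roughly this sketch (the filename and credentials are placeholders, IHAVE generally needs feed access, and nntplib ships in the stdlib only through Python 3.12):

import mailbox
from nntplib import NNTP, NNTPError

with NNTP("news.blueworldhosting.com") as s:  # feed credentials omitted
    for msg in mailbox.mbox("comp.lang.tcl.mbox"):  # one archive.org dump
        msgid = msg.get("Message-ID")
        if not msgid:
            continue
        try:
            s.ihave(msgid, msg.as_bytes())
        except NNTPError:
            pass  # 435/437: server has it already or rejected it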
On 30.08.25 05:11, Jesse Rehmer wrote:
>> If anybody can take a full copy: I'm happy to share!!!
>
> Hey Billy, I've been meaning to reach out to you. Mind contacting me via
> e-mail?
>
> I'd like to know if there is a more efficient way than using suck/pullnews to
> obtain the archive? I had been putting together an archive at
> news.blueworldhosting.com, but have a number of holes and never got around to
> seriously importing the mbox files from archive.org.
>
> I know you're in early development stages, but if you'd like someone to test
> pushing/streaming articles via NNTP I'm interested. I have a lot of bandwidth
> and performant hardware, always a good test case for NNTP streaming.
Hi!

Using suck is the worst way to download from the newsdeef archive: the overview is not a database but a flat file with offset indexes only every 100 articles, so downloading by article number is slow.
Articles are stored under the SHA-256 hash of their Message-ID.
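If that is literally a SHA-256 digest of the Message-ID, the storage key would be computed like this (the exact encoding and on-disk layout are my guess):

import hashlib
key = hashlib.sha256(b"<example@msgid>").hexdigest()  # e.g. used as the filename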
The best way is to request '(X)HDR message-id' in a group first and then fetch the articles by Message-ID; that gives maximum performance.
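Spelled out, that recipe looks roughly like this sketch (stdlib nntplib through Python 3.12; the output layout is made up):

import os
from nntplib import NNTP_SSL

os.makedirs("out", exist_ok=True)
with NNTP_SSL("lux-feed1.newsdeef.eu", 563, user="usenet", password="archive") as s:
    resp, count, first, last, name = s.group("comp.lang.tcl")
    resp, pairs = s.xhdr("message-id", f"{first}-{last}")  # [(artnum, msgid), ...]
    for artnum, msgid in pairs:
        resp, info = s.article(msgid)  # fetch by Message-ID, not article number
        with open(os.path.join("out", f"{artnum}.eml"), "wb") as f:
            f.write(b"\r\n".join(info.lines) + b"\r\n")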
I have a tool that sends many groups concurrently to an NNTP server via IHAVE.
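The shape of such a tool, if you want to roll your own, is one connection per worker. This is not Billy's code, just a sketch; articles_for() is a hypothetical local spool reader:

from concurrent.futures import ThreadPoolExecutor
from nntplib import NNTP, NNTPError

def feed_group(group):
    # articles_for() is hypothetical: yields (message_id, raw_article_bytes)
    with NNTP("peer.example.org") as s:  # placeholder peer
        for msgid, raw in articles_for(group):
            try:
                s.ihave(msgid, raw)
            except NNTPError:
                pass  # peer already has the article or rejected it

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(feed_group, ["comp.lang.tcl", "comp.lang.c"]))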