AnimeSuki Forum - Tech Support - Usenet Downloading Guide / Fanzub

GHDpro 2009-05-05 04:50

Usenet Downloading Guide / Fanzub
Usenet Downloading Guide

With reports that BayTSP is active again, sending out copyright infringement notices for certain unlicensed anime, there are a few options if you got caught or don't ever want to take the chance of getting caught.

One of the options is to simply stop downloading anime. For those who don't find that an option, there is also IRC. But before the advent of BitTorrent, Usenet was a popular place to get fansubs too, at least for me.

What's Usenet?
Usenet is a distributed internet discussion network. It's like a forum, except that every "post" to every "thread" is automatically distributed among all servers in the Usenet network. Usenet is one of the oldest internet protocols and is approximately 30 years old.

Few people probably use Usenet for (text) discussions anymore - or if they do, they're using Google Groups rather than a real Usenet client. Like email, however, it's possible to add binary attachments to messages and use it to distribute files.

Advantages & Disadvantages
Nothing comes without its ups and downs. First, here are some of the disadvantages of Usenet:
  • It's not free
    Or more precisely, good quality Usenet servers aren't free. If you are lucky your ISP might provide a Usenet server for you to use, but the retention (how long files stay on the server) and completion (whether you'll be able to download the files completely) is likely to be crap.

    Fortunately Usenet providers usually aren't terribly expensive. And some (like my favorite, Astraweb) offer "Pay per Download" accounts, which means no monthly subscription. Some also allow PayPal payments, which is useful if you don't have a credit card. See this site for more Usenet providers:
  • Usenet wasn't designed for binary files
    It was designed for text discussions. But people found out you could encode binaries and post them as Usenet messages. However, there are a few problems:
    • Each message can be only a few KB in size
    • As a result, each file is split up into hundreds, if not thousands, of Usenet messages
    • To improve completion, files are usually split (using file splitters or RAR)
    • Finding all messages and parts of every file can be tricky and/or time consuming
    • If a single message is missing, you might be screwed, unless somebody reposts it
    But don't let that discourage you. Innovations like NZB files (more info below) and PAR files (again more info below) resolve a lot of these issues.

  • Usenet sucks for fansubs these days
    You can find lots of anime on Usenet - except most aren't fansubs (if you know what I mean) and if they are fansubs, there is little choice between groups and/or they're posted days or weeks late.

    This is not a problem I have an easy fix for, except for people to get more active in posting fansubs to Usenet. Fortunately posting files is pretty easy, but I'll leave that for a future guide.
And here are the advantages:
  • It's not free
    Why is this an advantage you may ask? Well, because you're paying money specifically for the privilege of getting Usenet access, the service provided by such Usenet providers tends to be really good:
    • Very fast servers for fast downloads
    • Multiple connections allowed to download even faster
    • File completion is excellent to perfect
    • Retention is insane - some providers strive to offer a year of retention
    • If you're paranoid, some providers offer SSL connections

  • Private & Secure
    The only party that knows what you're downloading (especially if you're using SSL connections) is your Usenet provider. And guess what? Quite a few of them don't keep logs! Also, if copyright enforcers ever go after Usenet, they're likely to go after the providers only, so as a downloader you're very likely to remain safe.

  • Leech all you want
    Unlike BitTorrent, you don't need to share anything back to the cloud, because there is no cloud.

  • Usenet isn't just for anime
    You can find, er, lots of "other" content too. I'll leave that up to your imagination.

NZB files
An average list of files posted to an anime related newsgroup on Usenet may look like this:

Needless to say, it's a real pain to sift through this. Fortunately a site called Newzbin (which is now invitation only) created a file format in which the message ID (location) of every part of every file can be described.

Using such NZB files, a Usenet client can then easily start downloading all parts. Simply put, a NZB file is like a torrent file in that it makes finding files on Usenet a lot easier:

Despite the original Newzbin site being effectively closed, NZB is fortunately an open format that is offered by many other sites, such as:
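As an aside for the curious: an NZB is just a small XML document listing, for each file, the newsgroups it was posted to and the message ID of every segment. Here is a minimal sketch of reading one with Python's standard library - the NZB content below is made up purely for illustration:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.newzbin.com/DTD/2003/nzb}"

# A minimal, made-up NZB document; real ones come from indexing sites.
SAMPLE_NZB = """<?xml version="1.0" encoding="UTF-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="someone@example.com" date="1241500000"
        subject="[Group] Show - 01 [ABCD1234].mkv (1/3)">
    <groups><group>alt.binaries.multimedia.anime</group></groups>
    <segments>
      <segment bytes="512000" number="1">part1@example</segment>
      <segment bytes="512000" number="2">part2@example</segment>
      <segment bytes="204800" number="3">part3@example</segment>
    </segments>
  </file>
</nzb>"""

def list_files(nzb_text):
    """Return (subject, segment count, total bytes) for each file in an NZB."""
    root = ET.fromstring(nzb_text)
    result = []
    for f in root.iter(NS + "file"):
        segs = f.findall(NS + "segments/" + NS + "segment")
        total = sum(int(s.get("bytes", "0")) for s in segs)
        result.append((f.get("subject"), len(segs), total))
    return result

for subject, nsegs, nbytes in list_files(SAMPLE_NZB):
    print(subject, "->", nsegs, "segments,", nbytes, "bytes")
```

A Usenet client does essentially this, then fetches each message ID from the server and decodes the segments back into a file.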
PAR files
Apart from the problem of finding files, there is the problem that your Usenet server may be missing one or more parts of the file. PAR files were invented to solve this problem.

PAR files act as "glue" that can replace bad or missing files (v1+v2) or repair damaged files (v2 only). So if a part of a file or even a whole file didn't make it, you can still repair it.

There are two PAR versions:
  • Version 1
    This version is hardly used anymore, because it cannot be used to repair files (only to replace missing or damaged ones) and because every PAR file needs to be as big as the biggest file in the set.

  • Version 2
    This is the most commonly used version these days. It can repair damaged files, and a PAR2 file does not need to be the same size as the other files in the set. This means you can grab a PAR2 file that's just as large as the parts you are missing.
More information on how to use PAR files can be found in the guide section below.
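For intuition about how recovery data can stand in for any missing part: real PAR2 uses Reed-Solomon coding over many blocks, but the single-missing-block case behaves like simple XOR parity. A toy sketch (not the actual PAR2 algorithm):

```python
def xor_blocks(blocks):
    """XOR equally-sized byte blocks together (a toy recovery block)."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

# Three equally-sized "parts" of a posted file.
parts = [b"AAAA", b"BBBB", b"CCCC"]

# The poster uploads this alongside the data, like a PAR file.
parity = xor_blocks(parts)

# Suppose part 2 never arrived: XOR the parity with the surviving parts.
recovered = xor_blocks([parts[0], parts[2], parity])
print(recovered)  # b'BBBB'
```

Note how the same parity block repairs whichever part went missing - that is why one extra PAR2 file can cover any single missing segment.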

Interesting fact: although they're now used in pretty much every binary newsgroup, PAR files were invented by posters from anime newsgroups.

The Guide

Step 1 - Get a Usenet client

There are tons of Usenet clients, but for downloading from Usenet using NZB files I personally recommend NZB-O-Matic Plus, because it's a no-nonsense little application that does just what it needs to do and no more. Its simplicity makes it easy to use.

After installation, set up your Usenet server first. Note that you may need to manually adjust the port if you want to use SSL.

In the Options menu, modify the Preferences to something similar to what you see in the screenshot. Except for the folder locations of course; feel free to set those to anything you want.

As you can see NZB-O-Matic Plus can also monitor a folder for you, like uTorrent can. This may be useful if you want to automate downloading.

Step 2 - Start downloading

Obtain the NZB file of what you want to download (on one of the sites I listed above), and open it with "Loader" (the utility application that loads the file into NZB-O-Matic Plus) or use File -> Import NZB... in the menu.

Don't forget to hit Connect on the Usenet Servers tab or the download won't start.
Note the download speed and the expected download time!

If you configured NZB-O-Matic Plus as in the second screenshot, any PAR2 files will be paused until you need them (or you can delete them in case you didn't need them).

Step 3 - Verify the download

After download, you should have a list of files that looks like this:

To verify and re-assemble the file, use a tool called QuickPar.

After installation you may want to alter the options so that QuickPar will automatically start working once you double-click a PAR file.

If everything went correctly, you'll see something like this: not only did QuickPar verify the split files, it also assembled them into one file!

If some part of the file is damaged or missing, QuickPar will notify you of how many parts it needs. You can then download those with NZB-O-Matic Plus.

Instead of using plain split files, some posters may have used WinRAR to split the file. You can recognize such files by the .RAR file extension.

Needless to say, plain split files that can be automatically reassembled by QuickPar are much easier to use. So if you post to Usenet, please stop using RAR unless you're posting a collection of small files or if compression is actually useful, which is not the case with fansubs!

Step 4 - Watch

Just use the CCCP Project for this, if you haven't installed it already.


GHDpro 2009-11-09 13:58

Here is an update to this thread, which has proven to be really popular (j/k). Oh well, maybe I'm just preaching to the wrong choir here.

First, regarding the situation of fansubs on Usenet: it's not quite as bad as I thought, as a lot of fansubs still get posted, but to a slightly different group than I was used to (a.b.m.a.highspeed). Still, I've noticed it usually takes hours and occasionally days for new fansubs to appear on Usenet. That's a bit slow, certainly as some posters can upload 400MB files in 2 minutes based on posting times :)

Second, I've switched from NZB-O-Matic Plus (see guide in first post) to SABnzbd, which is much more powerful. SABnzbd is not only capable of downloading Usenet posts through NZB files, it can also verify them (using PAR/PAR2) and unrar and/or join files! This means that, assuming SABnzbd is properly configured, you end up with a completely finished file after it is done. No more manual post-processing necessary.

One oddity of SABnzbd however is that the only interface is a web interface. On the one hand this makes it easy to administrate SABnzbd remotely (queue downloads from work/school!) but on the other hand causes a few issues. For example, you can't just click a NZB and assume it'll be loaded by SABnzbd. Also to check download progress you'll have to fire up your browser. Fortunately there is a Firefox extension that relieves some of these issues.

One nasty thing I encountered with SABnzbd is its insistence on putting all downloaded files in a folder with the same name as the NZB file. If you download episodes as 1 NZB file per episode, then you'll end up with lots of folders with just 1 file in it in your finished downloads folder. Needless to say, that's not very practical. Fortunately SABnzbd allows custom post-processing scripts. I've written a script that solves this issue and it can be found on the SABnzbd forum.
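I won't reproduce GHDpro's actual script here, but the idea can be sketched as a tiny post-processing script. The convention that SABnzbd passes the finished job folder as the first argument is an assumption here; check the SABnzbd documentation for your version:

```python
#!/usr/bin/env python
"""Toy SABnzbd post-processing script: if a finished job folder contains a
single file, move that file up into the parent folder and delete the
now-empty job folder. The argv[1] convention is an assumption; check the
SABnzbd documentation for your version."""
import os
import shutil
import sys

def flatten(job_dir):
    """Collapse a one-file job folder into its parent. Returns True if moved."""
    entries = os.listdir(job_dir)
    if len(entries) != 1:
        return False
    src = os.path.join(job_dir, entries[0])
    if not os.path.isfile(src):
        return False
    shutil.move(src, os.path.join(os.path.dirname(job_dir), entries[0]))
    os.rmdir(job_dir)
    return True

if __name__ == "__main__" and len(sys.argv) > 1:
    flatten(sys.argv[1].rstrip(os.sep))
```

Jobs that unpack to multiple files (batches, or RARs with extras) are left alone, which is usually what you want.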

Now for some screenshots of SABnzbd in action:

Spoiler for image size:

Interesting detail: if you paid close attention to the screenshots, you may have noticed the PC that was doing the downloading is not the same PC as the screenshots are from.

And one more thing...

The NZB files from the screenshots came from Fanzub, a Usenet search engine I wrote myself. The main goal of this site is to make it easier to find anime (especially fansubs) on Usenet. I hope you like it.

Here is an example of Fanzub in action: a search for "[gg] umineko" in the anime category.

NOTE: Fanzub is a fully automated Usenet search engine. For this reason the site may return results that some find objectionable. Use at your own risk.

Victory 2010-02-23 05:35

I came across this guide whilst searching for a decent anime usenet index. First wanted to say thanks for creating Fanzub, it is absolutely fantastic and a godsend, words cannot express how awesome I find it to be.

I also use Astraweb but I'm subscribed to an $11/month plan. Here in Australia we only just got unmetered bandwidth plans, from an ISP called AAPT, and that plan is $100AUD/month. Anyway, the point is that many are put off by the fact that Usenet is a paid service, but really, if you're already paying that much for an ISP plan then a little bit extra spent to let you actually use that bandwidth is, in my eyes, a fair investment.

It's been noted that an advantage of Usenet is that it's private and secure, but I want to stress that point more. Torrenting is already being throttled by some ISPs and in many networks it is completely disabled (for example at the university I attend). However, Usenet is still usable, especially through SSL connections, which means you can leech all you want pretty much wherever you want using other people's bandwidth (depending on how much jerk you've got in your personality).

Love Usenet

Comfun 2010-07-23 17:52

I just discovered it and I would like to thank you, GHDpro ;)

Great tool, well coded.

RSS feed + nzbget and it's perfect. Around 30 min max delay between the torrent post and the file being downloaded from anywhere :)

It's a really good alternative for users not able to use torrents or IRC at work/university/shared wifi.

I'm using it now (though the script making the nzb seems kinda slow sometimes) to verify my posts and grab the .nzb from my proxy server.

For around $10/month with unlimited downloads, it's a fair price to get stuff from the internet fast and from anywhere.

Again, thank you :)

By the way, is it possible to ask you to add some newsgroups (atm I'm thinking about alt.binaries.cartoons.french.animes-fansub; alt.binaires.animes.german) to the script, or do you want to keep it English only (even if there are non-English subs posted on the groups already)?
Or maybe add a new category for "Non English Sub Anime" listing those newsgroups?

Keep it working, it's super!

Tyrani 2011-01-22 10:31

I'm curious... the fanzub site hasn't been responding for the past couple of weeks. Is it permanently down? The Giganews-Fanzub-SABnzbd combo was wickedly efficient. I've gone back to using the Animeusenet site for now but that is pretty much torture now that its speeds are so ungodly slow.

Fahd 2011-01-22 14:09

I can access it just fine from here. You could try using a site like Down for everyone or just me to see whether it's just your ISP.

GHDpro 2011-01-22 17:49

The site was down for about a week (not weeks; less than a week total).

The server the site resided on previously was getting very flaky and going down constantly. So instead of sending reboot tickets every few hours I ordered a new server at another company.

However it took some time to get the server set up properly. The first HDD they put into the new server had a "Power On Hours" value of 28895 (do the math), so it had to be replaced. I only got to fully reconfiguring and restoring everything today.

Anyway, the site should work again now. The script might need some more time catching up on the posts from the past week though.

Tyrani 2011-01-29 00:40

GHDpro -- I wanted to say thanks for getting it up and running again so quickly. It seemed like the site was working (for me at least, Fahd!) within a few hours of making that post and it has been rock solid for the past week.

I very much appreciate the services you've provided, both with Fanzub and with Animesuki. These are really about the most useful anime-oriented sites I use, so thank you very much!

Comfun 2011-03-08 18:31

Yeah, both are very useful.
A simple bot I scripted posts almost all torrent files from AnimeSuki to Usenet, so it offers a fast alternative for those who have access to Usenet.
It's really good for old torrents with slow peers.

And it's all thanks to GHDpro that it's easy to grab :)

Tokyotosho and Nyaa files are also archived on Usenet, currently going back 943 days.

Middling 2011-03-19 11:16


Originally Posted by Comfun (Post 3523026)
Yeah, both are very useful.
A simple bot I scripted posts almost all torrent files from AnimeSuki to Usenet, so it offers a fast alternative for those who have access to Usenet.
It's really good for old torrents with slow peers.

I've been downloading files from Usenet posted using your bot for quite some time, so thanks for that.

Unfortunately I believe there's a bug in your bot which causes it to duplicate parts of a file, which results in having to download twice as much data as the extracted file contains (for instance "Great_Teacher_Onizuka_-_01_-_[P.A].avi" is shown as having 949.6MB encoded data + 51.4MB recovery, but after download the resultant episode is only 458.1MB).

This is not too much of a problem for me as I have unlimited downloads at the moment, but it does seem wasteful and means that I no longer look to Usenet first for my anime downloads.

If you could fix the bot I would be grateful, as would, I'm sure, those that have a fixed limit on the downloads they can do. Besides, it should cut your data transfer needs in half. :D

Comfun 2011-03-26 07:31

I changed back the way files are posted, so it won't make you download the files twice if you're using a bad/old newsreader.

Newsleecher 4 is quite a good newsgrabber.

If Astraweb doesn't screw up again, I might not need to repost twice to ensure files are complete before they spread to other news servers.

Also notice that some sites don't duplicate headers in the NZBs they provide even if I post duplicate files, contrary to some others.

So normally, even if you use a bad newsgrabber like GrabIt, you should not download duplicate content if you're using those NZBs without duplicate header links inside.

@GHDpro I would be glad to provide you ssh/ftp access to one of my servers, or even to provide you a server, if you have problems with the one you use. Just let me know if you're interested.

Even if I like it, it seems to have a problem indexing my posts due to the subject lines.
I thought it was due to my duplicate posts but that's not it; it still indexes twice. So Fanzub is the best for me for now, when I'm not browsing groups directly from my newsreader.

Fanzub + FlexGet + Newsleecher (Win) or SABnzbd (Lin) is a good way to automate downloads using the RSS feed for new releases.

GHDpro 2011-03-27 06:40


Originally Posted by Comfun (Post 3545807)
@GHDpro I would be glad to provide you ssh/ftp acces to one of my servers or even to provide you a server for if you have problem on the one you use. Just let me know if you're interested.

Thanks for the offer, but I have enough servers to put Fanzub on if really necessary. I guess I should have realized Fanzub has now really "taken off" and that I need to take a bit more care to avoid long downtimes. Initially daily visitor count was so low that I didn't really care that much if the site was down for a few hours or more - nobody would notice or care.

That has changed over the past few months. While nowhere near AnimeSuki levels, according to Google Analytics there are now 100 unique visitors per day, and that doesn't count how many people use the RSS feeds and only visit the actual site once in a while.

To put it in a graph:
(Unique visitors measured by week)

GHDpro 2011-03-29 15:36

I'm still working on unimportant stuff instead of finally finishing some of the more basic features of the AnimeSuki V3 Beta site, but anyway...

I've just added NZB support to the V3 Beta site, powered by Fanzub. The matching of torrents to NZB files is done through the CRC32 value in the filename. This should work fine for 99% of all files that have CRC values (exceptions are joke CRC32 values such as "DEADBEEF"). The NZB file for each torrent (when available) can be found through the icon.
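The CRC32 matching described above can be sketched roughly like this; the function names and data shapes are my own for illustration, not AnimeSuki's actual code:

```python
import re

# Fansub convention: the CRC32 appears as eight hex digits in square brackets.
CRC_RE = re.compile(r"\[([0-9A-Fa-f]{8})\]")

def extract_crc(filename):
    """Return the last [XXXXXXXX] CRC32 tag in a filename, uppercased, or None."""
    matches = CRC_RE.findall(filename)
    return matches[-1].upper() if matches else None

def match_torrents_to_nzbs(torrent_names, nzb_subjects):
    """Pair torrent filenames with NZB subjects that share a CRC32 tag."""
    by_crc = {}
    for subject in nzb_subjects:
        crc = extract_crc(subject)
        if crc:
            by_crc[crc] = subject
    return {name: by_crc.get(extract_crc(name)) for name in torrent_names}

pairs = match_torrents_to_nzbs(
    ["[Group] Show - 01 [2131B6A3].mkv"],
    ["[Group] Show - 01 [2131B6A3].mkv yEnc (1/50)"],
)
print(pairs)
```

Taking the last bracketed tag avoids matching group tags like "[Group]", which are rarely eight hex digits but could be.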

Using NZB files instead of torrents can be especially helpful for older series. Also as Fanzub doesn't have a nice series/groups index of its own, AnimeSuki can provide it for NZB files now as well.

The only downside is that batch torrents are not really supported. But when I finish the "Details" page for torrents on AnimeSuki v3, I will add the NZB icon for any individual files inside a batch torrent that have a matching CRC32 value as well.

Comfun 2011-04-09 02:56

Very nice, the new NZB icon on the AnimeSuki beta.
For filenames without a CRC32, I can add it to the subject name, so if somehow AnimeSuki calculates the CRC32 for torrent files it can be compared to check for a match.

GHDpro 2011-04-09 11:48

Well, the scripts scan the subject line (of Fanzub) and filename (of torrents on AnimeSuki) only. AnimeSuki doesn't download the torrents (through BT that is; it only parses the torrent files themselves), so calculating the CRC for torrents without a CRC value in the filename is not possible.

Hopefully fansub groups which don't put CRC values in filenames will catch onto this and will add them in the future. Fortunately those groups seem to be few.
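For any group that wants to add a CRC tag, computing one is cheap; here is a sketch using Python's standard zlib module (the helper name is mine):

```python
import zlib

def file_crc32(path, chunk_size=1 << 20):
    """Compute a file's CRC32 incrementally, formatted the way fansub
    groups tag releases (8 uppercase hex digits)."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)
    return "%08X" % (crc & 0xFFFFFFFF)
```

Reading in chunks keeps memory use constant even for multi-GB episodes; the `& 0xFFFFFFFF` normalizes the signed values older Python versions returned.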

Just in case: the server provider for Fanzub is doing some emergency maintenance this weekend - the site may be slow or down for short periods.

Edit 2:
I just realized that if all files inside a batch torrent are available (individual CRC match), it should (in theory) be possible to make a NZB available of such a batch torrent as well (by including all files inside a single NZB file). But that will certainly require some additional work.
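Since an NZB is just a list of <file> elements, such a merge is in principle only a few lines; a hedged sketch (not AnimeSuki's actual implementation):

```python
import xml.etree.ElementTree as ET

NS = "http://www.newzbin.com/DTD/2003/nzb"
ET.register_namespace("", NS)  # serialize without an ns0: prefix

def merge_nzbs(nzb_texts):
    """Combine the <file> entries of several NZB documents into one NZB."""
    merged = ET.Element("{%s}nzb" % NS)
    for text in nzb_texts:
        for f in ET.fromstring(text).iter("{%s}file" % NS):
            merged.append(f)
    return ET.tostring(merged, encoding="unicode")
```

The "additional work" would be in looking up every individual NZB by CRC first; combining them afterwards is the easy part.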

Comfun 2011-04-10 00:59

Thanks for the info.
I will not overload the subject line with a calculated CRC32 then.
I was planning to add it but it doesn't seem useful.

Comfun 2011-04-19 13:07

Hi again GHDpro. If you have spare time and if it's possible, could you add some filters to the parser?

There's a lot of spam garbage. Basically, since those anime groups are dedicated to anime, you can be 100% sure a .rar or .exe file smaller than 10MiB is garbage/spam/troll/trojan.
The RSS feed is almost unusable these days due to the 50-item restriction, when those 50 items are filled with that garbage.

If you could set the RSS feed to the last 100 or 150 items it would be a plus too, since weekly releases may sometimes be ungrabbable when a whole series is posted, aka the feed would have 50 episodes of whatever.

If it's possible and if you have time, this would be amazing. Thank you :)

GHDpro 2011-04-19 14:14

Sorry about that. But yeah, time for some small improvements.

I've now added a bit of spam detection code that will first check which posters have posted more than 10 files in the past day, and then mark any posts by them smaller than 20 MB as spam.

This should not mark any real posts as spam, even if you, say, post an 8 MB trailer for "Rockman.EXE" as a RAR file :) (as long as you don't forget to change the default Yenc-PowerPost settings for name/email, something most spammers seem to forget).

It is possible particularly large spam (like 20+ MB posts) will pass through - if that becomes a problem I'll look into what I can do to accurately detect those when it happens.
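The heuristic described above (flag small posts from prolific posters) can be sketched in a few lines. The thresholds mirror this post, but the function and data shape are illustrative, not Fanzub's actual code:

```python
from collections import Counter

def flag_spam(posts, max_daily=10, min_bytes=20 * 1024 * 1024):
    """posts: (poster, size_in_bytes) pairs seen in the past day.
    Flag small posts from posters who posted more than max_daily files."""
    counts = Counter(poster for poster, _ in posts)
    return {
        i for i, (poster, size) in enumerate(posts)
        if counts[poster] > max_daily and size < min_bytes
    }

posts = [("spammer@spam", 1_000_000)] * 12 + [("real@poster", 300_000_000)]
print(sorted(flag_spam(posts)))  # the 12 small posts are flagged
```

A real fansub poster who uploads a single small trailer stays below the daily count, so the two conditions together keep false positives rare.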

I've also increased the default number of items returned in the RSS feed to 200 (4x more). This should hopefully not cause too many problems.

Comfun 2011-04-19 15:34

You're amazing, very nice.
I never saw (for now ^^) a spammer posting files bigger than 20MiB, so I think the improvement you made should work well.

Not sure if this could be interesting, but is it technically possible to change the NZB filename to something like this:
[NaniSubs]_A-Channel_-_01_(848x480_h264 AAC)_[2131B6A3].mkv.nzb
or [NaniSubs] A-Channel - 01 (848x480 h264 AAC) [2131B6A3].mkv.nzb

instead of the current NaniSubs A-Channel - 01 1280x720 h264 AAC.nzb?

That way it would be easier to set up some rules using several RSS feeds to mark files as already downloaded. Some sites are already doing so and I think it's very useful.

Also, I'm mentioning that because I'm interested in reposting files older than 650 days, weekly, to keep them on Usenet, since retention is the only thing making files disappear.

And having those kinds of NZB filenames could prevent downloading files whose filenames have already been seen; this would be more difficult to do without the filename in the NZB filename already.
I would have to sed every 4th line of the NZB looking for the filename.
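Instead of sed-ing every 4th line, the filename can be pulled from the subject attribute of each <file> element in the NZB. A sketch assuming the usual (but not guaranteed) convention that yEnc subjects put the filename in double quotes:

```python
import re
import xml.etree.ElementTree as ET

NS = "{http://www.newzbin.com/DTD/2003/nzb}"
NAME_RE = re.compile(r'"([^"]+)"')  # yEnc subjects usually quote the filename

def filenames_in_nzb(nzb_text):
    """Pull the quoted filename out of each <file> subject in an NZB."""
    names = []
    for f in ET.fromstring(nzb_text).iter(NS + "file"):
        m = NAME_RE.search(f.get("subject", ""))
        if m:
            names.append(m.group(1))
    return names
```

Parsing the XML is more robust than counting lines, since nothing guarantees the filename sits on every 4th line of the file.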

Well, maybe this one is a little selfish.
By the way, do you have an easy way on your side to extract the NZBs of files older than 650 days from the tars you must have?

I started stocking NZB files months ago, but for now I don't have those older than 300 days.
If you can retrieve them easily it would be cool; otherwise I will just parse nzbindex URLs and grab them that way.

Thanks again :) With Animeusenet down for weeks, Fanzub is the last good NZB index for anime now.

GHDpro 2011-04-20 01:01

The reason the filename of the NZB file is a "filtered" version of the original filename instead of the original filename itself is because SABnzbd by default uses the NZB filename as the folder name for each download.

I'm not sure what tool you're using to download NZBs - I could add support for some parameter like "?unfiltered" or something that would instruct the script not to filter the filename.

As for NZB history... I could send you a dump of the "posts" table of Fanzub, which should contain all information you need (though you need a bit of programming experience to make use of it I think). It only contains the original subject lines though, not filtered ones with only filenames you see on the site.


Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2017, vBulletin Solutions Inc.