Need some tips on burning large directories to disc

Morlock

I just bought a Blu-ray burner drive and a spindle of 25 GB discs.

How do I:

1. burn large directories (bigger than a Blu-ray disc)
2. do it without going through the total PITA of sorting it into directories with <23.25 GB of data each; in fact, without sorting it at all
3. do it in a way that results in discs I can simply put into my drive and read normally, i.e. no compression, no zip, no 7z, etc.

?

P.S. Freeware preferred. I've already DLed most of the recommended burning freeware.
 
No compression, eh? Are you worried about the disc getting corrupted? Otherwise compression is your friend in these situations: chances are this large directory has a bunch of small compressible files that could fit in under 25 GB that way. For convenience, you can always mount the split archive as a virtual drive with one of the many tools made for this purpose. If the files aren't compressible (e.g. media files), create the archive in "Store" mode instead, which disables compression for better performance. But if you insist on doing it your way, read further.
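If you do go the archive route, the whole job is a one-liner with 7-Zip's command-line tool (the paths here are placeholders, and double-check the volume-size suffix against your 7-Zip version; -mx=0 selects Store mode, -v splits the output into fixed-size volumes):

Code:
7z a -t7z -mx=0 -v23g E:\staging\backup.7z D:\bigdirectory\

That spits out backup.7z.001, backup.7z.002, and so on, one volume per disc; open the .001 file with all volumes in the same place and 7-Zip reassembles the set automatically.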

I assume you have exhausted the search engine results, so if there are no ready-made solutions that split a directory without making archive files out of it (.001, .002, etc.) — there's dirsplit for unix (example below), but I don't know of a Windows equivalent — then you will have to write such a script or program yourself in the language you are most familiar with.
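For the record, dirsplit (from the cdrkit tools) works roughly like this — syntax from memory, so check the man page, and the catalog filename may differ; it doesn't copy anything itself, it just writes out file lists that you feed to genisoimage/mkisofs to master each disc image:

Code:
dirsplit -s 23000m /path/to/bigdirectory
genisoimage -o disc1.iso -graft-points -path-list vol_1.list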

In the simplest form using WinAPI functions, you can write a loop that recurses the directory file by file: on each iteration, get the next entry with FindNextFile() (the WIN32_FIND_DATA it fills in already carries the file size, so you don't even need GetFileSize()), convert to KB and increment a counter with it (unsigned long is enough if you count in KB), then CopyFile() or use stdlib functions to write the file to destdir. When (= IF!) the counter would pass the usable capacity — roughly 24,379,392 KB, i.e. the ~23.25 GB that actually fits on a "25 GB" disc — begin splitting to destdir2 and so on. You can't rewind a find handle, so when a file would push the counter over the limit, hold it over and make it the first thing copied into the new directory. Condition the loop to end when FindNextFile() fails and GetLastError() returns ERROR_NO_MORE_FILES. You will have to deal with subdirectories (recurse as many levels as necessary); see the sketch below. You can probably do something more efficient and elegant than this greedy first-fit split (it leaves slack at the end of every disc), this is just a quick 1 minute write-up at 9 AM to point you in the right direction. Clearly there are edge cases that make this harder to implement robustly than just throwing everything into a tape archive and splitting afterwards, which may be why you haven't found any existing software.
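Here's a minimal sketch of that approach. Everything in it is a placeholder (the D:\staging layout, the disc%03d naming, the 23 GiB constant), it flattens each subdirectory tree into the disc folder so name collisions are possible, and a single file bigger than the limit will still overflow its disc — it's a starting point, not a finished tool:

Code:
#include <windows.h>
#include <stdio.h>
#include <string.h>

/* usable capacity: ~23 GiB per "25 GB" BD-R; adjust for your media */
#define DISC_LIMIT (23ULL * 1024 * 1024 * 1024)

static unsigned long long g_used = 0;  /* bytes copied to current disc dir */
static int  g_disc = 0;                /* current disc number */
static char g_dest[MAX_PATH];          /* current destination directory */

/* Start a fresh disc-sized destination directory. */
static void next_disc_dir(void)
{
    sprintf(g_dest, "D:\\staging\\disc%03d", ++g_disc);
    CreateDirectoryA(g_dest, NULL);
    g_used = 0;
}

/* Walk src recursively, copying each file into the current disc dir;
 * when a file would push g_used past DISC_LIMIT, roll over to a new
 * disc dir and copy that file there instead (no handle rewinding).
 * Note: flattens the tree, so preserving structure is left to you. */
static void split_dir(const char *src)
{
    char pattern[MAX_PATH], from[MAX_PATH], to[MAX_PATH];
    WIN32_FIND_DATAA fd;
    HANDLE h;

    sprintf(pattern, "%s\\*", src);
    h = FindFirstFileA(pattern, &fd);
    if (h == INVALID_HANDLE_VALUE)
        return;

    do {
        if (!strcmp(fd.cFileName, ".") || !strcmp(fd.cFileName, ".."))
            continue;
        sprintf(from, "%s\\%s", src, fd.cFileName);

        if (fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) {
            split_dir(from);  /* recurse into subdirectories */
        } else {
            /* the find data already carries the 64-bit file size */
            unsigned long long size =
                ((unsigned long long)fd.nFileSizeHigh << 32) | fd.nFileSizeLow;

            if (g_used + size > DISC_LIMIT)
                next_disc_dir();  /* carry this file over to the next disc */

            sprintf(to, "%s\\%s", g_dest, fd.cFileName);
            if (CopyFileA(from, to, FALSE))  /* FALSE = allow overwrite */
                g_used += size;
            else
                fprintf(stderr, "copy failed: %s\n", from);
        }
    } while (FindNextFileA(h, &fd));  /* stops on ERROR_NO_MORE_FILES */

    FindClose(h);
}

int main(void)
{
    next_disc_dir();
    split_dir("C:\\bigdirectory");  /* placeholder source path */
    return 0;
}

Once each discNNN folder is filled, you just burn it as a normal data disc and it reads back with no extraction step, which satisfies your requirement 3.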

P.S. For a reliable backup solution, you're better off using lower-density, high-quality DVD±R or CD-R media, particularly considering the guaranteed degradation problems with certain LTH BD-R media (Ritek), not to mention the relative scarcity of BD-ROM drives.
 