Subject: Re: Burning ISOs: Windows vs Mac
From: Timothy A. Seufert (tas@mindspring.com)
Date: Fri Sep 07 2001 - 19:20:28 MDT


At 3:03 PM -0400 9/7/01, johnathan spectre wrote:

>Even with large-buffer "burn proof" drives, if
>you hog the CPU for too long you will run out of data going to the drive and
>end up with a bad disc.

Actually, with "burn-proof" and similar schemes you should
theoretically end up with a good CD even if the burner's write buffer
empties -- that's exactly what "Burn-Proof" etc. are designed to do.
Basically, the onboard controllers have become capable of very
precise position control, making it feasible to restart burning after
a buffer underrun forces the burner to turn the write laser off for a
while.

A buffer underrun on a "burn-proof" burner does leave a slight gap in
the recording, but they've gotten it down to a few micrometers, which
apparently does not affect readers.

You don't want to rely on it as a way of life, though, because
recovering from a buffer underrun kills the effective write speed.
It's more of a way to prevent wasting a blank CD if you should have
an underrun.
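To make the mechanism concrete, here is a toy simulation (my own illustrative sketch, not real drive firmware): the drive's write buffer drains at the burn rate while the host refills it unevenly. With "burn-proof" behavior, an empty buffer just pauses the laser and the burn resumes; without it, the underrun ruins the disc.

```python
def simulate_burn(feed, burn_rate, buf_size, burn_proof):
    """Toy model of a CD burn.

    feed      -- KB delivered by the host in each time slice
    burn_rate -- KB the laser consumes per time slice
    buf_size  -- drive's write buffer capacity in KB
    """
    buf = buf_size          # the drive fills its buffer before writing
    underruns = 0
    for kb in feed:
        buf = min(buf_size, buf + kb)   # host refills the buffer
        if buf >= burn_rate:
            buf -= burn_rate            # laser writes this slice
        elif burn_proof:
            underruns += 1              # suspend writing, resume later
        else:
            return "coaster", underruns # classic drive: disc is ruined
    return "good disc", underruns

# A steady 3600 KB/s feed keeps a 24X burn happy; a stall mid-burn
# only survives if the drive can pause and resume.
print(simulate_burn([3600, 0, 3600], 3600, 4096, burn_proof=False))
print(simulate_burn([3600, 0, 3600], 3600, 4096, burn_proof=True))
```

Note how each recovered underrun costs a whole time slice, which is why the effective write speed drops so badly when you lean on this feature.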

>In general the higher the speed I burn at the less I
>do on the system. My fastest CDRW drive right now is a 24x and I don't do
>anything on the system when burning that fast. If I'm going to do some work
>I'll slow down the burn so the data feed can keep up.

24X is still only 3600 KB/s, so IMHO it's perfectly safe to keep
doing stuff while you burn. HOWEVER, it should be CPU-intensive
stuff. The Linux kernel (like any UNIX) will automatically give
higher priority to I/O-bound processes, so it's safe to have all the
CPU-bound processes you want running at the same time as you're
burning.
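The arithmetic behind that 3600 KB/s figure is simple: "1X" for a data CD is 150 KB/s, and speed ratings are just multiples of it. A quick sketch:

```python
# "1X" CD-ROM data rate is 150 KB/s; an N-X burn needs a sustained
# N * 150 KB/s feed from the source disk.
BASE_KBPS = 150

def cd_rate_kbps(speed_factor):
    """Sustained data rate (KB/s) required for an N-X CD burn."""
    return speed_factor * BASE_KBPS

print(cd_rate_kbps(24))   # 3600 KB/s -- trivial for any modern hard disk
```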

It's even safe to have other I/O bound processes, as long as they are
beating on a different disk than the one that's acting as the source
of data for the CD burn.

-- 
Tim Seufert



This archive was generated by hypermail 2a24 : Fri Sep 07 2001 - 18:30:35 MDT