How does CCleaner’s Drive Wiper work?

Terafall February 15, 2013

If I use Drive Wiper in CCleaner to wipe free space 35 passes, is it the same if I use 7 passes 5 times?

  1. Jan Fritsch
    February 16, 2013 at 6:43 am

    Usually a single overwrite is enough to prevent usable data being restored from a drive.

    However, since most of the tools (including CCleaner Drive Wiper) are software solutions they operate on a level where a multitude of reasons could prevent them from accessing certain parts of the hard drive.

    I haven't looked into the Drive Wiper option too much but here is how I see it:

    Drive Wiper creates a dummy image as large as the free space on your drive. It then overwrites every single cluster in that image.

    Using 5-7 passes simply means you overwrite the same part of your hard drive multiple times, which doesn't add any additional security. You would be better off doing a single overwrite, closing and reopening CCleaner, and running another single pass. That way a new image is created, which might cover parts that were previously unavailable for wiping.
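    The "dummy file that fills the free space" idea can be sketched in a few lines of Python. This is my own illustrative sketch of the technique, not CCleaner's actual code; the file name, chunk size, and `limit` parameter are assumptions (the `limit` cap just lets you try it without actually filling a disk):

```python
import os

def wipe_free_space(directory, chunk_size=1024 * 1024, limit=None):
    """Overwrite free space under `directory` with zeros via a dummy file.

    Hypothetical sketch of the dummy-file technique, not CCleaner's code.
    `limit` caps the bytes written so the sketch can be tried safely.
    """
    path = os.path.join(directory, "wipe_dummy.bin")
    chunk = b"\x00" * chunk_size
    written = 0
    try:
        with open(path, "wb") as f:
            while limit is None or written < limit:
                try:
                    f.write(chunk)          # keep writing until the disk is full
                except OSError:             # ENOSPC: the free space is used up
                    break
                written += chunk_size
            f.flush()
            os.fsync(f.fileno())            # force the data onto the platters
    finally:
        if os.path.exists(path):
            os.remove(path)                 # release the space again
```

    Deleting the dummy file at the end is what makes the space "free" again; the previously deleted data underneath has by then been replaced with zeros.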

  2. Jim Chambers
    February 16, 2013 at 4:40 am

    Erasers write a certain character to each byte of free space (unallocated space) on each pass. A different character may be used on each pass. 1-3 passes should be quite sufficient unless you're hiding something from the Feds.
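    The "different character per pass" idea can be sketched like this. The particular pattern list (zeros, then ones, then random bytes, loosely in the spirit of the old DoD 5220.22-M scheme) is my own assumption for illustration; real erasers differ in detail:

```python
import os

def overwrite_file(path, passes=(b"\x00", b"\xFF", None)):
    """Overwrite a file in place, one pattern per pass.

    `None` in the pattern list means a pass of random bytes.
    A hypothetical 3-pass scheme for illustration only.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in passes:
            f.seek(0)
            if pattern is None:
                f.write(os.urandom(size))   # random pass
            else:
                f.write(pattern * size)     # fixed-character pass
            f.flush()
            os.fsync(f.fileno())            # flush each pass to disk
```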

    • Jan Fritsch
      February 16, 2013 at 7:01 am

      It doesn't matter how often and which character is used to overwrite data.

      Data is stored in 1s and 0s, charged or not charged, magnetic or not on the storage device. Once it is overwritten regardless of the pattern there simply isn't usable (previous) data to be restored.

      There is no "depth" and no "history" to be recovered, and there is no "half charged" or "half magnetic" state. If hardware were able to "read history" out of the storage, you would be facing regular corruption of your file system, because it would accidentally happen all the time.

      • Jim Chambers
        February 16, 2013 at 5:10 pm

        Data is written in bytes, represented as hexadecimal characters, and you're talking about bits. Expensive and sophisticated analysis of a removed disk can recover erased data using the edges of the write track, where a weak image remains unless erased about 7 times. Hence my joking reference to the Feds, who recommend repeated wipes for security of secret information.

        • Jan Fritsch
          February 17, 2013 at 1:28 am

          How can you store 1 Byte of data on a storage device that only has two states? 1 bit is the smallest storage unit and is how data is written on a magnetic hard disk.

          Each tiny storage unit is an oriented magnetic field ("north-south" or "south-north") which represent the bits "1" and "0". If there are four "north-south" fields in a row it would result in the bits "1111" being read which is the binary code for "F" hex.
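          The bit-to-hex mapping described above can be checked directly (a trivial demonstration, nothing here is specific to any wiping tool):

```python
# Four "north-south" fields read back as the bits 1111,
# which is the hex digit F.
bits = "1111"
value = int(bits, 2)    # interpret the bit string as binary
assert value == 0xF
print(hex(value))       # → 0xf
```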

          Yes, a weak image remains. If I recall correctly, the mathematical chance of interpreting this weak image correctly is about 53% per bit after a single overwrite.
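          Taking that ~53%-per-bit figure at face value (it is the commenter's recollection, not a verified number), the chance of recovering even one full byte collapses quickly:

```python
# Assumed per-bit recovery accuracy from the comment above (~53%);
# this figure is a recollection, not a verified measurement.
p_bit = 0.53
p_byte = p_bit ** 8     # all 8 bits of a byte must be guessed right
print(f"{p_byte:.4%}")  # roughly 0.62% per byte
```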

          But it doesn't really matter considering the time and cost it takes...

          The reason for more sophisticated methods for government data comes from stuff like bad blocks or HPAs which are not accessed by software or even failures as simple as someone unplugging a disk before the process was finished.

          "Basically the change in track density and the
          related changes in the storage medium have created a situation where the acts of clearing and
          purging the media have converged. That is, for ATA disk drives manufactured after 2001
          (over 15 GB) clearing by overwriting the media once is adequate to protect the media from
          both keyboard and laboratory attack."

  3. Manuel Guillermo López Buenfil
    February 16, 2013 at 1:54 am

    Well, yes, it is the same.
    The question is, why would you want to do that?
    1 pass is enough to prevent any software from recovering data.
    Even with just 1 pass, not even a paid professional would be able to get anything back.
    Also, wiping a hard drive can be a very long process (it depends on the drive size).
    My recommendation: go with 1 pass.
    If you want to read more about this, check this article:

  4. ha14
    February 16, 2013 at 1:13 am

    Wipe Free Space - Multi-Pass ? Stupid Thing ?

    •7-Pass: this overwrites the free space seven times, and takes seven times as long. A 7-pass erasure using random data will do a pretty complete job of preventing reconstruction of the data on the drive.
    •35-Pass: this overwrites the free space 35 times, and takes a very long time.

    Gutmann himself has responded to some of these criticisms and also criticized how his algorithm has been abused in an epilogue to his original paper, in which he states [1]:

    In the time since this paper was published, some people have treated the 35-pass overwrite technique described in it more as a kind of voodoo incantation to banish evil spirits than the result of a technical analysis of drive encoding techniques. As a result, they advocate applying the voodoo to PRML and EPRML drives even though it will have no more effect than a simple scrubbing with random data. In fact performing the full 35-pass overwrite is pointless for any drive since it targets a blend of scenarios involving all types of (normally-used) encoding technology, which covers everything back to 30+-year-old MFM methods (if you don't understand that statement, re-read the paper). If you're using a drive which uses encoding technology X, you only need to perform the passes specific to X, and you never need to perform all 35 passes. For any modern PRML/EPRML drive, a few passes of random scrubbing is the best you can do. As the paper says, "A good scrubbing with random data will do about as well as can be expected". This was true in 1996, and is still true now.

    Wipe Free Space schemes

  5. Bruce Epper
    February 15, 2013 at 11:40 pm

    It pretty much works out to the same thing, but 35 passes on newer equipment is excessive. With newer drives, the track density is high enough that more than 3 passes is just wasted time. With the older drives with a lower track density, the inter-track space was large enough that the residual magnetism left there could still be read even after a dozen or more passes which is why there are some early papers about using 35 (or more) passes to break this down.
