Tag: imaging

  • Clonezilla

    For quite a while I’ve used ghost4linux (g4l) for my disk cloning needs. What I REALLY like is the ability to do a network copy of the image to an FTP server, plus the built-in dd_rescue for rescuing data from a failing hard drive. Unfortunately, g4l does a bit-by-bit copy of a drive, which means it can take a while, and it copies the full drive capacity (say, 80GB) even if you only have 5GB worth of information. Now, the image can be compressed, and if you massage the drive by defragging and filling the empty space with zeros before you start, you can squeeze the image down pretty small, but… sometimes that’s a big task (I remember leaving one box writing zeros to its drive overnight to prepare the empty space for a g4l cloning.)

    Anyway… I’ve run across Clonezilla recently and am VERY impressed. It’s basically a wrapper around partimage: it will copy only the data portion of a disk’s contents if it recognizes the filesystem (most Linux filesystem types, ext2/3/ReiserFS, plus NTFS and FAT… it seems like a couple of others too.) If it doesn’t recognize the filesystem, it drops back to bit-by-bit mode, which is nice. The only other things I would want from it are better documentation and dd_rescue capability (and maybe a FUSE module to be able to image to/from FTP servers.) It supports several network approaches (Samba/SSH) for writing and reading images over a network.

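    Since Clonezilla rides on partimage, the underlying save/restore looks roughly like the sketch below. This is a minimal, hedged example; the device name and the /mnt/backup path are hypothetical stand-ins:

      # Save /dev/sda1 to a gzip-compressed image. partimage copies only the
      # used blocks on filesystems it recognizes. (Hypothetical paths/devices.)
      partimage -z1 save /dev/sda1 /mnt/backup/sda1.partimg.gz

      # Restore it later; partimage appends a .000 volume suffix when saving.
      partimage restore /dev/sda1 /mnt/backup/sda1.partimg.gz.000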

  • Drive images – filling free space with zeros

    So, one of the things I’ve been doing is drive imaging. I’ve got 3 systems that are to be identical (based on, of all things, FreeDOS…), so I thought it was a perfect opportunity to dust off cloning/imaging software. I’ve been using the excellent g4l (ghost4linux), which is now up to v0.22 (I make use of it for trying to rescue failing hard drives too, as it includes dd_rescue). Anyway, the new machines have 80GB drives and the lzo-compressed images are running ~450MB… but one of my questions was how to make sure the image was as small as it could be. I found that there are a few ways to squish the image file further, and they mostly revolve around filling the empty drive space with zeros (sketched below).

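    The basic trick is just this (a minimal sketch; run it inside the filesystem you’re about to image, and note the filler filename is arbitrary):

      # Write zeros until the filesystem is full, flush, then delete the filler.
      # dd exits with a "no space left on device" error once the disk fills up;
      # that's expected. The freed blocks now hold zeros, which compress to
      # almost nothing in the resulting image.
      dd if=/dev/zero of=/zero.fill bs=1M
      sync
      rm -f /zero.fill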

  • Really not wanting to sit at the desk….

    So… it’s ~70 degrees out and springish. I have network cables draped over my desk to support a couple of systems that I’m imaging right now, and I’m really not wanting to sit here, but… I do have a few things to “clean out”…

  • Live filesystem “capture” into a virtual disk image

    Ah… the joys of *nix utilities… I’ve just successfully tested a “capture” of a live, running system into a virtual disk image. No, I don’t mean that I booted up with an imaging utility: I took a live, booted, logged-in system and imaged the primary hard drive it was living on into a file on another machine. (Yeah, I know, there are probably a few people reading this who will say they’ve done that, and most people who would need to do this already know how… sorry I missed the memo.) Not too long ago, VMware released a tool to do something like this (that tool is for Windows…) This should work on any platform that supports dd and netcat (although I’m not sure whether piping output from one program to another works in a DOS command shell; maybe Cygwin would be a good environment to accomplish this in.) Anyway… here are the details.

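    The core of it is dd piped into netcat. A minimal sketch, assuming the receiving box is at 192.168.1.10 and port 9000 is free (both hypothetical); note that netcat’s listen syntax varies between versions:

      # On the receiving machine: listen and write the incoming stream to a file.
      # (Traditional netcat wants -l -p PORT; the BSD flavor uses just -l PORT.)
      nc -l -p 9000 > hda.img

      # On the live system being captured: stream the whole primary drive over.
      dd if=/dev/hda bs=1M | nc 192.168.1.10 9000

    The raw image that lands on the far side can then be wired up as a virtual disk (qemu-img, for instance, can convert a raw image to other virtual disk formats).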

  • HDR – High Dynamic Range – Images under Linux

    HDR – recently I heard someone talking about this with regard to digital imaging. The idea is that you take three identical images (landscapes, say) from a stationary (tripod-mounted) camera; the only difference between them is the exposure time. Blended together, they create a more impressive final picture. Yes, I just talked about fake photos and digital imaging. This, to me, is in a different class of photo editing… enhancement(?)… well, anyway. There are a number of ways to do this: Photoshop, I understand, has support for it, and it’s possible under Linux as well with the GIMP.

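    Besides the GIMP route, there are command-line tools for this kind of blend. As one hedged example, enfuse (from the Hugin project) can fuse a bracketed set in a single command; the filenames here are hypothetical, and strictly speaking enfuse does exposure fusion rather than true HDR tone mapping, though the end result is similar:

      # Blend three bracketed exposures of the same tripod-steady scene
      # into one output image.
      enfuse -o blended.tif under.tif normal.tif over.tif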

  • 2,000-year-old computer?

    The Register had an interesting article on the analysis of what may get classified as the world’s oldest computer (2,000 years old, Greek.) Apparently it’s been known for a while (it was discovered in a shipwreck around 1900). It’s been called the “Antikythera Mechanism” and has more than 30 dials and wheels. Anyway, it’s currently undergoing detailed imaging analysis, which has uncovered some new clues that may reinforce a theory that it was designed to track/predict planetary locations (Mercury, Venus, Mars, Jupiter and Saturn were known to the ancient Greeks.)


  • Microsoft should use a /home partition….

    I saw this yesterday or the day before… George Ou has said that Microsoft should move user data to its own volume (or partition). He is ABSOLUTELY RIGHT. I think these days the default install for any modern operating system ought to assume you care enough about your data to separate it from the main OS. I find myself slightly annoyed at Linux distributions that DON’T do this by default, although most will at least let you change the partitioning during the install process. I had come to just assume this was the way things were, since Mandrake always defaulted to separate home and root partitions (a sketch of the layout follows).

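    To make the idea concrete, here’s a minimal sketch of what that separation looks like in a Linux /etc/fstab; the devices and filesystem types are hypothetical examples:

      # / (the OS) and /home (user data) on separate partitions, so the root
      # filesystem can be wiped/reinstalled without touching anything in /home.
      /dev/sda1   /       ext3   defaults   0 1
      /dev/sda2   /home   ext3   defaults   0 2
      /dev/sda3   swap    swap   defaults   0 0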