[nSLUG] tarball extract failing

George N. White III gnwiii at gmail.com
Wed Aug 24 08:15:29 ADT 2011

On Wed, Aug 24, 2011 at 1:48 AM, Nancy <nancy at esnesnon.com> wrote:

> Does anyone know how I might extract some files from a .tgz file that seems to have something in it that makes it kack only a few GBs into the extract? It's a tarball of my OS X homedir and I'm pretty sure the offending file is in the Library directory. The tarball is a couple of years old and every now and then I pull it out and try (unsuccessfully) to rescue photos of my cat (that passed away). These are the only good photos of her that I have, so I never give up trying.
> If there is a way for me to "look" in the file and cherry-pick what I want, that would be the ultimate. I can't even remember all of the things I have tried to get this file extracted - there have been so many: cmd line tools and graphical ones. I have tried both Linux (Debian) and OS X to no avail.
> Any help would be appreciated!

Have you tried extracting the files using OS X?  There are
differences between the tar implementations on OS X (BSD flavor) and
Linux (GNU flavor).

<http://norman.walsh.name/2008/02/22/tar> has some discussion.

"gzip -t <gzipped file>" will test the gzip compression.   If this
fails, the file was corrupted or you encountered a gzip bug.
More likely, the tar version you are using has a problem with a
particular file.   If "gzip -t" doesn't tell you there is a problem
(it follows the "no news is good news" principle), then you can try
different tar implementations, such as "pax".
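The steps above can be sketched roughly like this (the filenames
"homedir.tgz" and the "demo/Pictures" paths are made-up stand-ins for
the real archive and whatever directory holds the photos; here a small
demo tarball is built first so the commands actually run):

```shell
# Build a small demo archive as a stand-in for the damaged homedir tarball
mkdir -p demo/Pictures
echo "cat photo" > demo/Pictures/cat.jpg
tar -czf homedir.tgz demo

# 1. Test the gzip layer; silence means success ("no news is good news")
gzip -t homedir.tgz && echo "gzip layer OK"

# 2. Decompress to a plain tar file, so that gzip errors and tar errors
#    are not confused with each other
gunzip -c homedir.tgz > homedir.tar

# 3. List the contents without extracting, to find the files worth rescuing
tar -tf homedir.tar | grep Pictures

# 4. Extract only the chosen paths, ignoring the rest of the archive
rm -rf demo
tar -xf homedir.tar demo/Pictures/cat.jpg
```

The point of step 2 is that once you have a plain .tar, any failure you
see afterwards belongs to tar, not gzip, which narrows down where the
corruption lives.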

The page above also mentions problems with resource forks and suggests
hfstar (which I have never used, but which is available from fink and
macports).

<http://en.wikipedia.org/wiki/Resource_fork> discusses resource forks
-- Mac OS X allows files with multiple forks.
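When OS X tar writes an archive, resource forks and other metadata
typically show up as "._name" AppleDouble companion entries, and these
are a common source of trouble for other tar implementations.  A rough
sketch of spotting and skipping them (the rfdemo files are fabricated
here to simulate such an archive; --exclude as shown is GNU tar syntax):

```shell
# Simulate an OS X tarball: "._cat.jpg" stands in for an AppleDouble
# companion entry holding resource-fork metadata
mkdir -p rfdemo
echo "data fork" > rfdemo/cat.jpg
echo "resource fork" > rfdemo/._cat.jpg
tar -czf rfdemo.tgz rfdemo

# List only the AppleDouble companion entries
tar -tzf rfdemo.tgz | grep '\._'

# Extract everything except the metadata companions
rm -rf rfdemo
tar -xzf rfdemo.tgz --exclude='._*'
```

If the corruption happens to live in one of those metadata entries,
skipping them this way may be enough to get the real files out.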

George N. White III <aa056 at chebucto.ns.ca>
Head of St. Margarets Bay, Nova Scotia
