[nSLUG] ZFS through FUSE

Joe Dunn me at joedunn.com
Sat Apr 5 12:27:35 ADT 2008

On Sat, Apr 5, 2008 at 9:11 AM, George N. White III <gnwiii at gmail.com> wrote:
> On Fri, Apr 4, 2008 at 4:06 PM, Joe Dunn <me at joedunn.com> wrote:
> Which version of OS X?  Apple advertises "read-only" ZFS support.  I'd
> like to create multi-TB read-only caches of remote sensing images.
> Typically there is a growing pool of raw data (called level 1) that gets
> reprocessed to "level 2" periodically, with new data processed as they
> arrive.  Users need to be able to extract time-series and do space-time
> binning of level-2 data.  OS X is currently the platform of choice for
> working with level-2 data.

Running Leopard 10.5.2.  The Apple Developer Connection seeds a read-write
ZFS module, but I don't use the developer seed anymore; the ZFS builds on
MacForge come out faster.  Apple developers still work on them, since one
contacted me yesterday about my kernel panics :).
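For the multi-TB read-only caches you describe, ZFS would make the setup
fairly simple once a stable port lands.  Something like this sketch (the
pool name, device paths, and dataset names are made up for the example):

```shell
# Build a pool from several disks (hypothetical device names)
zpool create l2cache disk1s2 disk2s2 disk3s2

# One dataset per processing level; compression can help with image data
zfs create l2cache/level1
zfs create l2cache/level2
zfs set compression=on l2cache/level2

# After a reprocessing run, flip level-2 to read-only so users
# can't clobber the cache between updates
zfs set readonly=on l2cache/level2
```

You'd set readonly=off only for the reprocessing window, then flip it back.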

> I thought VMware USB is limited to 1.x speeds, but I have only used it on
> P4 hardware.

Nope, it supports the USB 2.0 spec.

> I'm willing to give up some speed for the flexibility and the extra
> checks on data integrity -- CPU intensive may be OK these days.
> Anything that saves human time (e.g., dealing with corrupted files and
> managing chunks of storage) is important.

One caveat with ZFS: with small files, it's CPU intensive.  I ran some
Bonnie tests; I'll post the results below.  This is with a 1 GB file.

$ Bonnie -s 1000 -d /Volumes/zfstest/
File '/Volumes/zfstest//Bonnie.1976', size: 1048576000
Writing with putc()...done
Writing intelligently...done
Reading with getc()...done
Reading intelligently...done
Seeker 1...Seeker 2...Seeker 3...start 'em...done...done...done...
              -------Sequential Output-------- ---Sequential Input--
              -Per Char- --Block--- -Rewrite-- -Per Char- --Block---
Machine    MB K/sec %CPU K/sec %CPU K/sec %CPU K/sec %CPU K/sec %CPU  /sec
         1000 27419 58.0 48486 15.8 22464 10.2 43259 91.1 65648 13.1 111.9
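Bonnie reports throughput in K/sec, so dividing by 1024 gives MB/s, which
is easier to compare against disk specs.  For the block figures in the run
above:

```shell
# Convert the sequential block-output (48486 K/sec) and block-input
# (65648 K/sec) figures from the Bonnie run above to MB/s
awk 'BEGIN {
    printf "block write: %.1f MB/s\n", 48486 / 1024
    printf "block read:  %.1f MB/s\n", 65648 / 1024
}'
```

Note the per-char numbers: 91.1% CPU just to read 1 GB with getc() is where
the "CPU intensive" caveat shows up.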
