Storage and power consumption costs

Lately I've been thinking more and more about storage. Specifically, at one point I used a Promise 8-disk IDE-to-SCSI hardware RAID enclosure, attached to a Sun system and formatted with UFS.

Hardware RAID 5 eliminated the problem of losing data to a disk dying a fiery death. I bought 10 Maxtor 120 gig drives at the time, and dropped two on the shelf as spares. Over the course of about two and a half years I used both spares to replace drives inside the enclosure: once for a bad block, and once for a drive that had trouble spinning up. Solaris 8 supported only one filesystem snapshot at a time, which was better than no snapshots at all, but not great. I had a script in cron that ran once a week, snapshotting whatever was there and recycling the snapshot the following week. Not optimal, but it saved me some stress a couple of times when it was late, I was tired, and I put a space between the wildcard and the pattern in an rm command.
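
Solaris 8's one-snapshot-at-a-time support is the UFS fssnap mechanism, so the weekly recycle job boils down to something like this (the filesystem and backing-store paths are made up):

# drop last week's snapshot of /export, then take a fresh one
fssnap -d /export
fssnap -F ufs -o bs=/spare/fssnap-backing /export    # prints the snapshot device, e.g. /dev/fssnap/0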

For the last little while I've been trying to migrate to Mac OS X. Part of the reason was the cost of operation. I like big iron, but paying for an E4000 and an external storage array operating 24/7 gets costly when one is a student, as opposed to a productive member of the workforce. I figured it would be cheaper to leave an old G3 iBook running 24/7 – after all, the iBook itself only “eats” 65 watts, right? I turned off most of the other hardware – the Cisco 3640 got replaced by a Linksys WRT54GS running OpenWrt, three other Sun systems got powered down, etc. At that point only the iBook and an Ultra 2 were running 24/7.

This is when I hit the storage crunch: I was rapidly running out of disk space again, and I still needed occasional access to the data on the old Promise storage array.

The easy solution was to buy more external disk drives, put them in MacAlly USB2/FireWire external enclosures, and daisy-chain them off the iBook. Somehow the iBook ended up with over a TB of disk space daisy-chained off it.

fiona:~ stany$ df -h
Filesystem                Size   Used  Avail Capacity  Mounted on
/dev/disk0s10              56G    55G   490M    99%    /
devfs                     102K   102K     0B   100%    /dev
fdesc                     1.0K   1.0K     0B   100%    /dev
<volfs>                   512K   512K     0B   100%    /.vol
automount -nsl [330]        0B     0B     0B   100%    /Network
automount -fstab [356]      0B     0B     0B   100%    /automount/Servers
automount -static [356]     0B     0B     0B   100%    /automount/static
/dev/disk4s2              183G   180G  -6.3G   104%    /Volumes/Foo
/dev/disk2s1              183G   182G  -2.3G   101%    /Volumes/Bar
/dev/disk1s1              183G   183G  -1.0G   101%    /Volumes/Baz
/dev/disk3s1              183G   174G -260.8M   100%    /Volumes/Quux
/dev/disk5s1              183G   183G  -1.2G   101%    /Volumes/Shmoo
fiona:~ stany$ 

In the process I discovered how badly HFS+ sucks at a bunch of things – it will happily create filenames with UTF-8 characters, yet it would not accept things like accent grave or accent aigu in the names of otherwise normal files coming from elsewhere. Migrating files with such filenames from UFS under Solaris turned out to be anything but simple – direct copying over NFS or SMB was failing, and untarring archives with such files was producing errors.

Eventually I resorted to the sick workaround of ext2fsx, and formatted a couple of external 200 gig drives as ext2. Ext2 under Mac OS blows chunks too – for starters it was not available for 10.4 for ages, which is why Fiona is still running 10.3.9 (yes, I know that a very preliminary read-only version of ext2fsx for 10.4 is now available; no, I don't want to beta-test it and lose my data). ext2fsx does not support ext3, so there is no journalling. If I accidentally pull on the FireWire cable and unplug the daisy chain of FW drives, I have to fsck them all.
fscking ext2 under Mac OS is a dubious proposition at best, and most of the time fsck_ext2 will not produce an auto-mountable filesystem again. The solution was to keep a CD with Rock Linux PPC in the drive, and boot into Linux to fsck.

I cursed and set all the external drives to automount read-only, re-mounting them read-write manually when I need to. Pain in the backside.

Lately I've been eyeing Solaris ZFS with some interest. The big stopping point for me was migration of a volume to a different system (be that the same OS and architecture, or a different OS and architecture altogether). It turns out that migrating between Solaris systems is as simple as zpool export poolname, move the disks to the other system, zpool import poolname – which is a big win. Recently there have been rumors that the Linux folks and the Apple folks are porting, or investigating porting, ZFS to Linux and Mac OS X (10.5?), which gives hope of being able to migrate to a different platform if need be.
All of that makes ZFS (and by extension Solaris 10) a big contender.
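
For the record, the dance is roughly this (the pool name "tank" is made up):

zpool export tank     # on the old host: unmount and cleanly detach the pool
# ...physically move the disks to the new host...
zpool import tank     # on the new host: find the pool on the attached disks and mount it
zpool status tank     # sanity check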

It doesn't help that each of the external drive power supplies is rated at 0.7 amps – five of those bricks add up to 3.5 amps. One watt is one ampere of current flowing at one volt, so the 65 watts the Apple iBook power adapter is rated for works out to only about 0.6 amps in theory; the adapter also generates heat, so call it another 0.7 amps or more. Oh, and there is the old Ultra 2, which, according to Sun, consumes another 350 W and generates 683 BTU/hr. Assuming Sun means that it actually consumes 350 W, and not just that the power supply is rated for 350 W, that's another 3.2 amps of load.

This adds up to a continuous load of roughly 7.5 amps, 24/7.

This is where I get really confused while reading Ottawa hydro bills.

Looking at the Ottawa Hydro rates page, I read:

Residential Customer Rates

Electricity*
• Consumption up to 600 kWh per month: $0.0580/kWh
• Consumption above the 600 kWh per month threshold: $0.0670/kWh

Delivery
• Transmission: $0.0095/kWh
• Hydro Ottawa Delivery: $0.0195/kWh
• Hydro Ottawa Fixed Charge: $7.88 per month

Regulatory: $0.0062/kWh**
Debt Retirement: $0.00694/kWh***

Thus, some basic math shows that:
7.5 amps × 110 volts = 825 watts of continuous draw

The 600 kWh per month that Ottawa Hydro is oh so generously offering me works out to 600,000 Wh / 31 days / 24 hours ≈ 806 watts of average draw

In other words, I am using up the “cheap” allowance by just keeping two computers and 5 hard drives running.

825 watts × 24 hours × 31 days = 613.8 kWh per month

Reading all the Ottawa Hydro debt retirement (read: mismanagement) bullshit, I get the numbers of
6.7 cents + 0.694 cents + 0.62 cents = 8.014 cents/kWh.

613.8 kWh × 8.014 cents/kWh = 4919 cents = 49.19 CAD/month
Now, even if I were paying 5.8 as opposed to 6.7 cents/kWh, it would still be 613.8 kWh × 7.114 cents/kWh = 43.66 CAD/month.
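
For anyone who wants to play with the numbers, the whole estimate fits in a one-liner (figures as above; the per-kWh delivery charges are still left out):

awk 'BEGIN {
    amps = 7.5; volts = 110; hours = 24 * 31   # continuous load over a 31-day month
    rate = 0.08014                             # $/kWh: 6.7 + 0.694 + 0.62 cents
    kwh  = amps * volts * hours / 1000
    printf "%.1f kWh/month -> $%.2f/month\n", kwh, kwh * rate
}'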

Not a perty number, right?

So I am asking myself a question now…. What should I do?

I have two large sources of energy consumption – the external drives (I didn't realize how much power they draw) and the Ultra 2. The iBook on its own consumes minimal power, and thus costs at most about $10/month to operate.

Option number one – turn off everything, save 50 bucks a month.

Option number two – leave everything running as is, and swallow the “costs of doing business”.

Option number three – Turn off the Ultra 2, for average savings of $22/month, and lose my e-mail archives (or migrate pine + e-mail to the iBook). Continue living with the frustrations of HFS+.

Option number four – Migrate mail from the Ultra 2 to the iBook. Turn the Ultra 2 off. Migrate all of the drives into the Promise enclosure (how much power it consumes I honestly don't know, and won't until I borrow a power meter from somewhere – Promise doesn't list any information, and there is none on the back of the thing), and hook it up to the iBook over a RATOC SCSI-to-FireWire dongle. This would give me somewhere between 1.5 and 2 TB of storage, HFS+ or ext2 based. If I decide to install Linux or FreeBSD on the iBook, well, the more the merrier.

Option number five – Migrate all of the drives into the Promise enclosure, hook it up to the Ultra 2, and turn off (or don't – on its own it's fairly cheap to operate) the remaining iBook. Power consumption would remain reasonably stable (I hope. I still have no idea how much power the Promise thing consumes. It might be rated for 6.5 amps on its own). I could install the latest OpenSolaris on the Ultra 2 and format the array using ZFS. No cost savings, and lots of work shuffling data around, but also tons of fringe benefits, such as getting back up to date on the latest Solaris tricks.

I've just looked at the specs for all the Sun system models that I own (Ultra 2, Ultra 10, Ultra 60 and E4K), and it seems the U2 consumes the least power of the bunch. The Ultra 10 is rated for the same, but generates twice as much heat. Adapting the Ultra 10 for SCSI operation is not that hard, but it would force me to scrounge around for bits and pieces, and dual 300 MHz UltraSPARC IIs are arguably better than a single 440 MHz UltraSPARC IIi.

I guess there is also an option number six – replace the Ultra 2 with some sort of low-power semi-embedded x86 system with a PCI slot for a SCSI controller, and hook the Promise array up to it. Install OpenSolaris, format ZFS, migrate the data over. Same benefits as option 5, with additional hardware costs and having to live with an annoying computer architecture.

I guess I will have to decide soon.

Update: the Promise UltraTrak100 TX8 is rated for 8 amps at 110 volts (4 amps at 220 volts).

iLife 06 and G3 processor and rant about Pacifist.app

After cobbling together an iBook (long story, but the iBook in question is a 600 MHz G3 with a 100 MHz bus (as opposed to 600 MHz with a 66 MHz bus, which would make it much closer to molasses), a 40 gig HD and a combo drive), and throwing a clean install of 10.4.5 onto it today, I proceeded to turn it into a master disk image.

Every once in a blue moon I create an up-to-date install of the OS with all the apps, the system configured how I like it and accounts set up as I like them, and then use asr to back it all up onto an external hard drive. Then, in the event I need to quickly roll out a system or recover from disaster, I just need to asr the image back.

Two words about asr. Personally, I love asr. It can act as a poor man's backup tool to create an identical bootable disk on a different drive (especially useful if you have some sort of bootable CD/DVD to boot from, as asr can then use a fast block copy to move data from disk to disk). Coincidentally, the vast majority of Macs (let's not talk about the x86 ones. I am not yet sure I like them) support FireWire target disk mode. asr is also useful for creating and restoring from disk images.
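
For reference, the whole dance looks roughly like this (volume and image names are made up, and asr's option syntax has shifted between OS releases, so check man asr on your system):

# build a master image from the freshly configured volume
hdiutil create -srcfolder "/Volumes/MasterG3" MasterG3.dmg
# checksum the image so asr can restore from it with a fast block copy
asr -imagescan MasterG3.dmg
# restore the image onto a scratch disk, erasing whatever is on it
sudo asr -source MasterG3.dmg -target "/Volumes/ScratchDisk" -erase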

Sadly, for things like the recovery disk I tend to use junky drives, as it's not really a priority, just a convenience, and coincidentally there is no funding for it. The disk with my last image died, so I decided to take advantage of the opportunity, as I was setting up a new system from scratch with no baggage of software archaeology.

When I put the iLife 06 DVD into the iBook and attempted to install it, I was told that iLife 06 only works with G4 and later processors.

So, not to be deterred, I used Pacifist (see the rant about Pacifist at the bottom) to extract the iMovie package into a folder, to see what it is that Apple is trying to pull on me.

I’ve talked about Fat files and lipo earlier, in case you feel like a review.

So a quick check with lipo confirmed what Apple is saying… they compiled the binary for G4 and x86 processors only, obsoleting the G3s. *sigh* First the cut-off was the presence of FireWire, then with iWork 05 (which was like 650 megs, yet shipped on DVD) it was the presence of a DVD drive, but now it's G4 and up. I've got to give credit to the marketing/built-in obsolescence people at Apple – they are good!

stany@Ghostwheel:~/Desktop/Root/Applications/iMovie HD.app/Contents/MacOS[03:46 AM]$ lipo -detailed_info "iMovie HD"
Fat header in: iMovie HD
fat_magic 0xcafebabe
nfat_arch 2
architecture i386
    cputype CPU_TYPE_I386
    cpusubtype CPU_SUBTYPE_I386_ALL
    offset 4096
    size 3217924
    align 2^12 (4096)
architecture ppc7400
    cputype CPU_TYPE_POWERPC
    cpusubtype CPU_SUBTYPE_POWERPC_7400
    offset 3223552
    size 3327624
    align 2^12 (4096)
stany@Ghostwheel:~/Desktop/Root/Applications/iMovie HD.app/Contents/MacOS[03:46 AM]$ 

7400 is, of course, G4.

Attempts to run it generate dynamic linker errors:

stany@Ghostwheel:~/Desktop/Root/Applications/iMovie HD.app/Contents/MacOS[03:46 AM]$ ./iMovie\ HD
dyld: incompatible cpu-subtype
Trace/BPT trap
stany@Ghostwheel:~/Desktop/Root/Applications/iMovie HD.app/Contents/MacOS[03:54 AM]$

Now a quick rant about Pacifist.

Dear Charles Srstka.

I like Pacifist. I have not registered it using a pirated serial, and I sit through the 15-second time-out each time I start it. One of these days I'll even send you some money to support your effort (which seems to have stalled since 2004). But can you give me an answer to one question: why the heck does Pacifist ask for an administrator password every time one attempts to extract a file out of a package? Shouldn't it only do that if one doesn't have write permission to the folder one is extracting files into? If I have read/write rights to the files in the package and to the Desktop onto which I want to extract the package's contents, why does Pacifist want my password? Isn't that training users into the Pavlovian response of typing in the admin password every time there is a prompt on screen, regardless of the need?
Please think of the users, especially in view of the recent series of Mac OS worms that also ask for admin passwords.

Damn it, if you fix it to actually check whether it needs an admin password (and tell the user why), and e-mail me about it, I'll buy a license for Pacifist.

DYLD_LIBRARY_PATH

Does anyone have any clue why the vast majority of dynamic linkers out there (Solaris, Linux, BSD, etc.) use the LD_LIBRARY_PATH variable to specify where to load dynamic libraries from, yet Darwin/Mac OS X uses DYLD_LIBRARY_PATH?
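
Same idea, different spelling (the library path and program name are made up):

# Solaris / Linux / BSD
LD_LIBRARY_PATH=/opt/mylibs/lib ./someprog
# Darwin / Mac OS X
DYLD_LIBRARY_PATH=/opt/mylibs/lib ./someprog
# dyld also honours DYLD_FALLBACK_LIBRARY_PATH, which is consulted only after the
# paths compiled into the binary, and so tends to break fewer things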

*grumble*

Compiling Aladdin Ghostscript 8.51 from source. It's not hard, just quirky. Oh, and jpgsrc-6 and zlib-1.2.2 both need a config.sub from a recent package for configure to recognize Darwin/Mac OS X.
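
The workaround is the usual one – drop a newer config.sub (and config.guess, while you're at it) from any reasonably recent autotools-based package into the source tree before running configure; the source paths below are just placeholders:

# inside the package's source directory (jpgsrc-6, zlib-1.2.2, ...)
cp /path/to/a/newer/package/config.sub   .
cp /path/to/a/newer/package/config.guess .
./configure --prefix=/opt/packagename && make && make install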

MacOSX: Trimming fat from Mach-O fat files

MacOS X uses fat files.

A fat binary contains code for multiple architectures. Here are some examples:

root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:03 AM]# file *
Terminal: Mach-O fat file with 2 architectures
Terminal (for architecture i386):       Mach-O executable i386
Terminal (for architecture ppc):        Mach-O executable ppc
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:03 AM]# 

or even:

stany@gilva:/System/Library/Frameworks/Accelerate.framework/Versions/A[02:07 AM]$ file Accelerate 
Accelerate: Mach-O fat file with 3 architectures
Accelerate (for architecture i386):     Mach-O dynamically linked shared library i386
Accelerate (for architecture ppc):      Mach-O dynamically linked shared library ppc
Accelerate (for architecture ppc64):    Mach-O 64-bit dynamically linked shared library ppc64
stany@gilva:/System/Library/Frameworks/Accelerate.framework/Versions/A[02:07 AM]$ 

In order to operate on fat binaries, Apple provides a utility called lipo.

If you are limited to a single processor architecture – for example, if you never expect to use the internal hard drive of your iBook in FireWire target disk mode to boot an i386 system – it is possible to remove the extra “fat” from fat binaries to save some disk space.

Here is an example:

root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:09 AM]# mv Terminal Terminal.bak 
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:10 AM]# ls
Terminal.bak
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:10 AM]# lipo Terminal.bak -remove i386 -output Terminal
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:10 AM]# ls -la 
total 2296
drwxrwxr-x   4 root  admin     136 Sep  6 02:10 .
drwxrwxr-x   7 root  admin     238 Aug 31 00:07 ..
-rwxr-xr-x   1 root  admin  386472 Sep  6 02:10 Terminal
-rwxrwxr-x   1 root  admin  783784 May 14 22:22 Terminal.bak
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:10 AM]# file *
Terminal:     Mach-O fat file with 1 architecture
Terminal (for architecture ppc):        Mach-O executable ppc
Terminal.bak: Mach-O fat file with 2 architectures
Terminal.bak (for architecture i386):   Mach-O executable i386
Terminal.bak (for architecture ppc):    Mach-O executable ppc
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:10 AM]#

A quick test confirms that Terminal.app continues to run as before; however, to make sure everything is kosher, I would probably want to correct the permissions on the new binary to match the original.
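
Going by the ls -la output above, that amounts to putting the group-write bit back:

chmod 775 Terminal    # the original Terminal.bak is -rwxrwxr-x root:admin; lipo's output came out 755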

The disk space saving will not be big, as an average .app consists of many other objects besides the executable itself, so this is probably not worth doing on a large scale. If one tries to remove a non-existent architecture from a binary, lipo will complain:

root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:18 AM]# lipo Terminal.bak -remove ppc64 -output Terminal
lipo: -remove ppc64 specified but fat file: Terminal.bak does not contain that architecture
root@gilva:/Applications/Utilities/Terminal.app/Contents/MacOS[02:19 AM]#

Another interesting option to lipo is -detailed_info:

stany@gilva:~[02:22 AM]$ lipo -detailed_info /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate 
Fat header in: /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate
fat_magic 0xcafebabe
nfat_arch 3
architecture i386
    cputype CPU_TYPE_I386
    cpusubtype CPU_SUBTYPE_I386_ALL
    offset 4096
    size 8488
    align 2^12 (4096)
architecture ppc
    cputype CPU_TYPE_POWERPC
    cpusubtype CPU_SUBTYPE_POWERPC_ALL
    offset 16384
    size 8564
    align 2^12 (4096)
architecture ppc64
    cputype CPU_TYPE_POWERPC64
    cpusubtype CPU_SUBTYPE_POWERPC_ALL
    offset 28672
    size 8488
    align 2^12 (4096)
stany@gilva:~[02:22 AM]$ 

Writing a script that processes the output of:

 find . -type f  -perm -55 -exec file {} \; | grep i386 | sed 's/ (for arch.*$//g'

while stripping out the arch, AND not screwing up the system, is left as an exercise for the reader. 😛
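
That said, here is a rough sketch – and it really is only a sketch: it keeps the fat originals around, chokes on filenames containing newlines, and pointing it at /System without a full backup is asking for trouble:

#!/bin/sh
# Strip the i386 slice from fat executables under the directory given as $1.
DIR="${1:-.}"
find "$DIR" -type f -perm -55 -print | while IFS= read -r f; do
    file "$f" | grep -q 'Mach-O fat' || continue
    lipo -info "$f" 2>/dev/null | grep -qw i386 || continue
    cp -p "$f" "$f.fat" || continue                  # keep the fat original
    if lipo "$f.fat" -remove i386 -output "$f.thin"; then
        cat "$f.thin" > "$f" && rm "$f.thin"         # rewrite in place, preserving ownership and permissions
    fi
done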

Rendering a manpage

This is more of a general unix hint that is not really MacOS X specific.

If you have a manpage that you want to look at that is not in $MANPATH (something that got installed by hand into a custom directory, for example something that was built and installed using
./configure --prefix=/opt/packagename && make install), yet you know where it is (for example because you ran /usr/libexec/locate.updatedb as root at least once since, and can now use locate), you can use nroff to render the man page into text:

stany@gilva:~[12:06 AM]$ ls -la /opt/gnu/man/man6/figlet.6 
-r--r--r--   1 root  501  21054 Sep  3 17:41 /opt/gnu/man/man6/figlet.6
stany@gilva:~[12:06 AM]$ nroff -man /opt/gnu/man/man6/figlet.6 | head -20
FIGLET(6)                                                            FIGLET(6)



NAME
       FIGlet - display large characters made up of ordinary screen characters


SYNOPSIS
       figlet [ -cklnoprstvxDELNRSWX ] [ -d fontdirectory ]
              [ -f fontfile ] [ -m layoutmode ]
              [ -w outputwidth ] [ -C controlfile ]
              [ -I infocode ] [ message ]


DESCRIPTION
       FIGlet prints its input using  large  characters  (called  ``FIGcharac-
       ters'')made  up  of  ordinary  screen  characters (called ``sub-charac-
       ters'').  FIGlet output is generally reminiscent of the sort of  ``sig-
       natures''  many people like to put at the end of e-mail and UseNet mes-
stany@gilva:~[12:06 AM]$ 
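
Alternatively, most man implementations (including the ones on Mac OS X and Solaris) take a -M flag that points them at an extra manpath for a single invocation:

man -M /opt/gnu/man figlet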

QuickTime (Part 3)

Background

I have an iBook G4 with a 32 meg Radeon 9200 mobile video card, which is below the minimum requirements for CoreImage. Technically CoreImage is supposed to be scalable: if it can't do a particular effect on the video card's GPU, it should try doing it on the AltiVec unit of the processor, and, in the event the system lacks an AltiVec unit, it should fall back to the plain CPU.

In reality, the lack of CoreImage support doesn't cramp one's style all that much. I miss some graphical features of the GUI, which is cosmetics. Occasionally, however, it interferes with productivity, and that pisses me off.

Consider the following example:

QuickTime 7 Pro and video adjustments

QuickTime has been coming in “free” and “pro” varieties for a long while. The features that the 29.95 USD Pro version has are numerous, but amongst the most notorious are:

  • ability to save some of the streamed media to hard drive
  • ability to export files to different formats
  • ability to do some rudimentary merging of video tracks using cut and paste
  • ability to adjust brightness, tint, contrast and colors of the video
  • ability to correct audio balance, etc.

All of the above features work reasonably well under QuickTime Pro 6.5.2, although the color corrections are rather clunky and are represented as sliders on screen. However, you can see the adjustments as the movie plays. Here is what it looks like (220K).

Tiger came with QuickTime 7, and once I entered the QT7 Pro license key, one of the things that didn't work on my iBook was color correction. Apple-K presented me with options to modify the audio settings, but not the video settings.

Technically you can get video adjustments to work by going through Export -> Options -> Video Filter and doing a bunch of adjustments there, but there is no fun in waiting a few minutes to see whether your guesswork was correct.

This is Broken[TM].

So I did some digging. Inside QuickTime Player.app there are two files: AVControls.nib and AVControlsMinimal.nib. One gets used when the system detects a CoreImage-capable video card, and the other one when it doesn't.

My hypothesis was that if I swapped the two around, I'd get access to the video controls:

First I copied QuickTime Player to a different directory, and then dropped to the command line:

stany@gilva:~[05:11 PM]$ cd "/Applications/extras/QuickTime Player.app/Contents/Resources/English.lproj/"
stany@gilva:/Applications/extras/QuickTime Player.app/Contents/Resources/English.lproj[05:11 PM]$ ls -dal AV*
drwxrwxr-x   5 root  admin  170 Jun  5 08:09 AVControls.nib
drwxrwxr-x   5 root  admin  170 Jun  5 08:09 AVControlsMinimal.nib
stany@gilva:/Applications/extras/QuickTime Player.app/Contents/Resources/English.lproj[05:11 PM]$ 
stany@gilva:/Applications/extras/QuickTime Player.app/Contents/Resources/English.lproj[05:14 PM]$ sudo /bin/bash
Password:
root@gilva:/Applications/QuickTime Player.app/Contents/Resources/English.lproj[05:14 PM]# mv AVControls.nib AVControls.nib_ && \
mv AVControlsMinimal.nib AVControls.nib && mv AVControls.nib_ AVControlsMinimal.nib
root@gilva:/Applications/QuickTime Player.app/Contents/Resources/English.lproj[05:14 PM]#

After the adjustment, on a non-CoreImage-enabled system the Apple-K menu looked like this.

Sadly, under Tiger the sliders for video correction still do not work, as they are dependent on CoreImage. However, I wonder if they work under Panther (10.3). If they do, then this is likely a solution that will work for folks who haven't upgraded yet. You see, there might be a reason to be a straggler. Comments, please.

Lastly, I wanted to give my modified version of QuickTime Player.app a different version string, so that I could see it when I ctrl-click on a movie and select “Open with”. To do that, I ctrl-clicked on QuickTime Player.app and selected “Show package contents”. Inside the Contents folder, I opened version.plist and Info.plist in Property List Editor. In Info.plist's Root, I changed CFBundleGetInfoString so that I know it was changed by me when I Get Info on the application, set CFBundleShortVersionString and CFBundleVersion both to 7.0.1-stany, and saved Info.plist. In version.plist I modified CFBundleShortVersionString and CFBundleVersion to match the changes I made in Info.plist, leaving the rest of the properties the same.
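
The same edits can be made from the command line with defaults, if Property List Editor isn't handy (the path below assumes the copy living in /Applications/extras; note that defaults wants the plist path without the .plist extension):

cd "/Applications/extras/QuickTime Player.app/Contents"
defaults write "$(pwd)/Info" CFBundleShortVersionString '7.0.1-stany'
defaults write "$(pwd)/Info" CFBundleVersion '7.0.1-stany'
defaults read  "$(pwd)/Info" CFBundleVersion    # sanity check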

Now, if I ctrl-click on a file QT recognizes and scroll to “Open with”, it looks like this.

The last paragraph describes the usual steps needed to change the version of an application as recognized by the operating system. I should probably do something like this to all of those pesky Real Players that I've been dealing with.

In closing, inside Info.plist it's also possible to adjust the filename extensions and the icons that QuickTime is supposed to be able to handle. So you could rename your .mp3 files to, say, .jd, and associate just QT with those files (barring the presence of a resource fork, etc., of course).

QuickTime (part 2)

Another braindump.

Somehow I ended up at the PBS Nova Science Now page. It had lots of shiny TV goodness that I wanted to watch. Of course, there was also a warning on the page saying “This program is not available for downloading due to rights reasons.” Rights. Right.

The first restriction was trivially bypassed. The netblock I am using is registered in Eugene, Oregon (which has the funny side effect that some web sites insist on hooking me up with “hot girls in Eugene”), which is actually correct, as I lease this /24 from its American owner.

So as far as PBS was concerned, I was a tax-paying merkin, and thus could be permitted to watch their programming (produced with taxpayer money). I can't verify it right now, but I believe they outright don't permit folks connecting from outside the US to view the videos. *sigh* By the way, the BBC does the same thing with some of their on-line content.

So the QuickTime video was happily streaming off their web page. Due to the peculiarities of my network setup, that led me to believe that they use plain HTTP for content delivery. I viewed source and grabbed http://www.pbs.org/wgbh/nova/sciencenow/video/nsn-wrap-new.mov (feedback please – does it play in your browser when you click this link?), which, when played in QT, quickly sent me to http://www.pbs.org/wgbh/nova/sciencenow/video/rights_restrictions.gif. Right. So it plays from inside the browser, streaming, but not from the HD. Joy, PBS.

Eventually I gave up and sniffed the traffic ( tcpdump -i en1 -s 0 -w cookie ; strings cookie ). I noticed the following interesting file: http://www.pbs.org/wgbh/nova/sciencenow/video/3204-new.xml (sorry, not a hyperlink, as I want you to copy/paste it, so that this post doesn't end up in the Referer field – it might raise questions, as it's not meant to be accessed by a browser).

It refers to a bunch of .mov files that the program consists of, plus the “captions” for each part of the main movie.

Groovy.

wget --user-agent="QTS (qtver=7.0.1;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/wgbh/nova/sciencenow/video/3204-new.xml

wget --user-agent="QTS (qtver=7.0.1;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/wgbh/nova/sciencenow/video/3204-00-ref.mov 

This one is another container file. I had to save it and run strings on it to figure out the main file name. It is available in two qualities: 3204-00-300.mov and 3204-00-56.mov.
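
Something along these lines does the trick:

strings 3204-00-ref.mov | grep '\.mov'    # list the movie names the reference container points at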

so

wget --user-agent="QTS (qtver=7.0.1;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/video/3204-00-300.mov

worked.

root@gilva:~/pbs[04:27 AM]# grep vidURL 3204-new.xml | sed 's/ref.mov/300.mov/g ; s/^.*http/wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http/g ; s/<.*$//g ; s/\/video//g ; s/wgbh/media\/wgbh/g'
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-00-300.mov
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-01-300.mov
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-02-300.mov
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-03-300.mov
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-04-300.mov
wget --user-agent="QTS (qtver=6.5.2;cpu=PPC;os=Mac 10.4.1)" http://www.pbs.org/media/wgbh/nova/sciencenow/3204-05-300.mov
root@gilva:~/pbs[04:27 AM]# 

Now, these don't play in stand-alone QT and refer you back to the "Rights" image. However VLC will happily play them.

*sigh*

I am going to bed now.

QuickTime (Part I)

Note: This is just a quick braindump, so it's probably inconclusive and makes no sense.

Situation

A few days ago the LIVE8 concerts were held in major cities around the world. Most interesting (to me; YMMV, of course) was the reunion of Pink Floyd after over 10 years of not being around, with Roger Waters on stage with the rest of the classic lineup for the first time in 24 years. Wow.

AOL has the license for internet distribution of the videos, and has a reasonably nice site from which the clips can be streamed using QuickTime.

The clips are really good quality – the quality of the video was not sacrificed in favor of bandwidth. Thank you, AOL, you rock.

If one clicks on the little tab by the song name, a window pops up in which the clip plays. One can view source, search for “mov”, and eventually find the http URL to the actual file. So I grabbed the 4 Pink Floyd songs.

Problem

When I proceeded to play them in QuickTime, they played great. But every silver lining has a cloud – I wanted to build a playlist, where the songs would be played in sequence.

iTunes kind of helped – I am not a big iTunes user, but I imported the .mov files into it, made a playlist, arranged them in sequence, and it kind of worked. There were two snags, however – there were ~2 second gaps between songs, and it was audio only. Grumble. I wanted something that could just play them all.

I could have used VLC.app, I guess. I just verified that it plays these tracks, and it has the concept of a playlist down pat. But instead I fired up QT Pro 6.5.2, selected the whole video, and wanted to paste it together with the next song, and so on, to merge the 4 songs into one 20-minute-long video.

Of course nothing happened. QT had the copy and paste controls grayed out.

So I attempted to export it. It popped up a window telling me Couldn’t export “‘Breathe’ (LIVE 8)” because this movie doesn’t allow saving. Aaaarrrgggh!

Aimless wandering in the dark, searching for a solution

So after about half an hour of googling I learned that many others have run into this problem. It seems this “feature” of QuickTime got noticed when certain movie trailers (ST: Nemesis is one, apparently) were exported to QT with the “do not allow modification” bit set. This had the added benefit of forbidding QT Pro from saving the file to the HD, and irked some folks to no end.

The Hacker's Guide to QuickTime (which actually has lots of rather useless pointers, such as “open the web page with the QT component in the browser, and then find the cached file in the browser's cache to save the file to HD” – which doesn't work, as most of the time now the browser just loads a small file (example) that in turn loads the rest of the content, if it feels like it, or folks actually deploy QuickTime Streaming Server, and browsers generally don't implement the RTSP protocol) mentions that:

Video editing programs like Cleaner allow authors to save movies in such a way that further changes to the movie are disallowed. When the author saves the movie, he simply enables the “disallow saving” check box. Some filmmakers chose to do this to prevent others from altering their work. Others chose this option to discourage users from making local copies of movies viewed online.

So this had a glimmer of hope: if I were to obtain the right software, I could make a small (2 – 3 second) source file, import it into the video editing package, save it once without disallowing saving and once with, hexdump both files, and diff them. My suspicion is that it's just a byte or two in the header that QuickTime dutifully honors. If I knew which ones, I could potentially just hexedit the restriction out, and solve my problem.
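
The comparison itself is trivial once the two files exist (the file names here are made up – one export saved normally, one with “disallow saving” set):

hexdump -C clip-open.mov   > open.hex
hexdump -C clip-locked.mov > locked.hex
diff open.hex locked.hex | head -20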

At this point, for some reason, I got diverted, and instead of investigating “Cleaner” went and grabbed Sorenson Squeeze 4.1. The site e-mailed me a confirmation and the above URL to the download package.

Sorenson Squeeze is a VISE X-packaged blob of data that has a 30-day free trial and will watermark generated files (until you license it). I didn't care about the watermarking: as long as it generates both protected and unprotected files identically, it's not a big deal. I know the save restriction doesn't encrypt the file, as VLC.app happily plays them back.

After playing with Sorenson for a while, I realized that a) it does a rather poor job converting other QT files to the requested format (frame dropping: I gave it an 80K/sec MPEG-4 inside a QT container file (La Tortura, from one of my earlier articles) and told it to generate a 750K/sec result. The result had 8 frame/sec output and was choppy as heck – the source was 16 frames/sec. Maybe that's another restriction of the 30-day demo) and b) I couldn't find the option to disallow saving anywhere in Squeeze's features or documentation.

At this point I gave up in disgust and uninstalled Sorenson Squeeze 4.1.

Another complaint about VISE X. Why the F*&^ does it demand that all other applications be closed during the uninstall of software? It demanded no such thing during install. I am not about to close Safari with 35 windows, nor X11 with 8 xterms. Aaargh, what a piece of crap. MS Media Player for Mac is also packaged with it, and in that case it actually demands an admin password just to install an application into /Applications. WHY?

So this is as far as I made it.

Questions

  • Is there a way to extract files from VISE installers, specifically out of Install.data, without running the installer? I always fear that it will spew files all over my system, and I'll never find them.
  • Any advice about “Cleaner”? Admittedly I am reluctant to put this here, as I've yet to google it.
  • Does anyone have any experience dealing with QT restrictions?