Safari 4: Caching Images of Every Page You Visit, Where it’s Hard For You To Find Them

Safari 4 beta leaves data, privacy trail in its wake – MAC.BLORGE.

Yesterday I decided to give Safari 4 Beta another try.  It’s supposed to be super fast and all, and I was thinking about sites like Facebook which seem to cause a ton of extra CPU usage when I leave them open… thinking maybe a more efficient JavaScript engine would make them more pleasant to have around.

I tried it, and it was super fast, but if anything it achieved that speed at the cost of even worse CPU churn, so after a while of oohing and aahing at its speed I toasted it: uninstalled it and went back to Firefox 3.

Then I read this article, and the article it links to, and sure enough, in the short part of the evening I spent using Safari 4 I had generated 170+ megs of data in a hidden, can’t-reach-without-command-line-fu location on my computer, not even in my personal user directory.  (I don’t think it’s hidden for nefarious reasons; Apple doesn’t roll like that.  I think it’s hidden because this cache is what the Apple engineers needed to achieve the effect they wanted, and they didn’t think you should have to worry your pretty little head about how they did it, or how much of your disk space they used to do it.)

And darned if that hidden cache directory didn’t include full-size image files of every site I’d visited during that time.  Files which stayed behind even after I uninstalled the beta.  How much data would there have been after a week of usage?  A month?
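If you want to check whether the beta left the same droppings on your machine, something like this should turn them up.  The /private/var/folders location is my best recollection from the linked article, so treat the path as a guess rather than gospel:

```shell
# Look for Safari cache directories lurking outside your home folder.
# (/private/var/folders is an assumption from the linked article;
# widen the search if nothing turns up.)
sudo find /private/var/folders -type d -iname '*com.apple.Safari*' 2>/dev/null

# Then total up how much disk they're eating:
sudo find /private/var/folders -type d -iname '*com.apple.Safari*' \
  -exec du -sh {} + 2>/dev/null
```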

Lame, Apple.  Lame.

Finder Grinder: The CPU Syndrome

I’ve had a problem for a while on my MacBook (running Leopard) where the Finder would just suck up CPU like crazy.  Restarting it often didn’t change anything.  I never found any useful information googling it, until recently.

In this thread I hit on the idea of sampling the Finder process, and when I did, I found that the biggest sucker of CPU was a function called “getdirentriesattr.”  Googling *that* up, I found reports that this function is used to iterate through really big directories while calculating folder sizes, and that in fact:

It looks like it is iterating some directory to get its total size. This happens on my machine when I either File Menu->Get Info on a very large folder, or view the folder in List View with Calculate Folder Sizes enabled. It’s trying to get the total size of that directory. Apparently there is a bug where the Finder doesn’t stop trying to get the size after you have stopped viewing that folder, or even turned off Calculate Folder Sizes in List View (until you quit/relaunch the Finder).
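For the record, the sampling itself is just Apple’s built-in `sample` tool; a minimal sketch of what I did (process name, duration, and the output path are whatever you like):

```shell
# Sample the Finder process for 10 seconds and write the call-graph
# report to a file.  sample ships with OS X, no install needed.
sample Finder 10 -file /tmp/finder-sample.txt

# Then count how often the offending function shows up in the report:
grep -c getdirentriesattr /tmp/finder-sample.txt
```

If that grep count is huge, you’re probably looking at the same bug.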

I opened up a Finder window, pulled down View – Show View Options, and deselected “calculate all sizes.”  I made that the default, and then restarted Finder.
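For anyone who hasn’t had to do it before, restarting the Finder from Terminal is just:

```shell
# The Finder relaunches itself automatically after being killed.
killall Finder
```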

Suddenly my Finder’s at 0% CPU instead of a steady 20-80%.  And my fan isn’t on all the time.

SWEET.  Thank you Apple for having the tools that made it possible to dig down into the process and discover this, but nuts to you for that bug existing in the first place!

MacPorts: Everything Else Sucks More

Random Bits and Pieces: MacPorts are Fatally Flawed gives a good list of reasons why MacPorts sucks.  I wanted to upgrade the Gimp to the latest and greatest, and I put the wrong flag into the upgrade command, and it’s taken all damn day upgrading nearly every port I had, ports I had forgotten I ever installed, and so far everything except the Gimp itself.  Compiling, compiling, compiling.

And I didn’t *know* I’d screwed up because sometimes even when you do everything *right* it goes nuts like that.
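I honestly don’t remember exactly what I fumbled, but for anyone following along, the difference between upgrading one port and upgrading the world looks roughly like this (assuming current-ish MacPorts syntax):

```shell
# What I meant to do: upgrade just the Gimp (and whatever it depends on).
sudo port upgrade gimp

# What a stray word can get you: rebuild every outdated port you have
# installed -- from source -- which can easily take all day.
sudo port upgrade outdated
```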

The big MacPorts thing is: you compile everything yourself.  This gives you great flexibility if you want some unusual variant of a program.  But the 95% of the time that you don’t, when you want a standard version of the program just like everybody else’s… tough.  You’re compiling it yourself no matter how long it takes.
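That flexibility comes through what MacPorts calls variants; a quick sketch (the variant name below is purely illustrative, since what’s offered differs per port):

```shell
# See which build-time variants a port actually offers...
port variants gimp

# ...and install with one enabled.  "+debug" here is just an example;
# use whatever 'port variants' really lists for your port.
sudo port install gimp +debug
```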

It’d be nice if you could just download binary packages from a central repository, like you can with most Linux package management systems, and not have to compile them all yourself.  It used to be possible to do that with Fink, an alternative to MacPorts, but in doing so you only got the “stable” (read: ancient and crusty) versions of the packages.  For the “unstable” (read: created this millennium) versions, you had to compile them yourself, and if you wanted *any* unstable packages, you pretty much had to use all unstable packages… and Fink’s set of packages is smaller and less up-to-date than MacPorts’s, so you were screwed.

Now, when everything goes right with MacPorts, it goes *really* right.  You can install a very up-to-date version of the Gimp, with a proper Mac application bundle, with some of the coolest new plugins (liquid rescale, RAW support via ufraw) and with the ability to access a wide variety of scanners via xsane, all built in.  That’s exquisitely cool.

But when things go wrong, there’s not much you can do about it, and the hassle you have to go through compiling even when things are going right is a complete pain.  This ain’t Gentoo, kids.  It’s a staid operating system for people who don’t want to invest the time and pain that a true open source operating system demands.  Does it have to be this way?

Guess it does.