I had a bit of trouble getting some extra SATA hard disks plugged into my server, so I thought I'd write down my experiences to share with the internet (re)public.
The hardware is an AMD 3000+ 64-bit chip with an nVidia-based motherboard, which is already running two standalone (ie: non-RAID) SATA disks on its built-in sata_nv controller.
My first attempt was plugging in a cheap SI3112-chipset based SATA card, which was easy enough to acquire but proved a huge waste of time. I tried all manner of drivers (there are two in the kernel, an old one and a new one: neither worked), different 2.6 kernel versions, and in the end three separate machines (two older 1GHz machines, plus the new one).
Basically the driver would load and the drives would connect and function OK, but after copying data for a while (say 100GB or so) my logs would fill up with errors, the kernel would panic, the filesystem would be remounted read-only, and my copy would fail. Throughput was consistently unimpressive, and within that varied considerably, from 7 - 35MB/sec (hdparm -t).
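Those throughput figures came from hdparm's buffered-read benchmark, averaged over a few runs. A minimal sketch of that measurement, with the real hdparm loop shown commented (it needs root and a real device) and two illustrative output lines standing in for it:

```shell
# Average several hdparm -t runs. The real loop would be:
#   for i in 1 2 3; do hdparm -t /dev/sda; done
# (/dev/sda is an assumption -- substitute your SATA device.)
# The here-doc below fakes two runs so the pipeline is demonstrable;
# hdparm prints the speed as the second-to-last field of this line.
awk '/Timing buffered/ { sum += $(NF-1); n++ }
     END { printf "average: %.1f MB/sec\n", sum / n }' <<'EOF'
 Timing buffered disk reads:   22 MB in  3.10 seconds =   7.00 MB/sec
 Timing buffered disk reads:  105 MB in  3.00 seconds =  35.00 MB/sec
EOF
```

With the two sample lines above it prints `average: 21.0 MB/sec`; pointed at a real device it averages whatever hdparm reports.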
Despite what it says on the linux-ata page, seeing a bunch of others report similar experiences leads me to believe that the Linux driver for this chip is actually broken. It's probably low priority for the kernel wizards, as the chipset is neither popular (ie: found on motherboards) nor powerful: it isn't an open spec and doesn't support TCQ or NCQ. They may well be skipping it, and you should too.
Just wanting to get something working, I tried connecting a couple of USB 2.0 SATA hard disk enclosures. Again, crap throughput and errors stopped my (large) copies. I even had the same problem on Windows (different machine, a new HP laptop). The hardware I got my hands on was dodgy; wait until these mature before touching them, if ever.
I then bought a Silicon Image-based SATA card online through eBay for around 25 euros, and it works great! I went to my colo, plugged it in, went home, lspci picked it up, so I compiled the sata_sil24 module, and within two minutes it was working perfectly!
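For anyone repeating this, the two-minute setup amounted to roughly the following. The lspci model string is illustrative (I didn't record my exact output), and the root-only steps are shown commented:

```shell
# Check that the kernel sees the card; the model string in the output
# will be something along these lines (illustrative, not recorded):
#   02:00.0 RAID bus controller: Silicon Image, Inc. ...
lspci | grep -i 'silicon image' || true

# With CONFIG_SATA_SIL24=m built for the running kernel, load the
# driver and watch the disks attach (root required, hence commented):
#   modprobe sata_sil24
#   dmesg | tail
```

Once dmesg shows the disks attached, they appear as ordinary /dev/sd* devices and can be partitioned and formatted as usual.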
Before software RAID, read performance is consistently 15% higher than on my onboard sata_nv nVidia RAID controller (as judged by hdparm -t output). After the software RAID layer (RAID1 on two identical 250GB disks), reads are 3.5% faster. Great stuff! (It also supports TCQ + NCQ.)
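Those percentages are just ratios of hdparm -t readings on each controller. A sketch of the arithmetic with hypothetical MB/sec figures (chosen to reproduce the percentages; my actual readings weren't recorded):

```shell
# Hypothetical hdparm -t readings in MB/sec -- assumptions picked to
# illustrate the 15% and 3.5% figures, not my real measurements.
onboard=58.0        # single disk on the onboard sata_nv controller
card=66.7           # single disk on the new card
onboard_raid1=57.0  # md RAID1 read with disks on sata_nv
card_raid1=59.0     # md RAID1 read with disks on the new card

# Percentage advantage = (new - old) / old * 100
awk -v a="$card" -v b="$onboard" \
    'BEGIN { printf "raw read advantage: %.1f%%\n", (a - b) / b * 100 }'
awk -v a="$card_raid1" -v b="$onboard_raid1" \
    'BEGIN { printf "RAID1 read advantage: %.1f%%\n", (a - b) / b * 100 }'
```

Note how the RAID1 layer narrows the gap: mirrored reads are bounded by the md layer's read scheduling more than by the controller itself.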
It uses the same driver as the SI3124 chip; I'm unsure of the differences, but buying one of those is probably a safe bet too.
Hope the above info helps someone.