Same acronym as Nvidia's, but it stood for "Scan Line Interleave" instead of Nvidia's "Scalable Link Interface."
Yep. And as the name suggests, the 3dfx implementation just had each of the two GPUs render every other horizontal line of pixels.
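The scanline assignment is trivial to sketch (a hypothetical illustration only; the real 3dfx hardware did this interleaving in the scanout path, not in software):

```python
def sli_assign(height: int, num_gpus: int = 2) -> dict[int, list[int]]:
    """Map each GPU index to the scanlines it renders, 3dfx-SLI style."""
    assignment: dict[int, list[int]] = {gpu: [] for gpu in range(num_gpus)}
    for line in range(height):
        # Round-robin: line 0 -> GPU 0, line 1 -> GPU 1, line 2 -> GPU 0, ...
        assignment[line % num_gpus].append(line)
    return assignment

lines = sli_assign(height=8)
print(lines[0])  # [0, 2, 4, 6] -> even scanlines on GPU 0
print(lines[1])  # [1, 3, 5, 7] -> odd scanlines on GPU 1
```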
Nvidia has supported multiple methods over the years, including:
- Alternate Frame Rendering (AFR), where every other GPU renders every other frame in its entirety
- Split Frame Rendering (SFR), a more elaborate version of 3dfx's approach: each frame is split between the GPUs, and the driver dynamically adjusts what percentage of the frame each GPU renders based on load
- And there was also an AA version of SLI, where both GPUs render the same frame at slightly offset sample positions, and the results are then combined for antialiasing. This produced very good antialiasing at roughly single-GPU performance.
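The frame-distribution logic behind the first two modes can be sketched roughly like this (a deliberately simplified, hypothetical model; real drivers don't structure it this way):

```python
def afr_gpu_for_frame(frame: int, num_gpus: int = 2) -> int:
    # Alternate Frame Rendering: whole frames round-robin across the GPUs.
    return frame % num_gpus

def sfr_split(height: int, gpu0_share: float) -> tuple[range, range]:
    # Split Frame Rendering: one frame split horizontally between two GPUs.
    # In the real thing the driver re-tunes gpu0_share frame to frame
    # based on measured load; here it's just a parameter.
    cut = round(height * gpu0_share)
    return range(0, cut), range(cut, height)

print([afr_gpu_for_frame(f) for f in range(4)])  # [0, 1, 0, 1]
top, bottom = sfr_split(1080, gpu0_share=0.6)    # GPU 0 gets 648 of 1080 lines
```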
Splitting each frame between the GPUs (whether using 3dfx's original every-other-line approach or Nvidia's dynamic frame-splitting approach) gives the best input lag, but doesn't scale as well from a framerate perspective.
AFR scales much better, but there are serious input lag penalties, as can be seen in Tom's Hardware Guide's illustration from their 1999 review of the original dual-chip board, the ATI Rage Fury MAXX. (It used two Rage 128 Pro chips and would only run in AFR mode.)
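The latency penalty falls out of back-of-the-envelope arithmetic (a deliberately simplified model that ignores CPU submission time and display buffering):

```python
def afr_display_fps(frame_time_ms: float, num_gpus: int) -> float:
    # AFR throughput scales with GPU count: the frames complete in a
    # staggered pipeline, one every frame_time_ms / num_gpus.
    return 1000.0 * num_gpus / frame_time_ms

def afr_latency_in_display_frames(frame_time_ms: float, num_gpus: int) -> float:
    # Each individual frame still takes the full frame_time_ms to render,
    # so input-to-photon latency, measured in *displayed* frames, grows
    # with the number of GPUs.
    display_interval_ms = 1000.0 / afr_display_fps(frame_time_ms, num_gpus)
    return frame_time_ms / display_interval_ms

# One GPU rendering a frame in 33.3 ms: ~30 fps, 1 frame of latency.
# Two GPUs in AFR: ~60 fps shown, but each frame is still 33.3 ms old
# when it hits the screen, i.e. ~2 display frames of lag.
print(afr_display_fps(33.3, 2))                # ~60 fps
print(afr_latency_in_display_frames(33.3, 2))  # ~2 frames
```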
It's a tradeoff. I always preferred the split frame render modes. Despite their lesser scaling, the performance felt more fluid to me. People often get really picky about input lag caused by their monitors, but then completely ignore it when it happens in their render pipeline for some reason.
Unfortunately, most titles were designed with just one render mode in mind, and that mode, baked into the driver's game profiles, was AFR. You could force SFR in the settings if you preferred it, but the results were unpredictable and could include crashes, graphics bugs and other problems, because the drivers had been optimized for each particular game with the profiled mode in mind.
IMHO, SFR (or the original every-other-line method) was the technically better solution, but because kids are whiny, don't understand the tech, and complain when they don't get close to double the performance from two GPUs, game devs and driver programmers (both AMD and Nvidia) almost always selected AFR as the default when collaborating to make multi-GPU work for their titles.
It's a crying shame.
This is one of the big reasons why I abandoned SLI and Crossfire way back, and decided to just take a "single fastest GPU I can afford" approach. It just wasn't worth the trouble, and there were too many disappointments.