I’m deploying Frigate (camera NVR software) at work for our 100-something security cams. We only buy Dell servers, which is a shame because I would probably have gone Supermicro in this instance. Anyway. I’m doing some research and it’s really unclear to me which Dell server I should go with if I intend to install a GPU in it.
I’m thinking I’ll probably go refurbished, like one or two generations old.
Should I go with a 4U server, and if I did, would that eliminate the need for a PCIe riser card?
Do I need a datacenter-class GPU? I have read that many of the powerful consumer cards will simply be too large for the server chassis.
Right now I am testing with an R550, and there is only one available 8-pin connector on the power supply. How do I power a 12-pin connector on the GPU if that’s all that’s available?
I’ve used Frigate for a few years with up to 5 cameras, but 100 might be pushing it for a single card. I’m fond of the Google Coral M.2 chips for inference, which the software maintainer recommends. You’d need roughly 5-10 of those, I’d guess, plus one low-tier GPU if you’re not transcoding too much. I talked to the guy who made the project a few years ago when it was still small, and he helped me with FFmpeg parameters to get CUDA H.265 decoding, which also matters depending on your cameras (see the sketch below). Maybe talk to him directly through GitHub.
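To make that concrete, here’s a minimal sketch of what such a setup might look like in Frigate’s config.yml. Assumptions on my part: two M.2/PCIe Corals and an NVIDIA card; the detector names, camera name, and RTSP URL are placeholders, and newer Frigate releases also offer shorthand hwaccel presets (e.g. preset-nvidia-h264) instead of raw FFmpeg flags.

```yaml
# Sketch of a Frigate config.yml: Coral inference + CUDA decode.
# Detector names, camera name, and stream URL are hypothetical.
detectors:
  coral0:
    type: edgetpu
    device: pci:0     # first M.2/PCIe Coral
  coral1:
    type: edgetpu
    device: pci:1     # second Coral; Frigate spreads inference across all detectors

ffmpeg:
  # Raw FFmpeg flags for GPU-accelerated (NVDEC) H.264/H.265 decoding.
  hwaccel_args: -hwaccel cuda -hwaccel_output_format cuda

cameras:
  example_cam:        # hypothetical camera name; repeat per camera
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@10.0.0.10:554/main   # hypothetical stream URL
          roles:
            - detect
            - record
```

The 5-10 figure is rough worst-case arithmetic: a Coral manages on the order of 100 inferences per second, so 100 cameras detecting at ~5 fps each would need around 500 inferences/s, i.e. at least 5 Corals, though in practice Frigate only runs inference where it sees motion, so the real load is lower.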
Dell server
Check really carefully which cards (down to the exact make and model) are supported by Dell. Otherwise chances are good it simply won’t work.
Dell also makes CAD workstations, including rack-mount versions (the Precision rack line). Those are specifically built to hold one or more powerful GPUs. Maybe that’s an option.

