For many years I worked in a Mac environment where we would output for the internet in QuickTime H.264. It worked great, and I preferred its quality over other codecs.
Since then, I've moved to a Windows environment, and when I try to output the same way, I seem to lose quality even though I'm using the same settings.
I remember once doing a test where I exported from Premiere on both Mac and Windows using the QuickTime H.264 presets for 720p, and the Windows file looked much worse. I figured it was probably a data-rate issue, so naturally I checked that. Both were around 6,000 kbps, yet the Windows output still looked much worse. In fact, I couldn't get the Windows output to look close to the original media (ProRes 422) without hiking the bitrate up to close to 30,000 kbps, and it still didn't look as good.
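One way to double-check that two exports really landed at the same overall data rate, independent of what the encoder's settings dialog claims, is to work backwards from file size and duration. This is just a sketch with made-up example numbers (a hypothetical 60-second clip), not anything specific to Premiere:

```python
def bitrate_kbps(file_size_bytes: float, duration_seconds: float) -> float:
    """Average total bitrate in kbps, derived from file size and duration."""
    # bits = bytes * 8; kbps = bits / seconds / 1000
    return file_size_bytes * 8 / duration_seconds / 1000

# Example: a 60-second clip that is 45 MB on disk works out to ~6,000 kbps,
# so two files of the same length and roughly the same size really did get
# roughly the same average data rate.
print(bitrate_kbps(45_000_000, 60))  # → 6000.0
```

If the two files are the same length but noticeably different sizes, the presets are not actually delivering the same data rate, whatever the dialog says.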
Is this a fluke, my mistake, or a known problem?
One more question: if you export a QuickTime H.264 at 6,000 kbps and later need to re-export that file (say, to shorten it), would you lose significant quality by matching the settings of the original export, or would you have to raise a setting like the bitrate to help the re-exported version match the original?