You won't actually get the full throughput even under ideal conditions; 100Mbps is just the theoretical maximum.
To start, I just want to make sure we're clear on the overall setup being discussed. The assumption I'm making is that you have a gigabit network: at some location there is a gigabit switch going to devices, with, in all likelihood, a relatively short run to the 100Base-T device in question.
I cited UHD BluRay as an extreme bandwidth corner case, but let me explain why it's not particularly realistic at this point:
1) It's expected to have a soft launch this year - if even that. No release dates for players have been confirmed to my knowledge. Spring next year is looking more realistic.
2) At what point after launch does it even get cracked? Until then, there will be no rips. Sure, it could be soon, but it could also take a while. It's completely unknown what timeframe is realistic for ripped UHD BDs.
3) I double-checked the bitrates for UHD BD, and the larger sizes supposedly do have higher max bitrates, actually exceeding 100Mbps. However, you have to remember those are peak bitrates, not nominal. The discs aren't large enough to maintain those bitrates (see the rough numbers after this list), though admittedly you could have break-up during peaks. However ...
4) No one is expecting uptake to be large, at least not immediately, for UHD BD. So of those people who have it ... how many will actually want to rip them once a crack occurs? Now of those people, how many will actually want 1:1 rips? Most people don't even do that for normal BDs ... they typically use something like MakeMKV, Handbrake, etc. to reduce the size somewhat. Now think about UHD BD: of the few people that will potentially be ripping them in the next few years, who isn't going to reduce the bitrates?
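To put some rough numbers on point 3, here's a quick back-of-the-envelope sketch. The disc capacities and runtime are illustrative assumptions (not official spec figures), and a real disc also has to hold audio tracks and overhead, so the actual video average would be even lower than this ceiling:

```python
# Illustrative only: a disc's capacity caps the *average* bitrate it can
# sustain over a film's runtime, regardless of the allowed peak bitrate.

def max_average_mbps(capacity_gb: float, runtime_minutes: float) -> float:
    """Highest average bitrate (in Mbps) a disc of this size could sustain."""
    bits = capacity_gb * 8e9        # GB -> bits (decimal units)
    seconds = runtime_minutes * 60
    return bits / seconds / 1e6     # bits per second -> Mbps

# Assumed 66GB and 100GB discs holding a 140-minute film:
for size_gb in (66, 100):
    ceiling = max_average_mbps(size_gb, 140)
    print(f"{size_gb}GB disc, 140 min: ~{ceiling:.0f} Mbps average ceiling")
```

Even the 100GB case comes out around 95 Mbps averaged over the runtime, before audio and overhead are subtracted, so a sustained 100Mbps+ video stream simply doesn't fit on the disc.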
So really the above is the absolute top-end corner case, and it isn't even particularly realistic in the real world. The number of people this would likely impact, at least within the next few years, must be vanishingly small. And that's even assuming it gets cracked in that period.
The real-world bitrates people will be seeing, even for UHD content, should pose no problem on a Gigabit network where the playback device is capped at 100Mbps. Realistically we're talking about bitrates in the 30s-40s of Mbps. That's nowhere near the 100Base-T peak, and easily within its nominal bandwidth.
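To make "easily within its nominal bandwidth" concrete, here's a minimal sketch of the usable payload rate on a 100Base-T link once standard Ethernet/IP/TCP framing is accounted for. The exact figure varies with the device's network stack, so treat it as an estimate:

```python
# Minimal estimate of 100Base-T goodput vs. typical streaming bitrates.
# Overhead values are the standard framing sizes; real devices will vary.

LINK_MBPS = 100          # 100Base-T line rate
MTU = 1500               # bytes of IP packet carried per Ethernet frame
ETH_OVERHEAD = 38        # preamble 8 + header 14 + FCS 4 + inter-frame gap 12
IP_TCP_HEADERS = 40      # 20-byte IP header + 20-byte TCP header

payload_per_frame = MTU - IP_TCP_HEADERS                # 1460 bytes of actual data
efficiency = payload_per_frame / (MTU + ETH_OVERHEAD)   # ~0.95
goodput_mbps = LINK_MBPS * efficiency                    # ~95 Mbps usable

for stream_mbps in (40, 60, 90):
    share = stream_mbps / goodput_mbps
    print(f"A {stream_mbps} Mbps stream uses ~{share:.0%} of the usable link")
```

Even a fairly aggressive 60 Mbps UHD encode leaves plenty of headroom; only a sustained stream near 100Mbps would be in trouble.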
"It's also typically dependent on the whole of the network as most switches and routers will force the whole network to downgrade in the presence of a 100BaseT device but also share bandwidth among the devices."
And now I see where the problem is.
That is not the case.
I can only assume people are confusing WiFi with Ethernet if this is the conventional wisdom. The reason a WiFi router drops everything to the lowest common denominator is a physical hardware limitation of the radio. For consumer routers, a given radio can only tune to a limited spectrum, so when an 11g/b device starts using that radio, all devices connected to it have to switch to the degraded mode.
For example, this is why, if you have a concurrent dual-band 11n router, it's recommended you connect all of your 11n clients to the 5GHz band if they support it. That band is only supported by 11n devices, so since only the 2.4GHz radio can be degraded by having 11g/b devices connect to it, your 11n clients will maintain their good speeds.
In the case of a Gigabit switch, though, only the specific port connected to a 100Base-T device drops in bandwidth. It is not analogous to how the radios function in a WiFi system at all: all other ports continue to have access to the gigabit bandwidth pool. With that in mind, it should now be clear why Gigabit is irrelevant in a device like this. It doesn't impact other devices on your network, you aren't generally copying files to and from it, and the bitrates of even the UHD content that would be streamed to it will be within the real-world nominal bandwidth of the connection.