I gave the code another quick look. Loot containers seem to work more or less like this:

The game takes the loot abundance setting and multiplies it by 0.01, picks a random value between the min and max of the count attribute on the <lootcontainer> element, and multiplies that count by the abundance. That's the number of items it will try to spawn, capped at the size of the container (the size attribute of the same element). Once it has that number, it calls a method named LootContainer.SpawnLootItemsFromList and tells it to spawn that many items, but from there on it uses a hardcoded abundance of 1 (100%), ignoring the game setting. So loot abundance only affects the number of items spawned, nothing more, and very large values are mostly meaningless.

Take a garbage can, where the min/max is 0,2. At 500% LootAbundance it will spawn 0, 5, or 10 items. It's never exact, because during spawning each slot also has a probability of producing nothing. So yes, a loot abundance higher than 500 still has a slight effect: 600 will spawn 0, 6, or 12 items. Since 12 is the maximum for a garbage can, the highest loot abundance value with any effect is 1200, which spawns either 0, 12, or 12 items (0*12, 1*12, 2*12, capped at 12).
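If I'm reading it right, the count calculation boils down to something like this. This is just a sketch in Python, not the game's actual C#, and the function and parameter names are mine:

```python
import random

def items_to_spawn(abundance_pct, count_min, count_max, container_size):
    """Sketch of the loot-count logic as I understand it.

    abundance_pct:       the LootAbundance game setting (e.g. 500 for 500%)
    count_min/count_max: the min/max of the <lootcontainer> count attribute
    container_size:      the container's slot count, from the size attribute
    """
    abundance = abundance_pct * 0.01               # e.g. 500 -> 5.0
    base_count = random.randint(count_min, count_max)
    # Multiply the rolled count by abundance, capped at the container size.
    # Per-slot spawn probability (which can still leave slots empty) is
    # applied later, inside SpawnLootItemsFromList, with abundance fixed at 1.
    return min(int(base_count * abundance), container_size)

# Garbage can: count min/max 0,2; size 12; at 500% abundance
# the possible results are 0, 5, or 10.
```

This also shows why values past 1200 do nothing for a garbage can: even a roll of 1 already hits the size cap of 12.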
Does this make any sense? I've not spent a ton of time looking at the code, and the code is rather loopy (there are several loops involved, and it gets hard to follow, especially where it recurses), but that's the sense I got from it. I was rather surprised to see an abundance of 1 hardcoded for the actual items. I wonder what it would be like if that method were prefixed with a Harmony patch so it always used the game setting's abundance instead.