I always hear this about X-Com, and have experienced it myself. Still, it really is possible to miss 90% shots several times in a row; three misses in a row at 90% is a 0.1% event, so it shouldn't happen often, but our monkey brains hate it and it feels unfair. It's a hard balancing problem.
Why not have a system that averages out the rolls so that good/bad rolls don't clump together and are spaced more evenly? Couldn't you generate the rolls in advance and then sort them in a favorable way, or reroll whenever a roll falls too far outside some predicted threshold?
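Something like this, maybe. A rough Python sketch of the idea (the class name, batch size, and numbers are just mine for illustration, not from any actual game):

```python
import random

class SmoothedRolls:
    """Pre-generate a batch of rolls, then shuffle so good and bad
    values are interleaved instead of clumping."""

    def __init__(self, batch_size=20, seed=None):
        self.rng = random.Random(seed)
        self.batch_size = batch_size
        self.queue = []

    def _refill(self):
        # Evenly spaced values guarantee the batch matches the
        # expected distribution exactly; shuffling hides the order.
        batch = [i / self.batch_size for i in range(self.batch_size)]
        self.rng.shuffle(batch)
        self.queue = batch

    def roll(self):
        if not self.queue:
            self._refill()
        return self.queue.pop()

rolls = SmoothedRolls(seed=42)
hits = [rolls.roll() < 0.9 for _ in range(20)]
print(sum(hits))  # always exactly 18 of 20 hits per batch at 90%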
There's a question of how you want to play with statistics. Do you want to play with fair dice, or dice that give you an advantage? You might be saying to yourself that your storage tactic still yields a uniform distribution, but you're forgetting sample size. You're also removing some uncertainty.
So, you can roll dice over and over, but you actually need a lot of samples for the stats to converge; that's why it's called the law of LARGE numbers. It's still possible to roll two 12s in a row on real dice, even though we wouldn't expect it to be common. With your anti-clumping scheme, you not only need to store a lot of data, but your "random" events become dependent rolls instead of independent ones.
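To make both points concrete, here's a quick Python sketch (the trial count and variable names are just illustrative):

```python
import random

rng = random.Random(1)

# Independent dice: two 12s in a row has probability (1/36)^2 = 1/1296,
# about 0.077%. The observed rate only converges to that over MANY trials.
trials = 1_000_000
rolls = [rng.randint(1, 6) + rng.randint(1, 6) for _ in range(trials)]
double_12s = sum(rolls[i] == 12 and rolls[i + 1] == 12
                 for i in range(trials - 1))
print(double_12s / (trials - 1))  # hovers near 1/1296 ~ 0.00077

# Dependence in the stored-rolls version: once the good values are
# spent, the leftovers are forced low. Past rolls predict future ones.
bag = [i / 10 for i in range(10)]
rng.shuffle(bag)
seen = [bag.pop() for _ in range(8)]
print(sorted(bag))  # the two remaining values are fully determined by what we saw
```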
Really it's just a question of what you want to do and how you want fairness to be perceived. Do you want your game to act like dice, or do you want a slight advantage? BTW, i2om3r linked the video I was talking about; in it, Meier discusses people's perception of fairness. In the end, you have to determine what's best for your game. Maybe stacking the deck makes better gameplay, maybe it doesn't.