$ time php -r '$a=array_fill(0,100000,"test");while(array_shift($a)){}'
real 0m10.238s
user 0m10.238s
sys 0m0.000s
$ time perl -e 'my @a=("test") x 100000;while(shift(@a)){}'
real 0m0.027s
user 0m0.027s
sys 0m0.000s
$ time node -e 'a=Array(100000).fill("test");while(a.shift()){}'
real 0m1.350s
user 0m1.343s
sys 0m0.011s
Interesting side effect of PHP's do-everything array, which is an array/vector, a linked list, and a dictionary/hashmap rolled into a single data structure. In this case it copies all elements to a new location on each shift to keep the indexes starting at zero. If you break up the "array", say with an `unset($a[0]);` right after the array_fill, which signals that you don't want "array" behavior, it becomes fast, but all elements keep their original index instead of being moved forward.
But a tradeoff in that design is that it still has to keep track of the keys and update them whenever you remove elements from the start of the array. Adding and removing elements at the end is very fast, though.
That seems like some kind of local minimum in the fitness landscape of design choices. There are certainly worse choices one could make, but I can't imagine anything much worse than this surviving any real-world use.