Abstract
Establishing reliable frameworks for predicting unknown outcomes from empirical observations is of great interest to ecologists and evolutionary biologists. Strong predictability in evolutionary responses has previously been demonstrated by the repeated observation of similar phenotypes or genotypes across multiple natural or experimental populations in analogous environments. However, the degree to which evolutionary outcomes can be predicted across environmental gradients, or in fluctuating environments, remains largely unexplored. One might presume that phenotypic evolution in an intermediate environment can be interpolated from the evolved phenotypes observed in two extreme environments, but this assumption remains to be fully tested. Here, we report on the experimental evolution of Escherichia coli under three nutritional transfer periods: every day, every 10 days, and every 100 days, representing feast/famine cycles of increasing severity. After 900 days of experimental evolution, populations experiencing intermediate durations of starvation had evolved longer times to reach maximum growth rate, smaller colony sizes, greater biofilm formation, and higher mutation rates than populations evolving at either environmental extreme. Because the intermediately starved populations exhibit significant molecular parallelism, these distinct phenotypes are likely due to non-monotonic deterministic forces rather than to the increased stochasticity commonly associated with fluctuating environments. Our results demonstrate novel complexities associated with evolutionary predictability across environmental gradients and highlight the risk of relying on interpolation in evolutionary biology.
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
In response to the thoughtful suggestions provided by anonymous reviewers, we revised this manuscript to improve the overall clarity of our findings. We thank them for their time and believe their input has resulted in a more impactful story.