The stimulus reads:
Measurements of the extent of amino-acid decomposition in fragments of eggshell found at archaeological sites in such places as southern Africa can be used to obtain accurate dates for sites up to 200,000 years old. Because the decomposition is slower in cool climates, the technique can be used to obtain accurate dates for sites almost a million years old in cooler regions.
I got it right, but I picked C only because the other answers had no support at all in the stimulus. Still, I'm not 100% sure I agree that C is supported to any significant extent. C says that if a site being dated has been subject to large unsuspected climatic fluctuations during the time the eggshell has been at the site, then applying the technique is less likely to yield accurate results.

The stimulus gives me reason to believe that how far back a site can be dated depends on whether its climate is cooler or warmer. We also know that, at best, the technique can date sites up to almost a million years old. But where do we infer that changes in climate make the technique less accurate or effective? Is it because the technique has to assume a decomposition rate based on the site's climate, so if a site dating back 500,000 years experienced unsuspected fluctuations that included bouts of warm weather, decomposition would have occurred more rapidly than assumed, throwing off the date? I don't know; it just seems like a stretch to infer from the stimulus that the climate has to be stable for the technique to be effective. Maybe the word "unsuspected" is what it all hinges on? If so, can climatic fluctuations be accounted for as long as they're "suspected"? What does that even mean?
Maybe I'm just being pissy, but I didn't think this was the brightest question.