Under what scenario might an algorithm's observed runtime paradoxically appear to decrease as input size N increases?

Answer

When comparing average-case complexity against worst-case complexity

Observed runtime can seem to decrease with growing N when measurements mix average-case and worst-case inputs. Quick Sort, for example, runs in $O(N \log N)$ on average but degrades to $O(N^2)$ in its worst case, so a smaller input that happens to trigger the worst case (e.g., an already-sorted array with a naive pivot choice) can take longer than a much larger input that hits the average case, making runtime appear to shrink as N increases.
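A minimal sketch of this effect, assuming a naive in-place Quick Sort with a last-element (Lomuto) pivot and hypothetical helpers `quicksort` and `timed_sort`: timing a smaller already-sorted input (worst case) against a larger shuffled input (average case) can show the smaller N taking longer.

```python
import random
import time

def quicksort(a):
    """In-place quicksort using the last element as the pivot (Lomuto partition).
    An explicit stack avoids Python's recursion limit on unbalanced inputs."""
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]  # place pivot in its final position
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))

def timed_sort(data):
    """Return the wall-clock time taken to sort the given list in place."""
    start = time.perf_counter()
    quicksort(data)
    return time.perf_counter() - start

# Smaller but already-sorted input: last-element pivot triggers the O(N^2) worst case.
small_worst = list(range(5_000))
# Larger shuffled input: typical O(N log N) average case.
large_average = [random.random() for _ in range(50_000)]

print(f"N=5,000  (sorted, worst case):   {timed_sort(small_worst):.3f} s")
print(f"N=50,000 (random, average case): {timed_sort(large_average):.3f} s")
```

Exact timings vary by machine, but the smaller sorted input typically takes longer than the ten-times-larger random one, which is the apparent "decrease" in runtime as N grows.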

efficiency, complexity, algorithm, scaling, input size