Correlation among the data sent by different sensors in a wireless sensor network can be exploited during data gathering to improve energy efficiency. In this paper, we study the energy efficiency of correlation-aware data aggregation trees under various sensor network conditions and the tradeoffs involved in using them. Through a simulation study, we refute two commonly held beliefs about correlation-aware data aggregation. We also draw two rather surprising conclusions: the energy improvement from correlation-aware aggregation is not considerable under many network scenarios, and a relatively small delay tolerance is enough for a sensor application to obtain a correlation-aware aggregation tree with near-optimal cost.