We have modelled the spectral energy distribution of the carbon-rich star UU~Aurigae, which shows an excess flux in the far-infrared and sub-millimeter regions, in terms of a detached shell generated by an episode of mass loss at a rate higher than the current one. Two different compositions were considered for the detached shell: oxygen-rich and carbon-rich dust grains. Assuming that at long wavelengths the extinction coefficients of the oxygen-rich grains follow a $\lambda^{-2}$ law, while those of amorphous carbon follow a $\lambda^{-1.3}$ law, we show that the model with a detached carbon-rich shell fits the observational data more satisfactorily than the model with a detached oxygen-rich shell. Moreover, we derive a relatively small distance of the detached shell from the central star, which implies that the episode of high mass loss ended only a few hundred years ago. The results of our analysis are consistent with the scenario of stellar evolution on the asymptotic giant branch that predicts a short time-scale modulation of the mass loss rate induced by repeated helium shell-flashes.
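The role of the adopted extinction laws can be sketched as follows (a standard modified-blackbody argument, not taken from the paper itself): for optically thin dust emission the observed flux scales with the grain extinction efficiency, so in the Rayleigh--Jeans regime the long-wavelength slope of the excess depends directly on the emissivity index $\beta$,
\begin{equation}
F_\nu \propto Q_\lambda \, B_\nu(T_{\rm d}) \propto \lambda^{-\beta}\,\nu^{2} \propto \nu^{\,2+\beta},
\qquad
Q_\lambda \propto
\begin{cases}
\lambda^{-2} & \text{oxygen-rich grains,}\\[2pt]
\lambda^{-1.3} & \text{amorphous carbon,}
\end{cases}
\end{equation}
where $B_\nu(T_{\rm d})$ is the Planck function at the dust temperature $T_{\rm d}$. The shallower $\lambda^{-1.3}$ law of amorphous carbon therefore yields a flatter sub-millimeter spectrum, which is why the carbon-rich shell can reproduce the observed excess more easily than the steeper oxygen-rich case.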