In this paper, a new noise minimization approach is proposed for CMOS current-mode (CM) circuits with a differential input stage. The approach focuses on the transconductances of the input-stage transistors and of selected output-stage transistors; the contribution of the output stage to the noise model depends on how the output stage operates. The minimization is presented to designers as a trade-off between design parameters and noise reduction. For simplicity, the analysis is carried out on the Differential Difference Current Conveyor (DDCC). To support the theoretical results, simulation results are given at both the schematic and layout levels. Moreover, a single-input, four-output DDCC filter application is presented to verify the proposed minimization approach. After minimization, a significant noise reduction of up to 50% is obtained. In addition, a Monte Carlo analysis is provided to investigate the effects of process variations and temperature on the measured input-referred noise.
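As background for the transconductance-based trade-off mentioned above, a minimal sketch of the standard long-channel MOSFET thermal-noise relation is given below; here $k$ is Boltzmann's constant, $T$ the absolute temperature, $\gamma$ the channel noise factor, and $g_m$ the transconductance. This is a textbook approximation and not necessarily the exact noise model developed in the paper:

\[
\frac{\overline{i_n^{2}}}{\Delta f} = 4kT\gamma g_m,
\qquad
\frac{\overline{v_{n,\mathrm{in}}^{2}}}{\Delta f} = \frac{\overline{i_n^{2}}}{g_m^{2}\,\Delta f} = \frac{4kT\gamma}{g_m},
\]

so raising the transconductance of the input-stage transistors lowers the input-referred noise, at the cost of larger bias current or device width; this is the kind of design trade-off the abstract refers to.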