In this paper, we deal with an important issue generally omitted in the current literature on evolutionary multiobjective optimization: on-line adaptation. We propose a revised version of our micro-GA for multiobjective optimization that does not require any parameter fine-tuning. Furthermore, we introduce a dynamic selection scheme through which our algorithm decides which is the “best” crossover operator to use at any given time. This scheme has helped to improve the performance of the new version of the algorithm, which we call the micro-GA2 (μGA2). The new approach is validated using several test functions and metrics taken from the specialized literature, and it is compared against the NSGA-II and PAES.
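To give a rough idea of what dynamic crossover-operator selection can look like, the following minimal sketch applies a generic probability-matching rule: each operator's selection probability is nudged up when its offspring succeed (e.g., are non-dominated with respect to their parents) and down otherwise. The operator set, the update rule, and parameters such as learning_rate and p_min are illustrative assumptions, not the actual mechanism used inside the μGA2.

```python
import random

def one_point(p1, p2):
    """One-point crossover on real-coded vectors."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def uniform(p1, p2):
    """Uniform crossover: each gene comes from either parent with equal chance."""
    return [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]

def blend(p1, p2, alpha=0.5):
    """BLX-alpha style arithmetic crossover."""
    return [a + random.uniform(-alpha, 1 + alpha) * (b - a) for a, b in zip(p1, p2)]

class OperatorSelector:
    """Keeps one selection probability per operator, updated from observed success.

    This is a generic probability-matching scheme, assumed here for illustration.
    """
    def __init__(self, operators, learning_rate=0.1, p_min=0.05):
        self.operators = operators
        self.probs = [1.0 / len(operators)] * len(operators)
        self.learning_rate = learning_rate
        self.p_min = p_min  # floor so no operator disappears entirely

    def pick(self):
        """Roulette-wheel choice of the index of the operator to apply next."""
        r, acc = random.random(), 0.0
        for i, p in enumerate(self.probs):
            acc += p
            if r <= acc:
                return i
        return len(self.probs) - 1

    def reward(self, index, success):
        """Shift probability mass toward operators whose offspring succeeded."""
        target = 1.0 if success else 0.0
        self.probs[index] += self.learning_rate * (target - self.probs[index])
        # Renormalise while keeping every operator at or above the floor p_min.
        total = sum(max(p, self.p_min) for p in self.probs)
        self.probs = [max(p, self.p_min) / total for p in self.probs]

# Usage: pick an operator, generate an offspring, then report whether the
# offspring improved on its parents (here hard-coded for demonstration).
selector = OperatorSelector([one_point, uniform, blend])
p1 = [random.random() for _ in range(5)]
p2 = [random.random() for _ in range(5)]
i = selector.pick()
child = selector.operators[i](p1, p2)
selector.reward(i, success=True)  # e.g., child non-dominated w.r.t. its parents
print(selector.probs)
```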