MidJourney Version 3 Comparison and Review

For the past two days I have been exploring and testing the new Version 3 generation algorithm from MidJourney. Its goals are more accurate output and a new upscaler that removes distortions and artifacts. My initial experiments showed the potential of this new algorithm, but it seemed to take away some of the “dirtiness” I came to like when I first started working with MidJourney.

I decided to run some experiments comparing the Version 2 and Version 3 algorithms to see how Version 3 can be tweaked to regain some of the “dirt” that made the V2 algorithm so special.

To start, I generated the inside of an abandoned shopping mall as if shot on a Polaroid. MidJourney V2 did a really good job of capturing the grit and small distortions characteristic of Polaroid film.

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 2

This image will serve as the reference point I will try to recreate with the V3 algorithm. By setting a seed value, the different generations start from the same random noise, so they will be somewhat similar.
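
To illustrate why that works, here is a minimal conceptual sketch, not MidJourney’s actual code: diffusion-style generators begin from random noise, and a fixed seed makes that starting noise identical across runs. The array shape and NumPy usage below are assumptions chosen purely for illustration.

import numpy as np

# Conceptual illustration only -- not MidJourney's internals.
# Fixing the seed fixes the starting noise, which is why runs that share
# a seed and prompt end up looking similar.
def initial_noise(seed, shape=(64, 64, 4)):
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape)

print(np.allclose(initial_noise(420), initial_noise(420)))  # True: same starting noise every run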

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3

First, I created a V3 generation using all the default settings. This is the most “neutral” of the V3 options for stylizing and quality. Comparing the two, the V3 result appears cleaner; in some cases the photo doesn’t look like a Polaroid at all.

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3 --stylize 5000

Bumping up the stylizing gives a far cleaner look. There are hardly any distortions in the sample renders, and the subjects of the images appear too straight and perfect. This has started to drift away from the feel of an abandoned mall.

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3 --stylize 20000

Pushing the stylizing even higher generates more abstract renderings. As shown above, these are nothing like what a mall looks like. There are also mountains, clouds, and other structures in the resulting images that don’t make any sense. When I tested these high stylize values on other generations, the same features always showed up: regardless of the subject, multi-color clouds, mountains, and other fantasy objects appear.

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3 --stylize 625 --q .25

Next I explored the quality options while keeping the stylize setting as low as possible. The lowest quality, .25, produced renderings closer to what the V2 algorithm did, but the images just aren’t that appealing because the quality is so low.

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3 --stylize 625 --q .5

Pushing the quality up a notch while keeping the stylizing the same got me much closer to the V2 renderings. The proper aesthetic for an abandoned mall is there, and the images do look like they were taken with a Polaroid. I’m happy with these, but there’s still some exploring to do with other options.
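
Since the remaining options form a small grid over the stylize and quality values used so far, a quick script can spell out the prompts to paste into Discord. This is a hypothetical helper that only formats strings (MidJourney is driven through Discord, so there is no API being called here), and the parameter values are simply the ones tested in this post.

# Hypothetical helper: print the /imagine prompts for each remaining
# stylize/quality combination so they can be pasted into Discord one by one.
prompt = "a polaroid of the inside of an abandoned mall"
seed = "0420"
for stylize in (625, 1250):
    for quality in (".25", ".5"):
        print(f"/imagine {prompt} --seed {seed} --v 3 --stylize {stylize} --q {quality}")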

/imagine a polaroid of the inside of an abandoned mall --seed 0420 --v 3 --stylize 1250 --q .5

For the final rendering, I put the stylizing back to the default while still keeping the quality down a bit. This produced the closest images I was able to get to the V2 aesthetic of a Polaroid of an abandoned mall. Not all the grit from V2 is there, but enough of that character remains for it to be usable.
