It’s not the training data that’s the problem here.
Yeah it is. The training data skews white, so they added a “make some people non-white” kludge. It wouldn’t be needed if there was actually racial diversity in the training data.
It’s the “make some people non-white” kludge that’s the specific problem being discussed here.
The training data skewing white is a different problem, and IMO not as big a one. The solution is simple, as I’ve found over many months of using local image generators: let the user specify exactly what they want.