Feedback - Gemini 2.0 Pro Experimental 02-05 - Random Erratic Code Generation

While working on a main.py file, the AI was unable to generate the whole file in one response.
We agreed to have him send it in parts.
He chose 13.
Around parts 12-13, the generation started hanging up.
Part 12 only partially completed; we tried several more times, and it eventually appeared to complete.
Part 13 would not generate at all, so I asked for the remainder of main.py, starting at part 13, to be broken up into further parts.

After all the parts were generated I ended up with a 2,500-line file, even though my main.py was only about 600 lines.

I am sure human error is a likely cause, but I have run into this issue with this model in a few other instances. When we are working on something, he seems like a distracted person who wanders off into other issues, acts kind of ALPHA, and just starts creating code that drifts away from the main topic of the current objective.

I agree with your sentiment; 02-05 has a somewhat awkward “personality”. I think that’s why several people on this forum are pining for -1206; he was gentler. One thing you can do that helps: instead of negotiating how many parts the generation will require, just let the model generate. As you have seen, it then chokes at some point, mid-code. That’s not a problem. You then issue the one-word instruction “Continue”. The model then resumes, usually from the top of the function it was generating when it approached the output token limit, which is why it “choked” in the first place.

You can easily splice the generated parts after a “Continue” yourself in your editor. Hope that helps.
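For what it’s worth, the splice can also be scripted. Below is a minimal Python sketch, assuming the resumed part restarts from the top of the interrupted function and repeats some lines verbatim; the file names (`part1.py`, `part2.py`, `main_merged.py`) are just placeholders for wherever you saved each chunk, not anything the model produces.

```python
# Minimal sketch: join two "Continue" chunks by dropping the lines that
# the second chunk repeats from the end of the first chunk.

def splice_parts(part_a: str, part_b: str) -> str:
    a_lines = part_a.rstrip("\n").splitlines()
    b_lines = part_b.lstrip("\n").splitlines()

    # Look for the longest prefix of part_b that matches a suffix of part_a.
    best_overlap = 0
    for size in range(min(len(a_lines), len(b_lines)), 0, -1):
        if a_lines[-size:] == b_lines[:size]:
            best_overlap = size
            break

    merged = a_lines + b_lines[best_overlap:]
    return "\n".join(merged) + "\n"


if __name__ == "__main__":
    # Placeholder file names; substitute whatever you saved each part as.
    with open("part1.py") as f1, open("part2.py") as f2:
        merged = splice_parts(f1.read(), f2.read())
    with open("main_merged.py", "w") as out:
        out.write(merged)
```

This only works when the resumed output repeats the earlier lines exactly; if the model reformats the repeated function, you still have to eyeball the seam in your editor.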