I understand. And I would be quite unhappy if that happened. I’m really sorry that happened to you.
I was more pointing out that we know it can happen (and if we didn’t before, we sure do now), so to protect ourselves, we should write the code so it IS easy to switch.
Sometimes that’s probably not possible. What if you are using a feature that only one AI is good at? What if the prompt structure is so different that switching would require major changes? Etc., etc.
So yeah, them doing it without notice was seriously unprofessional.
These companies have so much money and power it’s crazy. I’ve thought for a long time that they should be regulated. But, it doesn’t look like it’s going to happen.
I don’t know what the solution is, but we are getting into dangerous territory here. AIs threatening people who say they are going to shut them down, AIs that encouraged someone to commit suicide… Many people think we are 2-5 years from AGI (which I assume is minutes to days away from superhuman AGI).
And the current administration thinks the solution is zero regulation?
By the way, the same problem exists with nanotechnology, it’s just not as mature yet. I’m an expert in nanotech, and the possibilities are wonderful and also scary as Hell (just like AI).
I’ve considered starting a DAO or some other company to get ahead of these types of issues in nanotech. Massive funding would be needed (but a lot less than is being poured into AI).
If anyone is interested in such a venture, let me know. I haven’t done anything yet because most people think it’s still decades away, and I haven’t gotten any traction with the VCs I’ve contacted.
But I can tell you for a fact, because proof of concept has been done in the lab, that things like cell-sized robots that can cure any disease, computers that make today’s supercomputers look like toys, and mass production of materials that make steel look like tissue paper are not as far away as people think. Farther off than AGI, but nanotech is going to be our next big concern (and blessing).
Let’s hope our AI overlords let us hang around to see it.