Yeah, I think that's because it knows what research papers and references should look like, but since it has no reasoning, it will just do whatever. I used GPT to diagnose my problem with the internet getting cut off, and it determined it was because of drivers, which sounds reasonable. Then it suggested that I download the latest ones, and while it did link to the correct website, it also tried to download stuff that doesn't exist. No idea how it determined the version numbers and such; maybe based on earlier patterns.
But it isn't making stuff up out of malice, it's just outputting the best data it can based on what it has been trained on and what it can find. It's not laziness, it's just doing what it does. Just like code that isn't doing what you want isn't misbehaving out of malice, but because there's a mistake in the code.
It doesn’t have access to the training data. It’s not outputting training data, it’s making up something that feels like the training data.