Nope, not programmed specifically to do that!
But it was programmed to look at any given sentence and predict a likely continuation, one word at a time, based on all the billions of words of English it had been trained on.
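To make that concrete, here's a minimal, purely illustrative sketch of the idea: a toy bigram model that counts which word tends to follow which in a tiny corpus, then picks a likely next word. (The corpus and the `predict_next` function are made up for illustration; real systems are vastly larger and use neural networks rather than lookup tables, but the basic job is the same.)

```python
import random
from collections import Counter, defaultdict

# Tiny toy corpus standing in for "billions of words of English".
corpus = (
    "i feel happy today . i feel sad today . "
    "i think therefore i am . do you feel anything ?"
).split()

# Count, for each word, which words follow it (a bigram model).
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Sample a likely next word, weighted by how often it followed `word`."""
    counts = follows[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Ask it about feelings and it continues with feelings-talk --
# not because it feels anything, but because that's what the counts say.
print("feel ->", predict_next("feel"))  # likely "happy", "sad", or "anything"
```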
So with a system like that, if you feed it prompts specifically asking about feelings and interiority, it's going to predict text that answers in kind. I myself don't think that qualifies as sentience; humans and (to the extent that we can determine this!) other conscious animals appear to be more than mere mechanisms picking likely responses to outside stimuli …
There are others who disagree, though!