If that stuff is authentic, it might be a turning point in the story of mankind: the creation of an artificial sentient being.
We're a long way from that. A looooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooong way! (That's really not enough o's, but I had to stop somewhere).
The problem is that we associate language with sapience more than anything else, because we're the only thing that talks. But the thing is, language is an abstraction of concepts that we comprehend on a much more fundamental level.
The idea of the "idiot savant" is very applicable here. We're more used to that when it comes to math, because there are people who are very good at math but completely helpless otherwise. A computer is essentially that, with everything turned up to eleven.
A concept people still have to get used to is that math isn't the only thing machines can be good at without actually comprehending the underlying principles. With the advent of neural nets, we have found an abstraction that lets computers get good at almost anything, provided you can feed them enough data.
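To make that concrete, here's a toy sketch I threw together (plain NumPy, nothing like a real system, and every name in it is my own invention): a minuscule neural net that "learns" XOR from four examples. Nowhere does it encode what XOR means; it just shuffles numbers around until an error score shrinks.

```python
# Toy sketch: a tiny neural net taught XOR purely from four data points.
# It has no notion of logic; it just nudges numbers to shrink an error score.
import numpy as np

rng = np.random.default_rng(0)

# The entire "world" this net will ever see: four inputs and their XOR labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units, randomly initialized.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))

lr = 0.5
for _ in range(20000):
    # Forward pass: nothing but arithmetic.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: hand-derived gradients of the squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # typically close to [[0], [1], [1], [0]]: competence, zero comprehension
```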
What you're seeing here is a program that can process language very well... but without any understanding, much less comprehension, of anything that language actually represents. Any ant has more actual awareness. It's still a machine. Hence, the Chinese room argument.
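If you want to see "language skill without understanding" in its most primitive form, here's a toy Markov chain sketch (my own illustration, vastly dumber than the program above, but it's the same kind of trick): it mimics word patterns from a sample text while representing absolutely nothing.

```python
# Toy sketch: a word-level Markov chain. It produces text that locally
# resembles its training data while "knowing" nothing about any word.
import random
from collections import defaultdict

text = (
    "the cat sat on the mat and the dog sat on the rug "
    "and the cat saw the dog and the dog saw the cat"
)
words = text.split()

# For each word, record every word that followed it in the source text.
followers = defaultdict(list)
for a, b in zip(words, words[1:]):
    followers[a].append(b)

random.seed(1)
word = "the"
output = [word]
for _ in range(15):
    nxt = followers.get(word)
    if not nxt:  # dead end: this word was never followed by anything
        break
    word = random.choice(nxt)
    output.append(word)

print(" ".join(output))  # grammatical-ish output; zero comprehension involved
```

Scale that basic idea up by a few billion parameters and a few terabytes of text and you get something that sounds eerily fluent, but the comprehension needle never moves.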
There is a humongous amount of work to be done before we can even begin to hope to create a neural network with the actual awareness of even a lower animal.
Boston Dynamics robots are a lot closer to having something resembling awareness than this program (though still a loooooooooo... oh, you get the point, I think).