Using ChatGPT

Discussion about the New Testament, apocrypha, gnostics, church fathers, Christian origins, historical Jesus or otherwise, etc.
User avatar
Giuseppe
Posts: 13931
Joined: Mon Apr 27, 2015 5:37 am
Location: Italy

Re: Basic reason why the name of Pilate was absent in the Earliest Passion Story extrapolated from Mark

Post by Giuseppe »

I asked ChatGPT:

Giuseppe:
Do you know what "causatively" means with respect to Hebrew verbs?

ChatGPT:
Yes, in Hebrew grammar, causative verbs are used to indicate that the subject of the verb is causing someone or something else to perform the action. For example, the causative form of the verb "to eat" would indicate causing someone to eat, rather than eating oneself.

Giuseppe:
Can פלט mean in Hebrew "to release, to deliver, to set one free"?

ChatGPT:
Yes, the Hebrew verb פלט (pālaṭ) can indeed mean "to release," "to deliver," or "to set one free." It is often used in contexts where someone is being freed from captivity, danger, or oppression.

Neil was right in saying that ChatGPT is a more honest interlocutor than many human beings.
DrSarah
Posts: 57
Joined: Sun Aug 27, 2023 11:44 pm

Re: Basic reason why the name of Pilate was absent in the Earliest Passion Story extrapolated from Mark

Post by DrSarah »

Giuseppe wrote: Wed Feb 28, 2024 12:54 pm I asked ChatGPT:
Okaaaaay, pause it right there. ChatGPT is well known for not being reliable in its accuracy. It's an AI designed to simulate conversation, not to fact-check. There are many examples of ChatGPT stating inaccurate information with complete confidence. It is - and I cannot stress this enough - a really bad idea to rely on ChatGPT for any information.
User avatar
billd89
Posts: 1412
Joined: Thu Jun 25, 2020 6:27 pm
Location: New England, USA

Re: Forever?

Post by billd89 »

DrSarah wrote: Thu Feb 29, 2024 12:16 am
Giuseppe wrote: Wed Feb 28, 2024 12:54 pm I asked ChatGPT:
Okaaaaay, pause it right there. ChatGPT is well known for not being reliable in its accuracy. It's an AI designed to simulate conversation, not to fact-check. There are many examples of ChatGPT stating inaccurate information with complete confidence. It is - and I cannot stress this enough - a really bad idea to rely on ChatGPT for any information.
Well, yes and no. I myself have posted some examples of chatbot bollocks, and replies to what has been, up to this point in time, misuse of the technology. For now, I think it's more a case of "overreach"; caveat lector.

But ChatGPT is adapting, 'learning,' advancing: certain deficits from several months back may not be present now. I wouldn't throw the baby out with the bathwater, not just yet. ChatGPT 3.5 is/was a great improvement; so who here knows how far it will advance in three years?

Historically speaking, such total, permanent rejections of new technologies and adaptive tools have not proven wise.


Post Reply