Wait, so this is how prompt engineering works? You first have to tell GPT who it is before instructing it? What a trick!
You don't necessarily need to do it exactly like this, but yep, it's pretty neat. A more general version of this "trick" is to simply aim to provide the chatbot with as much context as possible.
For instance: let's say you have a meeting transcript and want to generate a meeting summary based on it. You could probably get away with a prompt like

"Based on the attached transcript, write me a summary"

But you would probably get much further with something more like this:
https://inleo.io/threads/view/mightpossibly/re-taskmaster4450le-24wdbzaae?referral=mightpossibly
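And if you're calling the model from code rather than the chat UI, the same idea applies. Here's a minimal sketch assuming the official OpenAI Python SDK and an API key in your environment; the file name, model name, and the exact prompt wording are just placeholders to show the contrast between a bare prompt and a context-rich one:

```python
# Minimal sketch: the "give the model as much context as possible" idea,
# expressed as an API call. Assumes the official OpenAI Python SDK
# (pip install openai) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Hypothetical transcript file, just for illustration.
transcript = open("meeting_transcript.txt").read()

# Bare-bones prompt: works, but gives the model very little to go on.
bare_prompt = f"Based on the attached transcript, write me a summary\n\n{transcript}"

# Context-rich prompt: role, audience, format, and constraints spelled out.
rich_prompt = (
    "You are an experienced executive assistant. Summarize the meeting "
    "transcript below for team members who could not attend. Include: "
    "1) key decisions, 2) action items with owners, 3) open questions. "
    "Keep it under 300 words and use bullet points.\n\n"
    f"Transcript:\n{transcript}"
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whichever model you have access to
    messages=[{"role": "user", "content": rich_prompt}],
)
print(response.choices[0].message.content)
```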
I never gave GPT prompts longer than 2 lines. The more data you feed it along with your instructions, the better the results. Thank you for the lessons, I'm going to practice right away.
Awesome, I'm very happy to hear that. Do share whatever interesting stuff you figure out with us :) Also, feel free to tag me if you have questions or issues about anything in this regard. I'll help if I can
I'll definitely tag you, got a lot of prompt and programming questions too :)
Looking forward to it :)