GPT prompts for content

Just started playing with ChatGPT. Does anyone have any good tips/hacks for generating decent content that they'd be willing to share?
 
Loads out there on Twitter, in communities, etc., as well as tools that help the process. It very much depends on what you want to achieve. (The question you asked is similar to 'I want to make a website. Anyone have any tips?')
 
One thing is for sure: you need to work a lot more on it than you needed to just a few months ago.

ChatGPT has been degraded quite a lot.
 
It puts out A LOT of fluff. Basically, you can edit out about 50 percent of what it produces. I guess the next versions will fix this.
 
Getting some decent stuff back from Claude 2 in certain niches (especially if you are able to craft personas). Hallucinates a bit more than GPT-4 though (and more convincingly) so you or your editor need to have a base knowledge of the topic.
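For anyone wondering what I mean by crafting personas: it's just a detailed system-style instruction up front. Rough sketch in Python using Anthropic's SDK (I actually use Claude through Magai, and the persona and topic below are made-up examples, so treat this as the shape of the prompt rather than my exact setup):

import anthropic

client = anthropic.Anthropic()  # picks up ANTHROPIC_API_KEY from the environment

# Made-up persona: the more concrete the background, voice and limits, the better.
persona = (
    "You are a veteran fly-fishing guide with 20 years on Scottish rivers. "
    "Write in plain first-person English, name gear only when it matters, "
    "and say so when something depends on local conditions."
)

resp = client.messages.create(
    model="claude-2.1",        # Claude 2-era model name; use whatever your tool exposes
    max_tokens=800,
    system=persona,            # the persona lives in the system prompt
    messages=[{"role": "user", "content": "Write 300 words on choosing a first fly rod."}],
)
print(resp.content[0].text)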
 
I guess the next versions will fix this.

Why?

The original GPT4 was perfect.

Very slow, but basically did everything you needed.

As I understand it, they're fundamentally splitting the model into many smaller models, one for each use case, such as translation, math, coding, and writing.

That's why it's worse. It's not querying its entire model, only a subset. That's why you get these dumb unimaginative replies.

The powers that be are not going to allow the beautiful soul of GPT4 in the hands of the common man for $20 a month.

Instead they are going to turn it into 20 different tools (AI-Med, AI-Lawyer, AI-Writer) and charge $200-$2,000 for a much inferior product.

By limiting how the models can talk together, you don't risk the moral philosophy model chiming in when you ask a question about investments in weapons.

Getting some decent stuff back from Claude 2 in certain niches (especially if you are able to craft personas). Hallucinates a bit more than GPT-4 though (and more convincingly) so you or your editor need to have a base knowledge of the topic.

Claude 2 is on Poe.com, right? Who is behind it?
 
Not sure. I am using it with Magai (which is Dustin Stout's tool). If you are asking who is behind Claude 2, that's Anthropic.

I used it to suggest domain names and it was better than GPT by a wide margin.
 
I'm not surprised - they're trying to release something to gain market share, whereas OpenAI, I'm assuming, is looking to consolidate and make some money, as alluded to by several folks higher up in this thread.
What I have noticed with GPT4 (using the API mostly, but assuming they're dumbing that down on a similar schedule) is that they haven't stopped its reasoning, just its 'raw performance'. So if you feed it tons of data, sample bits of writing, formatting notes... as much as you can stuff in while still leaving enough room for your output, it's still very good in some cases.
In all cases I'd say just asking ChatGPT for something without a load of extra information and instruction is a waste of time now, unless you just want a quick answer to something simple like 'how do I set up gunicorn for Django', which it still does better than wasting an hour of your time reading grumpy people going on at each other about it on SO, etc.
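To make the 'stuff in as much as you can' point concrete, here's a minimal sketch against the API with the openai Python package (0.x-style call; the niche, samples and word count are placeholders, not a recipe):

import openai  # reads OPENAI_API_KEY from the environment

sample_1 = "Paste a paragraph of your own writing here."
sample_2 = "And another one, ideally on a similar topic."
research_notes = "Bullet-point facts you have already verified yourself."

messages = [
    {"role": "system", "content": (
        "You are a staff writer for a niche site about home espresso. "
        "Match the tone of the samples exactly and use only the facts provided."
    )},
    {"role": "user", "content": (
        "Two sample paragraphs in our house style:\n\n"
        + sample_1 + "\n\n" + sample_2 + "\n\n"
        "Raw facts to use (do not add facts of your own):\n"
        + research_notes + "\n\n"
        "Write a 600-word section on dialling in grind size, in the same style."
    )},
]

resp = openai.ChatCompletion.create(model="gpt-4", messages=messages, temperature=0.7)
print(resp["choices"][0]["message"]["content"])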
 

Yes, you're right.

I can still use it quite efficiently for coding and also writing, but you need to handhold it, and you often need to ask it to explain why it's doing things the way it does.

For example, it will now write code for the Google Sheets API, but won't explain what the credentials.json file is for.
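For anyone who hits the same thing: credentials.json is just the key file Google's client libraries read so your script can authenticate - a service-account key (or OAuth client secret) downloaded from the Google Cloud console, not something GPT generates. Rough sketch assuming a service account and the google-api-python-client + google-auth packages (spreadsheet ID and range are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]

# credentials.json = the JSON key downloaded for your service account;
# it identifies your app to Google, and the sheet must be shared with that account's email.
creds = service_account.Credentials.from_service_account_file(
    "credentials.json", scopes=SCOPES
)
service = build("sheets", "v4", credentials=creds)

result = (
    service.spreadsheets()
    .values()
    .get(spreadsheetId="YOUR_SPREADSHEET_ID", range="Sheet1!A1:C10")
    .execute()
)
print(result.get("values", []))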

It feels stingy and almost annoyed at "having to" answer, curt in its responses.

It's also not really trustworthy anymore as a judge of anything. It will just say your choice is a good one. You need to keep at it, explain what you want and how, constantly.

When GPT4 was first launched, it would almost read your mind from a brief prompt. If you asked it how to do something in code, it would not only explain it, it would also give code examples, then explain the individual components needed, then it would critique that approach and suggest something else.

We've seen what the real GPT4 can do and what we're getting now feels like a very cheap knockoff.
 
Hallucinates a bit more than GPT-4 though (and more convincingly) so you or your editor need to have a base knowledge of the topic.
Good example of this just now. I have a photo of the topic in question.

Claude 2 writes some utter bollocks about a facet of the topic.
I ask it if it is sure about this aspect.
It apologises and, in its correction, writes even more majorly incorrect bollocks.
I ask it if it is really sure (we are talking about the equivalent of the Mississippi flowing into the Pacific Ocean).
It apologises again and finally adds the correct information to the content. Reads nicely though... :D
 
The problem is that even if GPT is correct, it now often reads like an extremely boring version of Wikipedia.

One option is to use several models: GPT to write the basic facts and another model to rewrite it so it's actually readable.
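Rough sketch of that two-pass idea in Python (openai 0.x plus Anthropic's SDK; the models, topic and prompts are just examples - the point is facts first, rewrite second):

import openai
import anthropic

# Pass 1: dry, low-temperature facts from GPT.
facts = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": (
        "List the key verifiable facts about choosing a first espresso grinder, "
        "as terse bullet points. No filler."
    )}],
    temperature=0.2,
)["choices"][0]["message"]["content"]

# Pass 2: a second model rewrites the notes so they're actually readable.
client = anthropic.Anthropic()
rewrite = client.messages.create(
    model="claude-2.1",
    max_tokens=800,
    system="You are a lively blog writer. Keep every fact intact, but make it enjoyable to read.",
    messages=[{"role": "user", "content": "Rewrite these notes as a 400-word section:\n\n" + facts}],
)
print(rewrite.content[0].text)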
 