Avoid bad use cases

The easiest way to optimize GPT costs is to not use it 🙂

You might think I am merely joking, but it is already quite clear that people are applying GPT to ill-suited use cases and facing the consequences of their poor choices.

This talk by spaCy founder Ines Montani provides a very structured way to think about this. I recommend taking the time to watch it in full. Not only will that save you time and money in the future, it can also help you avoid unnecessary headaches brought on by using GPT for the wrong use case.

I will elaborate on the four problems she mentions:

1 Specific is better

There are some NLP tasks – e.g. Named Entity Recognition over a large volume of text – where using GPT does not make much sense.

Even leaving aside the cost incurred by token usage, a dedicated NLP library like spaCy lets you be much more specific about the result you want. The entity type, the extracted entity value, and the specific ID of the entity (i.e. entity linking) can all be trained to your custom needs.
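For instance, here is a minimal sketch of a rule-based NER pipeline in spaCy that pins down exactly which entity types and values you care about. The labels and patterns are my own illustrative choices, not from the talk; a real project would typically train a statistical component on top of this.

```python
import spacy

# Build a blank English pipeline with an EntityRuler – no model download
# needed, and every entity type is one we defined ourselves.
nlp = spacy.blank("en")
ruler = nlp.add_pipe("entity_ruler")
ruler.add_patterns([
    {"label": "ORG", "pattern": "OpenAI"},            # exact phrase match
    {"label": "PRODUCT", "pattern": [{"LOWER": "gpt"}]},  # token-level match
])

doc = nlp("OpenAI built GPT.")
entities = [(ent.text, ent.label_) for ent in doc.ents]
print(entities)  # [('OpenAI', 'ORG'), ('GPT', 'PRODUCT')]
```

Unlike a GPT prompt, the output here is fully deterministic and already structured – no parsing of free-form model text required.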

2 Faster is better

It is now well known that the latency of OpenAI requests can be too high for some use cases, such as chatbots (though there are ways to mitigate this problem).
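One common mitigation is streaming: render tokens as they arrive instead of waiting for the full completion, so perceived latency drops even though total generation time does not. Here is a sketch of the consumer side, using a stand-in generator rather than a real OpenAI call (with the real client you would pass `stream=True` and iterate the response chunks):

```python
import time

def fake_token_stream():
    # Stand-in for a streaming API response: yields tokens one at a time.
    for token in ["Hello", ", ", "world", "!"]:
        time.sleep(0.01)  # simulated network delay per chunk
        yield token

def consume(stream):
    # Render each token the moment it arrives – the user sees output
    # almost immediately instead of staring at a spinner.
    parts = []
    for token in stream:
        print(token, end="", flush=True)
        parts.append(token)
    return "".join(parts)

result = consume(fake_token_stream())  # result == "Hello, world!"
```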

Unless OpenAI dramatically improves its latency, you are simply better off using much faster bot frameworks like Dialogflow, as long as the tradeoffs make sense – and they often do.

3 Private is better

There isn’t much need to elaborate on this.

GPT’s accuracy depends heavily on your ability to supply highly descriptive prompts with plenty of context, which makes it a bad fit for use cases where you need to protect your users’ data privacy.
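If you do send user text to a third-party API, one partial safeguard is to scrub obvious PII before it ever enters a prompt. This is an illustrative sketch of my own, not from the talk, and a real system would need far more thorough detection (names, addresses, account IDs, and so on):

```python
import re

# Hypothetical redaction rules – only the most mechanically detectable PII.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched PII spans with a placeholder label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.com or 555-123-4567."))
# Contact [EMAIL] or [PHONE].
```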

4 Better is better

This might seem a little hard to believe if you are only listening to people who hype up GPT, but there are some NLP use cases where GPT will simply not be up to the task. These usually involve algorithmic techniques built on structured NLP outputs. I will provide an example of this in a future lesson.