I've seen a lot of "GPT detection" products floating around lately. Sebastian discusses some of the products and their approaches in this article. Some products claim to have developed an "algorithm...
Brex wrote a nice beginner's guide to prompt engineering.
A low-effort quality-of-life improvement for oncall has been starting a week-long shift on a Friday instead of a Monday. Beginning a weekend with oncall isn't the best, but it's more than offset by...
LMQL is a SQL-flavored programming language for interacting with LMs. It takes a declarative approach to specifying constraints on a language model's output.
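To make the declarative idea concrete, here's a minimal Python sketch of the shape of such a query (not LMQL's actual syntax): a prompt template with a hole, a list of constraints over the filled hole, and a small runtime that retries until the constraints hold. The `call_model` function is a stand-in for a real model call.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Query:
    template: str                               # prompt with a single [ANSWER] hole
    constraints: list[Callable[[str], bool]]    # predicates over the filled hole

def call_model(prompt: str) -> str:
    # Placeholder for an actual language model call.
    return "Paris."

def run(query: Query, max_retries: int = 3) -> str:
    prompt = query.template.replace("[ANSWER]", "")
    for _ in range(max_retries):
        answer = call_model(prompt)
        if all(check(answer) for check in query.constraints):
            return answer
    raise ValueError("no completion satisfied the constraints")

query = Query(
    template="Q: What is the capital of France?\nA: [ANSWER]",
    constraints=[lambda a: len(a) < 40, lambda a: a.endswith(".")],
)
print(run(query))  # "Paris."
```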
marvin's @ai_model decorator implements something similar to what I had in mind for using a language model to extract structured data from unstructured input.
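As I recall marvin's docs from around that time (the exact API may have shifted since), usage looked roughly like this: decorate a pydantic model, then call the class with free-form text and let the language model populate the fields.

```python
from pydantic import BaseModel
from marvin import ai_model

@ai_model
class Location(BaseModel):
    city: str
    state: str  # two-letter abbreviation

# Calling the decorated class with unstructured text asks the LM to fill in
# the fields (requires an OpenAI API key to be configured).
Location("The Big Apple")
# -> something like Location(city='New York City', state='NY')
```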
Restricting the next predicted token to adhere to a specific context-free grammar seems like a big step forward in weaving language models into applications.
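The mechanism is easy to sketch: at each decoding step, mask out every token the grammar cannot accept next, then pick from what remains, so the output is valid by construction. The toy below uses a hand-rolled `allowed_next` table in place of a real incremental CFG parser, and `fake_logits` in place of a real model.

```python
import random

VOCAB = ["{", "}", '"ok"', ":", "true", "false", "banana"]

def allowed_next(prefix: list[str]) -> set[str]:
    # Tiny stand-in grammar: { "ok" : (true | false) }
    expected = [{"{"}, {'"ok"'}, {":"}, {"true", "false"}, {"}"}]
    return expected[len(prefix)] if len(prefix) < len(expected) else set()

def fake_logits(prefix: list[str]) -> dict[str, float]:
    # Stand-in for a model: random scores over the vocabulary.
    return {tok: random.random() for tok in VOCAB}

def generate() -> str:
    prefix: list[str] = []
    while True:
        allowed = allowed_next(prefix)
        if not allowed:
            break
        scores = fake_logits(prefix)
        # Mask: only grammar-legal tokens keep their scores.
        legal = {tok: s for tok, s in scores.items() if tok in allowed}
        prefix.append(max(legal, key=legal.get))  # greedy over masked scores
    return " ".join(prefix)

print(generate())  # e.g. { "ok" : false }
```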
Using system prompts provides an intuitive way to separate the input and output schema from the input content itself.
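Concretely, the schema and instructions go in the system message while the user message carries only the content to be processed. A sketch using the OpenAI chat completions API as it looked at the time (the model name and schema here are just for illustration):

```python
import openai

# System message: instructions plus the output schema.
# User message: only the content to extract from.
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": (
                "Extract the city and state from the user's text. "
                'Respond with JSON matching {"city": string, "state": string}.'
            ),
        },
        {"role": "user", "content": "I grew up just outside of Austin, Texas."},
    ],
)
print(response["choices"][0]["message"]["content"])
```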
With the support of GPT-4, I feel unstoppable. The overnight surge in productivity is intoxicating, not for making money or starting a business, but for the sheer joy of continuously creating ideas...
I wrote a few paragraphs disagreeing with Paul's take, asserting that, like Simon suggests, we should think of language models like ChatGPT as a “calculator for words”.