(2023-05-24) Udell When The Rubber Duck Talks Back
Jon Udell: When the rubber duck talks back. The pre-release version of the plugin consolidated pagination for many tables in one place. That was a good thing, but the downside was that a single Steampipe table stood in for what should have been many of them.
This set up an interesting comparison. ChatGPT-4 builds on OpenAI’s LLM; Sourcegraph’s Cody, on the other hand, uses Anthropic’s Claude.
Another key difference is that ChatGPT has only the context you paste into it. Cody, sitting inside VS Code, can see your repository and brings all of that context to bear.
In both cases, unsurprisingly, it wasn’t enough simply to ask the tools to consolidate the pagination logic. They were perfectly happy to propose solutions that could never work and might not even compile.
The key insight
Then came the insight: the helper functions could stream results directly to Steampipe, and just return nil or err to the calling List function.
Partnering with machine intelligence
I came away with a profound sense that the real value of these assistants isn’t any particular piece of code that they get “right” or “wrong” but rather the process of collaborating with them. When you’re working alone, you have an ongoing conversation with yourself, usually in your own head. The point of talking to a rubber duck is to voice that conversation so you can more effectively reason about it. (rubber-ducking)
When the rubber duck talks back, it’s a whole new game.
As Garry Kasparov famously wrote: The winner was revealed to be not a grandmaster with a state-of-the-art PC but a pair of amateur American chess players using three computers at the same time. (centaur)