AI Copilot Interfaces

Tools that make me less smart

ChatGPT is the most powerful tool I’ve used in my career. It’s incredible. I feel so lucky to be living in a world where my work is supercharged by machines that think. I use it for most of my daily work, especially programming. I hate to admit it, but as a lazy programmer, I often use ChatGPT to bootstrap projects, functions, components, etc. Examples:

  • NextJS/Tailwind app? -> “Create a landing page for a nextjs/tailwind app, flex, two columns, stacked on mobile”
  • FastAPI server? -> “Write a basic fastapi server with the endpoints” (see the sketch below)

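To make that concrete, here’s a minimal sketch of the kind of scaffold the FastAPI prompt tends to come back with. The Item model and /items endpoints are hypothetical stand-ins for illustration, not output from a real session:

```python
# Hypothetical example of a ChatGPT-style FastAPI scaffold.
# The Item model and /items endpoints are invented for illustration.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    price: float


# In-memory "database": exactly the kind of placeholder these scaffolds ship with.
items: dict[int, Item] = {}


@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    items[item_id] = item
    return item


@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    return items[item_id]
```
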
I then find myself going back and forth with ChatGPT as I start building on those foundations. It’s taken a while, but I’ve realized that I’m not a fan of this mode of interaction because it actively makes me dumber: copying and pasting blobs of code is not good for my brain (sounds obvious, but I didn’t see it this way for a while). It’s so tempting because those blobs look 90% correct, but there are so many reasons why this is a terrible idea.

Tools that make me smarter

After using some other tools, I noticed that this feeling is not universal. Some tools truly feel like a copilot for my work, and I think that’s because they are either deeply integrated into whatever tool I’m doing my work in, or they are the tool itself and keep me in the driver’s seat. Examples:

  • GitHub Copilot. I use it inside VS Code and find it incredibly helpful for asking questions about functions, using tab autocomplete, etc. I always feel like I have control over the code I’m writing, and it feels like I’m being supercharged while still flexing the muscle that makes one better at something.
  • Perplexity. I use Perplexity for about half of my searches (a share that is rapidly approaching 100%) and find it so much better than ChatGPT for this because it is less open-ended and deeply integrated with a web index.
  • Krea*. I started using Krea for basic image generation, but because it is itself a design canvas, it’s easy to keep working with images once they are generated. Its deep integration with other design tools also really helps.

With all the above said, there are also some AI tools that I find myself not using - Notion AI is a good example. I find myself pretty consistently going back to ChatGPT when I’m writing Notion docs. I think it’s something to do with the UX of how Notion generates text. I’d prefer more of a copilot-style experience (tab autocomplete, for example), as well as a sidebar that keeps track of AI suggestions rather than a floating pop-up window. My guess is I’m not using Notion AI because that feeling of being in the driver’s seat is not there (yet; Notion is a wonderful product and I have no doubt I will be using their AI features in the future).

Future of AI/copilot interfaces

An interesting question to ask: which is the right approach? If a language model is so much better than me at performing a task, why should I be in the driver's seat? As LLMs become more powerful, I’m curious how this paradigm evolves. Some tools I’d love to see a copilot-style experience for:

  • Spreadsheets. I’m surprised there isn’t a crop of AI-native spreadsheet companies popping up. I like Rows’ approach, but admittedly have not played around with it deeply.
  • Vertical knowledge work. Most knowledge work in a given vertical requires specialized tools, knowledge, data, etc. I am in the camp that there should be vertical copilots for any industry with complex knowledge work setups. GitHub Copilot was the first example of this that I came across (code); Perplexity is another (general search - not really a category, but still). I’m sure one will crop up for finance (check out Lynq*), scientific research (Elicit), and many, many other categories.

If you're working on any ideas in these spaces, I'd love to chat! I'm luke (at) pebblebed (dot) com.

* I'm an investor in Lynq & Krea