Esther speaks to community member Donald about Directus Copilot, a panel extension that allows users to ask contextual questions about their data within Insights dashboards.
Speaker 0: Getting a back end that is customer friendly out of the box is huge.
Speaker 1: Hi, everyone, and welcome to another episode of the Beyond the Core show. The Beyond the Core show is really a Directus show that shines a spotlight on extension developers in the community. And today, I have a special guest with me. I have Donald. He is the winner of the past Directus AI hackathon.
And, yeah, he's just gonna be sharing about his journey on implementing the Directus Copilot extension. My name is Esther, and I work as a developer advocate at Directus. So thanks for joining me, Donald. Would you like to introduce yourself?
Speaker 0: Thanks for having me, Esther. Yeah. So I'm a software developer. I've been doing consulting for about 8 years now. And, you know, one of my niches is helping people move from WordPress to scale their business when they need custom software.
And, you know, I found Directus is a crucial part of that now. I think it's one of the best headless CMSs out there.
Speaker 1: That's good to hear.
Speaker 0: Yeah. So we can talk more about that later, but, yeah, I happened to enter the AI hackathon. I think it was maybe the first one that you guys ran. Yes.
Speaker 1: It was the first one.
Speaker 0: Yeah. And so I I don't know. I just like, I'll take a shot at it, and, it was a lot of fun. Give me a good excuse to try out some of these new AI technologies. And, Yeah.
Yeah.
Speaker 1: Yeah. Nice. We'll be diving into that very soon. And I'm just curious to know, how did you get to know about Directus in the first place?
Speaker 0: So I had heard about headless CMSs, and I had a client project that needed one. And I evaluated pretty much all of them out there. Mhmm. You know, some of the other popular ones are, like, Strapi and Contentful. Yeah.
Directus checked all the boxes for me. So, you know, open source. It sits on a Postgres database, which is really nice. I mean, you can connect it to any database. And then I looked at the code, and it was all TypeScript and Vue, like, very well written, had a great community.
Yeah. So I think I did, like, a pull request, and then everyone was super nice to me. Because when I was evaluating, I found, like, a bug. And so I just found that the whole community around it was really good too. So, yeah, that's how I found out about Directus.
I ended up using it for a client project after evaluating a bunch of other headless CMSs.
Speaker 1: You worked on the Directus Copilot extension, and just looking at it, it looked really amazing. I really loved it. So would you like to tell us more about this extension and, yeah, what does it do exactly?
Speaker 0: Right. So it basically connects Directus to OpenAI's chat, and Mhmm. It allows you to use kind of like a chat interface to ask questions about your database. And so, you know, the idea behind it was, for someone that's nontechnical, it would be nice if they could formulate requests to the database just using a chat type of window.
So, yeah, that was kind of the idea behind it from the user perspective. You know, what I was really thinking when I was approaching it is, what can I get done in a short amount of time to be able to enter the hackathon? And Directus had some very handy features for connecting all those things together. So that was another aspect of it.
Speaker 1: Nice. About how long did it take you to, you know, complete the extension? Did it take long?
Speaker 0: No. It it came together pretty quickly when I discovered the pieces that I needed to put together. I think it was about, like, a day and a half.
Speaker 1: Oh, that's fast. Yeah.
Speaker 0: Yeah. So, like, a little a little over a day.
Speaker 1: Okay. Okay. Nice. So would you like to just walk us through maybe some parts of the code and then a quick demo of the extension? I guess you could share your screen.
Speaker 0: Right? Okay. So the main feature that I used in Directus was the specifications service. So there's an API endpoint in Directus that gives you back an OpenAPI schema that basically describes all the endpoints that you can call in the API and, you know, all the parameters that those endpoints take. Mhmm.
And that OpenAPI schema happens to fit perfectly into OpenAI. There's this feature of OpenAI called functions. And so when you submit an OpenAI request, you can tell it, hey, here's a bunch of functions you could call, and here's the schema for them. And Nice.
So then you send that along with your prompt and say, hey, you know, how many orders did I have today? And then it looks in all those functions and picks the right one to call. It tells you that back, so it's kind of two requests to OpenAI. So, you know, the first one so this is the endpoint for the chat interface.
So first, it just hits Okay. Oops. It hits ask, and it basically asks OpenAI, what endpoint should I call? OpenAI responds with the API that should be called, with the parameters that will meet the needs of the prompt. And then there's another call that actually makes that API call and sends the result back to OpenAI. And then, once it has the result, OpenAI parses that result and gives you
Speaker 1: I see.
Speaker 0: A natural-language version of the result. It describes the result for you, essentially. Nice. So this is kind of as basic as you can get with, like, a, you know, an OpenAI agent type of thing. It's not really an agent. It could do a lot more fancy things.
Like, you know, right now, it basically has to answer your question in one API call. It can't, like Okay. Call multiple APIs yet and then, like, combine those together somehow. But potentially, it could do something like that. But this is just kind of a
Speaker 1: a program. One call at a time?
Speaker 0: Right. It's like a one shot. So OpenAI has to be able to get it in one shot.
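The two-request flow Donald describes could be sketched roughly like this. The `llm` and `callApi` parameters are injected stand-ins (assumptions for illustration, not the extension's actual code), so the sketch runs without hitting OpenAI or Directus:

```typescript
// Round trip 1: ask the model which endpoint fits the question.
// Round trip 2: feed the raw API result back for a natural-language answer.
// Both dependencies are injected so the flow itself stays testable.

type Llm = (prompt: string) => Promise<string>;
type Api = (endpoint: string) => Promise<unknown>;

async function ask(question: string, llm: Llm, callApi: Api): Promise<string> {
  // First OpenAI call: pick the Directus endpoint that satisfies the prompt.
  const endpoint = await llm(`Pick the API endpoint for: ${question}`);
  // Execute that endpoint against the Directus REST API.
  const result = await callApi(endpoint);
  // Second OpenAI call: describe the JSON result in plain language.
  return llm(`Describe this result: ${JSON.stringify(result)}`);
}
```

In the real extension, the first call also carries the OpenAPI-derived function schemas so the model can only pick from known endpoints.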
Speaker 1: Okay.
Speaker 0: And, yeah, I guess there's just, you know, the UI code. Mhmm. But, yeah, there's, like, a service that interacts with OpenAI here. And so you can see, like, you know, what the prompt is. So there's, like, a base prompt.
Trying to remember what my base prompt was here. Oh, okay. Yeah. It's something like this.
Speaker 1: Okay. That's the base prompt just before you type anything extra.
Speaker 0: Exactly. So this this gets put at the top of the prompt to OpenAI.
Speaker 1: Nice.
Speaker 0: And, yeah, that's pretty much all there is to it. I use something called LangChain.
Speaker 1: Yeah. I'm interested. What's LangChain used for?
Speaker 0: So LangChain helps you build, like, a chain of prompts. You know, it can get pretty fancy, to where you can have, like, an agent-style thing with multiple branches of, you know, prompts and answers that kinda go back and forth, like a graph of what the agent could do. You know, this is a very simple chain where it's just, you know, an API call with a prompt and then, you know, like I explained before. But LangChain helps you build out those workflows for the AI. And the nice thing about it is you can combine a bunch of different LLMs. So, like, you could choose GPT-3.5 for some steps and GPT-4 for, like, harder steps where it needs to be smarter, I guess you could say.
Okay.
Speaker 1: So yeah. Yeah. So, I have an interesting question. Would we still have, like, implemented this copilot without necessarily using LangChain, or is it an optional, like, tool?
Speaker 0: Yeah. Yeah. I think you could've. Yes.
Speaker 1: Okay.
Speaker 0: I was just using it because it
Speaker 1: It optimizes the experience and the performance more like. Right?
Speaker 0: Yeah. And it was just a little easier than trying to build it all myself. Yeah.
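LangChain's core idea, a chain of prompt steps where each step could even use a different model, can be sketched without the library at all. Everything below is illustrative: the step functions are toy stand-ins for real LLM calls, not LangChain's API.

```typescript
// A chain is just an ordered list of transforms. In LangChain each step
// would be a prompt to some model (GPT-3.5 for easy steps, GPT-4 for
// harder ones); here the steps are plain functions to stay self-contained.

type ChainStep = (input: string) => string;

function runChain(input: string, steps: ChainStep[]): string {
  return steps.reduce((value, step) => step(value), input);
}

// Toy two-step chain: normalize the question, then "route" it.
const chain: ChainStep[] = [
  (q) => q.trim().toLowerCase(),
  (q) => (q.includes("orders") ? "call readOrders" : "call readProducts"),
];
```

Libraries like LangChain add retries, per-step model selection, and branching on top of this basic shape.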
Speaker 1: Yeah. Nice. Alright. Cool. Like, I guess we can go into, like, Directus to see how the panel works.
Speaker 0: Yeah. This is the Insights tab, and
Speaker 1: Yeah.
Speaker 0: I've set up a little dashboard here and added my component. So that's what this extension is. It's just a, yeah, it's just an Insights panel.
Speaker 1: Mhmm.
Speaker 0: So Insights panel here, and it has some settings. So here's my key, which I'll have to revoke after the call, but you can select the model that you wanna use. Okay. Put your OpenAI key in. It'll get it from the environment if you don't put it in there.
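The key fallback Donald mentions (panel setting first, environment second) might look something like this; the option and variable names here are assumptions, not the extension's actual identifiers:

```typescript
// Prefer the key typed into the panel settings; otherwise fall back to
// an environment variable. Throws if neither is configured.

function resolveApiKey(
  panelKey: string | undefined,
  env: Record<string, string | undefined>,
): string {
  const key = panelKey?.trim() || env["OPENAI_API_KEY"];
  if (!key) throw new Error("No OpenAI API key configured");
  return key;
}
```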
And, yeah, then you could just start asking questions. So I've got a pretty simple database here.
Speaker 1: Data. Okay.
Speaker 0: Like, a customers table with some products, and I've got, like, some orders. It's kinda like an invoice or something. So maybe we could try
Speaker 1: Yeah. Let's see the products. Maybe I could just, like, ask, like, what is this product's price?
Speaker 0: Yeah. And do you wanna go shopping and pick some toys?
Speaker 1: Yeah. Yeah. Sure. Maybe the Rubik's cube and, yeah. We could say how much is the Rubik's cube, something like that.
Speaker 0: Okay. Yeah. I'll create an order for you too. So Okay. I'll have to make you a customer though.
Speaker 1: Okay. Let's do it.
Speaker 0: Is that
Speaker 1: Yeah. That's it.
Speaker 0: And then save there. Yep. Okay. So make you an order. Let's get Esther.
And today Yeah. We'll add, oh, yeah. You wanted a Rubik's cube.
Speaker 1: Yeah.
Speaker 0: Okay. Rubik's cube. 'Kay. Looks like it's oh oh, no. What?
2? Okay. We'll just do 1.
Speaker 1: Okay. Let's do 1.
Speaker 0: And so let's see. I think I have these. You don't have to pay any tax.
Speaker 1: Nice.
Speaker 0: And click save. Okay. So now you've got an order here. I just made your order. So here's order number 2.
Okay. Okay. Sorry. I'm just making sure this is correct here.
Speaker 1: Yeah. That's fine.
Speaker 0: Something seems a little strange. Sorry.
Speaker 1: Is it fact that it's just one order item?
Speaker 0: Yeah. Because I thought I don't know if maybe I clicked in here. Did I add a new order item? I just thought
Speaker 1: You added the one for Rubik's cube.
Speaker 0: Okay. And then I thought oh, I did I added an existing, I think, is my problem. I added an existing item. So I guess I cannibalized another order.
Speaker 1: Okay.
Speaker 0: I'll just make someone else buy this. So that it makes sense.
Speaker 1: Yep. So let's
Speaker 0: Sorry.
Speaker 1: So now
Speaker 0: we have So now we have two orders.
Speaker 1: Now we have two orders. Yeah.
Speaker 0: Okay. So if we go back to the chat window, I think you had asked how much is a Rubik's cube.
Speaker 1: Rubik's cube view.
Speaker 0: I don't know if it will be able to do this one.
Speaker 1: Let's see.
Speaker 0: Oh, okay.
Speaker 1: It did it. Yeah.
Speaker 0: It did it. Okay. So Yeah. Keep in mind that price is in cents. Okay.
Here. Well, I'll have to ask it again. So how much is a Rubik's cube? Keep in mind prices are in cents.
Speaker 1: In cents.
Speaker 0: It should say $7 now. Okay. There we
Speaker 1: go. Dollars. Yeah.
Speaker 0: 700¢. $7. Okay.
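Since the prices here are stored as integer cents, which is what tripped up the model, a formatting helper on the way out would avoid relying on the prompt to explain the unit. This is a hypothetical helper, not part of the extension:

```typescript
// Convert integer cents to a display string, e.g. 700 -> "$7.00".
// Storing money as integer cents avoids floating-point rounding in the
// database; the division only happens at display time.
function formatCents(cents: number): string {
  return `$${(cents / 100).toFixed(2)}`;
}
```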
Speaker 1: Yeah. Makes a lot of sense. Yeah. It's really smart. It's using OpenAI, so it's really smart.
Speaker 0: Yeah. Yeah. It can be pretty surprising with what it can do. You know, one thing I realized when building this is the OpenAPI, not to Mhmm. So it's kinda confusing because there's OpenAPI and then OpenAI.
But the
Speaker 1: the OpenAPI Okay. Schema,
Speaker 0: is generated, basically, from the way we create fields and things in Directus. So, you know, when you're creating a field, it can be helpful to, like, give it semantic information, because that could help inform the AI, if that makes sense. So I tried to, like, have a field description to say that the price was in cents here. So that might be but I don't think that passes through to the schema
Speaker 1: yet. Yeah.
Speaker 0: But if we could pass that through the schema, it might be able to get that in one shot. But
Speaker 1: Yeah. Possibly. Possibly. Do you wanna do another prompt?
Speaker 0: Yeah. Let's do that. Okay. What did you have in mind?
Speaker 1: Nothing.
Speaker 0: Okay. Could say something like how many orders did
Speaker 1: I have today? Yeah.
Speaker 0: Okay. So it didn't get this one. The product does not contain information about orders.
Speaker 1: Maybe we can modify it to how many orders do I have instead of today. Okay. Let's see.
Speaker 0: I wonder why it's not Should
Speaker 1: we call it order items? No?
Speaker 0: Well, I wanted it to list how many orders it has here.
Speaker 1: Okay.
Speaker 0: But, you know, it's very possible that I'm doing something wrong, and it is a prototype after all.
Speaker 1: Yeah. It says.
Speaker 0: But it has been able to answer questions like these in the past. I'm not sure why. Let me make sure I'm using GPT-4. Okay. Yes.
We could try GPT-3.5.
Speaker 1: Let's see. Yeah. 3.5. Okay.
Speaker 0: How many total orders do I have? Okay.
Speaker 1: Yeah. So it got So GPT-3.5 worked. Okay. Could it be an issue with the model? Not sure.
Speaker 0: Yeah. I'm not sure what's tripping it up, really. It seemed like it wasn't finding the right API to call.
Speaker 1: Okay. Yeah. So I'm guessing when you were developing this extension, you also added the error message and the fetching state. Like, when it's fetching and it says calling the API, you're the one that programmed that, here.
Speaker 0: Yes. So this is the once you send it this prompt Mhmm. And it gets the response, at this point it has either successfully chosen an API to call or it doesn't understand how to fulfill your request.
Speaker 1: Fulfill them. Okay. Thank you. That makes sense. Okay.
But it's really powerful. This is a prototype, and I'm sure there are other things that can be improved, like just making the experience a lot smoother. But this is really amazing, just seeing how we can integrate OpenAI and, you know, create an extension to just query your orders instead of having to.
Because, you know, if you have a long list of so many orders or so much data, you can easily just use this panel extension to find the actual detail for that data instead of going through the entire spreadsheet or database. So this is definitely very useful. And what were some key considerations or things that you had to, you know, modify when using the OpenAI API and the OpenAPI schema, I always trip on that, it's so confusing, to develop these extensions? Were there certain considerations and things you had to modify?
Speaker 0: Yeah. So, you know, I mean, one of them was actually with LangChain. So LangChain was originally developed in Python, and I needed the JavaScript version. So there's a JavaScript version that's kind of trailing along behind the Python version.
Speaker 1: Thank you.
Speaker 0: And, at first, it wasn't parsing the OpenAPI schema correctly, and so it wouldn't work. But that's another good community over there. I was able to submit a pull request to fix that little bug, and they merged it, and now it works. Yeah. Another one was you can only send a limited number of functions.
So I tried to reduce what functions I was sending in the schema. So, basically, I only allow OpenAI to call GET methods. So that trims down the functions to only APIs that can read data. And if you have a large number of collections, though, that's still gonna be too many, because I think you maybe only have 40 functions that you can send it. So that is just kind of an inbuilt limitation right now.
You'd need something like a way to organize or, like, to pick the most relevant API calls for your prompt. And one way to do that is with vector embeddings. You create vector embeddings for all of your functions and, like, the description of the function, and then use that to select the most relevant functions to send to OpenAI. But Directus doesn't support that yet, but it could in the future, which would be awesome. I've seen some chatter on GitHub about that.
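The embedding idea Donald sketches would work like this: embed each function description once, embed the incoming prompt, rank by cosine similarity, keep the top k. The embedding vectors would come from an embeddings model; this sketch (with made-up function names) only shows the selection step:

```typescript
// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank candidate functions against the prompt embedding, keep the top k.
function topK(
  prompt: number[],
  fns: { name: string; embedding: number[] }[],
  k: number,
): string[] {
  return fns
    .map((f) => ({ name: f.name, score: cosine(prompt, f.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((f) => f.name);
}
```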
So that's another limitation. And, you know, limiting it to GET requests if I could open it up to POST requests, there could be some interesting things that it could do. Like, for example, you could ask it, hey, could you create some mock data for my customers table? And OpenAI is pretty good at creating data like that.
So if you allow it to write to your database, which I don't think many people wanna do. But it kinda opens up some interesting possibilities. But, yeah, I would say those two things were probably the most difficult: the LangChain bug and then figuring out how to pare down what functions get sent to OpenAI.
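The trimming Donald describes, keeping only GET operations from the spec and capping the count, could look roughly like this. The shape of the spec object is simplified, and the default limit of 40 is just the rough number from the conversation:

```typescript
// Keep only GET operations (read-only access) and stop once we reach the
// function limit OpenAI will accept.

type Paths = Record<string, Record<string, { operationId?: string }>>;

function getOnlyFunctions(paths: Paths, max = 40): string[] {
  const names: string[] = [];
  for (const [path, methods] of Object.entries(paths)) {
    const op = methods["get"];
    if (!op) continue; // skip POST/PATCH/DELETE so the model can't write
    names.push(op.operationId ?? `get_${path.replace(/\W+/g, "_")}`);
    if (names.length >= max) break;
  }
  return names;
}
```

Restricting to GET is a safety choice as much as a size one: the model can never be tricked into mutating data it shouldn't.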
Speaker 1: Yeah. Yeah. That makes sense. But it's definitely incredible to see how much you're able to pull off. Like, just pulling off this in a day and a half.
Like, it's amazing. And, yeah, I'm excited to probably see this in the marketplace when we eventually launch, and see all the incredible work that you'll also be doing in the Directus community. So, yeah, thanks for sharing, and, yeah, excited about what you do next. Alright. So before we wrap this up, I would just like to ask you an exciting question.
I know when we were starting this, you talked about how your well, I'd say your mission is to, like, migrate people from WordPress to Directus, and, you know, you've used Directus and you've enjoyed it. So what would you say is the most exciting Directus feature that you've what's that thing that you may love about Directus? Or if there's more than one, feel free to share.
Speaker 0: Okay. I mean, getting a back end that is customer friendly out of the box is huge for me. So that just saves so much time, not having to build that. And so that's very powerful. And then my favorite feature is probably Flows.
I've actually been able to do a lot with Flows. From email reminders and all these I've got, like, half a dozen email flows set up on one of my projects, and I've been really impressed with Flows. Yeah. It's a very powerful system for just developing new features that even your customer can do themselves. So, you know, it's kinda like a no-code solution, but you can add a little bit of code to really ramp up the abilities. So yeah.
Speaker 1: Okay.
Speaker 0: Flows is my favorite feature, I think.
Speaker 1: Yeah. It's interesting you mentioned that because I actually really love Flows as well. It's so useful. And, you know, just combining that with your data, there's so many amazing things that you can do with Flows instead of maybe keying into a third-party app to just handle, like, a basic flow. Directus can do that for you.
So, yeah, it's amazing to hear. Alright, Donald. It's been a pleasure just having you come talk to us about your extension and everything that you you know, the key points that you took notice of when developing it. So I'm excited about what you do in the community, and, yeah, feel free to always drop by in our Discord community. I know you're an active member already, so feel free to do more.
And should we expect more extensions from you in the future?
Speaker 0: Yeah. I think so.
Speaker 1: Okay.
Speaker 0: Yeah. I'd like to help out with some of the vector database extensions. So if anyone's out there that is thinking about doing that, I'd love to chat.
Speaker 1: It's the not up.