The web is full of data. We'll show you how to extract it for AI, automation, or whatever you're building.
Speaker 0: Alright, folks. Welcome to another session here at Directus Leap Week. I am joined by Leo Gregorio from FireCrawl. We've got some exciting stuff for you today. We're gonna be taming unstructured data.
Right? Turning the wild web into ready APIs, using FireCrawl and Directus to serve those APIs. I am Brian Gillespie, a staff product engineer here at Directus. And, again, joined by special guest, Leo Gregorio from FireCrawl. Leo, maybe I'll kick it over to you for a quick intro for everyone.
Speaker 1: It's a pleasure to be here, Brian. I've used Directus for quite a long time, and I'm stoked to be here.
Speaker 0: Yeah. I'm super excited to have you, man. Yeah. I've messed around with FireCrawl a little bit before this. I agree.
I'm in no way an expert. I'm glad to hear that you've messed with Directus before. But for the folks who are new to FireCrawl, can you take a few moments just to explain what the tool is? Like, what do you guys do? What do you enable for developers?
Speaker 1: Sure. Over at FireCrawl, we specialize in providing web data to agents. Not only agents: we also have endpoints where you can build your own web data fetcher on top of those APIs. But we mainly focus on providing data to agents. It's basically giving eyes to your Claude Code instance, or whatever you're really building, to fetch data. You should be thinking of FireCrawl to do that.
Speaker 0: Sick. What are, like, some of the, like, specifics within that? Like, what what capabilities does FireCrawl have?
Speaker 1: Yeah. To fetch data, I like to compare it with the manual way of fetching data. So if you want to fetch a specific website, what you would do is search for something. So we have a specific endpoint for searching. You can then use scrape to scrape that specific URL.
And from the scrape endpoint, we have a bunch of different formats that we'll likely be exploring through this presentation. One is screenshot: you can actually screenshot the entire page, and there's a bunch of different ways to take that screenshot. It can be exactly what we're seeing right now, it can be the entire page, or mobile.
So that's just an example of one of our formats. We can fetch a summary. We allow agents to execute actions and interact with the page, and that helps a lot with context, for example. So you're not only fetching pure HTML. You can fetch clean markdown.
You can fetch exactly what you need from that given page, and that's the beauty of it. That's what we try to optimize for. And, yeah, we've been pretty successful while doing so.
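The capabilities Leo lists here, scraping a URL and picking your output formats (screenshot, summary, clean markdown), boil down to a single request. A hedged sketch follows; the endpoint path, format names, and option names are assumptions based on this conversation, so check FireCrawl's current API docs before relying on them.

```javascript
// Sketch of a FireCrawl scrape request (assumed v2 endpoint shape).
const FIRECRAWL_API = "https://api.firecrawl.dev/v2/scrape";

// Build the request body: one URL plus the formats we want back.
function buildScrapeRequest(url, formats) {
  return { url, formats };
}

const body = buildScrapeRequest("https://example.com/pricing", [
  "markdown", // clean markdown instead of raw HTML
  { type: "screenshot", fullPage: true }, // one of the screenshot variants mentioned
]);

// The actual call would be a plain POST with a bearer token:
// await fetch(FIRECRAWL_API, {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify(body),
// });
```

The commented-out fetch is left inert here since it needs a real API key; everything above it is just the payload shape.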
Speaker 0: Yeah, man. I've watched the growth over the last couple years, so kudos on that. It's a really interesting tool. And as I understand it, you guys have an agent endpoint as well. So you could basically call your agent from a different agent,
and get really meta with it, which is nice. I love the ability to just throw a URL in here and get structured data out the other side, because it complements Directus perfectly.
And what we do is, if you give us structured data, we'll give you ready-to-go APIs and permissions and all of that. So, super excited for this. Alright. Let's get down to brass tacks. Right?
What are we gonna build today? So the scenario that I've got: basically, I've got a directory of different software. This is on my own site, Better Sign Shop. I love the print and sign industry.
I can't get out of it even though I've tried. Software comes up a lot. I guess it does in any industry these days. But for sign and print, how do they manage their business? There's a bunch of different tools for that.
And we've got a directory here that is sorely unmaintained. Right? All this data was collected manually some years ago, and we've got things like features of a particular software. On some pages, we have pricing information. But it's a lot to manage all this personally.
It would be a lot to put on the vendor of the software to maintain. It'd be a lot for anybody on my side to say, okay, keep all this up to date. Does this seem like a good use case for FireCrawl?
Speaker 1: It's a perfect use case. Since we provide structured data, you can fetch that, and there are plenty of ways to do it. So if you already have the specific URL, you can use scrape, or you can use the agent that you mentioned. And by the way, the design of that website is pretty neat.
Speaker 0: Okay. Thank you, man. Yeah. I've got three daughters. They love pink and purple.
So, hey, anything that I do personally has to have pink in it. Right? Alright. So let's talk through maybe how we're gonna build this real quick, and then we'll dive into actually going through it. Right?
So we've got, okay, we've got a software collection. You know, mostly I wanna have pricing data. We've got a name. Let's just call it a summary.
We're providing a URL. What else do we have? The pricing data has to have a specific format, which should be good. Yeah, maybe pricing is the best place to start.
So, like, that is going to be... I don't know the pricing URLs for this, but I do have the domain. So walk me through what this is gonna look
Speaker 1: like on the FireCrawl side. So from your website, it seems like you already have specific URLs. It doesn't seem like the idea is having a bunch of different URLs. It seems like it's focused on quality. Mhmm.
Therefore, if we have the specific pricing page, it would be better, just because we can hit just one endpoint. It would be just scrape. So if we have that in the database, we can just fetch that specific URL and get everything, all the structure, the price; we can track if the price changed, everything. If we don't, if the idea is scaling this, so for example, if your website were to let anyone add any type of software there... Mhmm. This would scale up.
And we wouldn't have control over what the URL path is, the pricing page, the URL for everything. Right. And then we'd have to create some kind of system where we search and then actually find the pricing page. We can either do that with crawl, because inside of crawl we have an LLM, or we can use agents to grab everything.
Speaker 0: I got you. What do you recommend in this situation? I'll just go back to FireCrawl here and go to my dashboard.
Speaker 1: I'd recommend going for the pricing page, and we can test that inside of the FireCrawl playground. Let me send the link over.
Speaker 0: I will look here in the chat. Here it is. Alright.
Speaker 1: So grab any pricing page that we already have for one of those software entries you're
Speaker 0: Aware of. Okay. I think I think this is the right one. Let me make sure. Pricing.
Speaker 1: Okay.
Speaker 0: Yep. There we go.
Speaker 1: Nice. So
Speaker 0: we're just gonna throw this
Speaker 1: in. Perfect. As for the format, you can select, let's see. Jason.
Speaker 0: Yeah. Let me zoom back out just a little bit. Alright. So we're gonna
Speaker 1: hit edit options. Perfect. And that's basically an LLM that understands what it's seeing inside of the page, and you can prompt it to fetch only the pricing data from that specific page.
Speaker 0: Gotcha. Okay. So let me pull this up over here. Oh, format JSON. Alright.
So we're gonna edit our options. Right? And here is just basically going to be what I want from the pricing data. Right? So we are going to find this particular one in here.
We're gonna look at this and alright. So Perfect.
Speaker 1: You already have the schema?
Speaker 0: Like, can I just paste JSON here, and it'll pick this up? Or no?
Speaker 1: I don't think you can. It's gotta be
Speaker 0: JSON schema. Okay.
Speaker 1: But that's a nice idea. We should add a way to grab an example and produce the schema. That's something.
Speaker 0: Starting price, that's gonna be a number. Actually, it's a string.
Speaker 1: I think the first one should be plans, and it should be an array. Okay.
Speaker 0: It should be right. Nested dot. Yeah. Got you.
Speaker 1: And then it's an array of objects.
Speaker 0: Objects. Perfect. Starting price. And we want pricing
Speaker 1: period. Oh, no. For each Actually, inside of the the array of objects, it would be, like, the the plants. Oh, yeah. I think you I think you were go going right, and I yeah.
We reversed it. I was just, like, the array.
Speaker 0: Yeah. Sorry. No. All good. I started typing it, and I realized it as well, man.
No worries. It's usually me fat-fingering typos on these things instead of actually typing something completely wrong. But, billing cycle. Cool. Billing period.
And then we have features, which is gonna be an array itself, an array of strings. I've got highlight, but I don't think we really need highlight in this case. Okay. Hit save options. Okay.
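The schema being assembled on screen, a plans array of objects, each with a starting price, billing period, and features list, would look roughly like this as JSON Schema. The field names below are guesses at what was typed, not copied from the demo, so adjust them to your own data model.

```javascript
// Rough reconstruction of the playground extraction schema.
// Field names (starting_price, billing_period, features) are assumptions.
const pricingSchema = {
  type: "object",
  properties: {
    plans: {
      type: "array", // one entry per pricing plan
      items: {
        type: "object",
        properties: {
          starting_price: { type: "string" }, // kept as a string at this point in the demo
          billing_period: { type: "string" },
          features: { type: "array", items: { type: "string" } },
        },
      },
    },
  },
};
```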
Speaker 0: Yeah. Cool. So anything else that I need to hit?
Speaker 1: We can hit start scraping, and let's see what we get. Depending on what we get, maybe the website has heavy JavaScript, and that might require us to wait a couple seconds before applying the scrape. So that's where the other options come in. Sick.
Alright. There it is.
Speaker 0: Price. Sick. Okay. ShopVox Express. Features.
Yeah. So we could just... it's probably wise to pull this up side by side just to verify what's coming in. 109, plus 29 per user a month. The billing cycle is monthly. We have features, pricing tools.
Yeah. This is pretty much one to one. I like that it truncated some of this data, because it's not really necessary.
Speaker 1: We could specify a bit more, because with the pricing, depending on the column, maybe we don't want it exactly the same way the pricing is shown there. But
Speaker 0: Yeah. And is that done just via the prompt or, like, in the actual JSON structure? Or
Speaker 1: Yeah. We can decide either way, because we can have, like, a strategy where the price could be a number. Right? And then it would be forced to place only a single price, and then have, like, optional prices. Right. Yeah. I get it.
Yeah. So it's really up to whatever we're building with that.
Speaker 0: Yeah. So it looks like, the way I've got the front end for this particular thing structured, right, we're relying on... let's take a look at what the pricing plans look like. Right?
So we've got this really specific format here: we need a number there that's large, then we've got some additional stuff, and then we have the month. So
Speaker 1: let's have a price, which is a number, a float. Right? And then additional pricing information. How about that?
Speaker 0: Yeah. So we've got price. That's a number. Alright. Let's just go back to Directus.
Cool. Price. Alright. So would we just specify this up in the prompt, like, billing cycle should be something like slash month? Is this the best way to inform it?
Speaker 1: Yeah. That's one way that should work pretty fine.
Speaker 0: Okay. In this case, we'll say billing period just captures additional per-user or extra cost. Alright. Let's see what that does. Let's see.
That gives us what we're looking for. And if not, we'll just work on actually integrating this into Directus instead of actually pulling. K. 29. Okay.
Price, slash per month. 29 per user per month. Yeah. So this is already looking much better. Sick.
Yeah.
Speaker 1: Okay. I think that's better for the front end.
Speaker 0: Alright. So if I hit get code here, this is gonna give me what I need to integrate this inside Directus.
Speaker 1: Exactly. You can get the JavaScript code on the right.
Speaker 0: Yeah. Okay. And I could also get curl. There's a couple different ways to do this.
Like, we could easily write an extension inside Directus that could do this for us. Since this seems pretty lightweight, we could just use Directus Flows for this as well. And I think we've even got a FireCrawl extension in the marketplace for this, but let's just figure out how to do this through Flows. That might be the easiest, quickest way here, so we don't have to worry about distributing an extension anywhere. Alright.
So, let's go. We will set up a new flow. I think we'll probably just manually trigger this, and then we could set it up on a schedule later. Alright. So we're gonna say fetch pricing data.
Sick. Alright. We're gonna manually trigger this as a hook. We are looking for... this is always the thing: whenever you do a demo, you always find flaws in your own software. We'd need a search for this, for sure.
Alright. So we're gonna manually trigger this on the software collection. Next, we would just use what we call the webhook / request URL operation. We'll say call FireCrawl.
And I'm gonna definitely have to roll the token after this. But
Speaker 1: Yeah. I was just about to mention.
Speaker 0: Luckily, this is also localhost as well. Alright. So we're just gonna add a header: authorization. Bearer... no. I could also throw this in the env, but we're not gonna bother with that for now.
Authorization. There's those typos that I spoke about. What else do we need? Content type. Application/
JSON. Sick. Okay. And then here's the data. So that's what we're sending in the body, except the only thing that we're gonna have to change here, right, is the URL that we're triggering this from.
So, exactly. In this case, Directus has this functionality where we can do something like this, where we add a mustache syntax, and we can say trigger dot body dot... I don't know what we're gonna call this. Pricing URL. I think it may be payload. I don't know. We'll test this in a moment.
So, basically, what this will do is populate that from what we call the trigger. And because we're not tracking the URL, maybe we just do that pricing URL right here. Pricing page URL. String. Cool. Hit save.
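Put together, the webhook operation's options end up looking roughly like this. This is a sketch: the header list shape follows Directus's request operation, the endpoint path follows the earlier playground call, and `pricing_url` stands in for whatever the field was actually named.

```json
{
  "method": "POST",
  "url": "https://api.firecrawl.dev/v2/scrape",
  "headers": [
    { "header": "Authorization", "value": "Bearer YOUR_FIRECRAWL_KEY" },
    { "header": "Content-Type", "value": "application/json" }
  ],
  "body": {
    "url": "{{$trigger.body.pricing_url}}",
    "formats": [{ "type": "json", "schema": "…" }]
  }
}
```

The mustache placeholder is what gets resolved per run; the schema is elided here since it is the same one built in the playground.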
Alright. And before we add anything else, alright, all I'm gonna do is go in. We're gonna test this out. Fetch pricing data. Excited for this.
Why didn't it show? It's probably fetching data, but it is not did I forget to save it? Requires selection. Oh, I guess I did forget to save it. Pricing URL.
That happens. Alright. Sick. There we go. Got an input.
Save. Okay. Now alright. This failed anyway. Right?
Bad request. So I'll have to look at that. Let's see. But now, if we go into ShopVox, here we go. Fetch pricing data.
We've got the URL somewhere. Run flow. Cool. That triggered successfully. I can just go back to the flow.
Normally, I would have these, like, set up differently. But alright. So we've got the body. No. It's not under payload.
That's why it failed. Okay. So here's the pricing URL. It's triggered. So, basically, Directus Flows are an easy way to create these automations.
Whenever you run a flow, each step, or operation as we call them, appends its data to an object that you can pull from. So in this case, the trigger has a special key with a dollar sign in front of it, and it's gonna be trigger.body, or, well, it should be payload.body, I thought. Pricing URL. It should be that data. Here's what we passed in, but, yeah, we can see it's undefined right here.
So that's probably where the issue lies. Trigger dot body.
Speaker 1: A column? An additional column at the end, maybe?
Speaker 0: Trigger dot body.
Speaker 1: Oh, no.
Speaker 0: No. No. This should be trigger.payload. Here's the options passed. Trigger.payload.body.pricing URL.
That should be it. Let's just add what we can do is add an intermediate step in here. I'm not sure why this is not populating correctly. Triggered. Let's try I just wanna see what payload comes back with.
Do you have to pass HTTPS for the API, or will it
Speaker 1: pick it up without HTTPS? It should be able to pick it up without HTTPS.
Speaker 0: Okay.
Speaker 1: It should just add it there.
Speaker 0: Alright. Let's see. Still showing undefined. Trigger. Why are we not getting this information that we need?
Alright. Let's put in an intermediate step here. Just gonna basically run some JavaScript. We have a way to do that. We're just gonna use run script.
I'm just gonna call this format. And here, what it does: it receives all the data up to this operation that's already run. And I just wanna return data dot trigger. I'm just gonna wire this up and make sure we're getting that data properly. Flows.
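For reference, a Directus run-script operation body is a function that receives the accumulated data chain and returns whatever should be appended under the operation's key. A sketch of the step being built here follows; in the flow editor this is written as `module.exports = async function (data) { ... }`, shown as a named function so it is easy to exercise, and `pricing_url` is this demo's field name, not anything built in.

```javascript
// Directus Flows "Run Script" operation: receives everything earlier
// steps appended to the data chain. The manual trigger's input lives
// under "$trigger", so we pull the URL out and return it for the next
// operation to reference.
async function formatOperation(data) {
  return {
    pricing_url: data.$trigger.body.pricing_url,
  };
}
```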
ShopVox. HTTPS. Run flow. Fetch our pricing data. Here we go.
Unexpected token. What are we even doing here? The demo gremlins. Unexpected dot.
Why is it not liking that? Oh, yeah. Duh. Return data. Let's do it this way.
Alright. Now open a new window. Get a little smarter here. Close. Fetch pricing data.
URL. K. Refresh. What do we get? Okay.
Cool. There we go. We can see the data. I don't know, again, why it wasn't liking that, but we'll just wire this up now. Let's see if this actually works.
We're gonna call format. And, basically, in this case, let's just return data dot payload dot body. Alright. We can see we're getting... oh, okay. I don't see a payload.
That's probably where we were going wrong. It should just be body. That's my fault. Alright. So let's test one more time.
Test. Here we go. Okay. There we go. Data.
Alright.
Speaker 1: There it is.
Speaker 0: This is gonna work this time. 100%. Here we go. Let's go. Now we're gonna fetch that from format.payload.
Actually, it's not, is it? Body dot pricing URL. Nope. It's just body. There we go.
Format dot pricing URL should be good. Or is it data dot pricing URL? Gotta help us. Format dot data. Alright. It is going to work this time.
We are going to... alright. So now we can see, right, we could actually set this up to be asynchronous if we wanted, but it is waiting for that to come back. And now, bada bing bada boom.
What do we have? This is the this is the data structure that we want. Right? Yeah. Exactly.
Sick. Okay. Alright. So now we just need to basically wire that up to Directus. And in this case, it's going to be... let me make this full screen again.
Speaker 1: Additionally, just to make this workflow more reliable, and only continue applying changes if something actually changed on the pricing page, we can use change tracking. So this would be just an additional format that would instruct our workflow: yeah, something changed, right, instead of always fetching. Yeah. But there's plenty more we can add just to make sure everything's on point.
Speaker 0: Yeah. Okay. Well, let's make sure we can update the software first, and then we'll tackle that, man. Alright. So we've got software.
That's our collection. I'm just gonna give this full access for now. And what we need to do, this should work here, we're gonna do something like trigger dot keys dot zero. So, trigger dot keys.
This is just an array of the primary keys for whatever we're triggering on. I'll hit enter there. And then the payload that we're going to send... I'm gonna save this real quick. We're gonna look at what we received back. So it's just, navigate the tree.
Speaker 1: Dot JSON, basically.
Speaker 0: Yeah. So it should be payload
Speaker 1: Actually, dot data dot data. Yeah.
Speaker 0: Dot data dot
Speaker 1: Is the JSON under data there? Oh, oh,
Speaker 0: I see JSON. Yeah. It's nested, because the operation itself returns it in a data key, and this is the fetch call that gets returned. So it is data dot data dot JSON. JSON.
Is what we want. Yep. Alright. So we should be able to do this where we just check the key here. I called this call FireCrawl.
I can always adjust these keys, and, again, that's what it's gonna pin to. So we should be able to do something simple like this, where we say firecrawl.data.data.json. And this is the pricing that we're going to update. I think it's pricing. We'll open this up.
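That double data in the path trips people up, so here is the nesting as a sketch: the Directus request operation wraps the HTTP response in one `data`, and FireCrawl's own response envelope adds another, with the extracted object under `json`. The sample values below are invented for illustration.

```javascript
// What the "call FireCrawl" operation's output roughly looks like in
// the flow's data chain (sample values invented):
const firecrawl = {
  data: {          // wrapper added by the Directus request operation
    data: {        // FireCrawl's own response envelope
      json: {      // the structured extraction we asked for
        plans: [{ price: 29, billing_period: "per user / month" }],
      },
    },
  },
};

// So a payload of {"pricing_data": "{{firecrawl.data.data.json}}"}
// resolves to the extracted pricing object:
const pricing_data = firecrawl.data.data.json;
```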
What did I call it here? Pricing data. Let me look at the actual data model real quick. Pricing data. That is indeed what it needs to be.
Pricing underscore data. Alright. Now go back to the side by side, get that lovely shot of FireCrawl. And now this is actually gonna work. Right?
Yep. If the demo gods are on our side.
Speaker 1: Always in the demo.
Speaker 0: Here we go. Let's save this, just to make sure. Alright. Run flow. Now we should see this pricing change if everything works okay.
Updated two fields. I'm assuming something failed. So it wasn't FireCrawl. It was just me. BSS software.
Keys undefined. Alright. Body dot keys. Okay. That's what it is.
Operator error again.
Speaker 1: Body dot keys.
Speaker 0: Alright. One more time. Fetch pricing data. Oh, no. We're gonna use all my FireCrawl credits before we even get this working, actually.
Yeah. Sick. Okay. Now we can see it actually worked this time. So, amazing.
Right? Now we've got this wired up where it will update the pricing data for us on this schedule. Or, well, actually, not a schedule. It's totally manual. Right? That's still not what we want.
Talk to me about the change tracking bits.
Speaker 1: Yeah. Change tracking is a format. Just like we specified JSON as a format, we can add change tracking. And inside of that, like we fetched the JSON, there'll be a change tracking object that will tell us: hey, something changed on the website, or, no, nothing changed.
So you don't really need to update anything. And that's just an add-on; we don't really need to implement it here. But it's nice that you mentioned using up your FireCrawl credits, because that's the reason why we went for scraping, which, by the way, we can use batch scraping to do everything at once, in parallel. Every scrape of a single page costs one credit.
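The change tracking idea Leo describes could be wired in as an extra format on the same scrape request, so the flow only pushes an update when something actually changed. A hedged sketch follows; the format name and response fields are assumptions based on the discussion, so verify them against FireCrawl's documentation.

```javascript
// Request both the JSON extraction and change tracking in one scrape.
// Format and field names here are assumptions, not verified API shapes.
const body = {
  url: "https://example.com/pricing",
  formats: [
    { type: "json", prompt: "Extract the pricing plans" },
    { type: "changeTracking" },
  ],
};

// Gate the Directus update on the tracked status. Treat a first-time or
// unknown status as changed, so fresh data is never skipped.
function shouldUpdate(result) {
  const status = result && result.changeTracking && result.changeTracking.changeStatus;
  return status !== "same";
}
```

Dropping `shouldUpdate` into a condition operation before the update step would make the flow a no-op on unchanged pages.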
Gotcha. And the more you abstract, like, for example, if you use crawl, which is a combination of map with scrape, the credits will be higher, just because you're fetching the different pages.
If you then abstract away to using agents, since the agent executes everything for us, you don't really need to think about searching, scraping, mapping, or anything. It will be a bit more expensive, just because it's handling a bunch of things that you don't need to decide. So, gotcha. Yeah.
Speaker 0: So it's like a sliding scale of what your tolerance is. Like, hey, I know the exact workflow that I need, so you could be very surgical. Or the agent would just be: here's the URL.
Here's some guidance. Here's the data that I want. Just work until you find that.
Speaker 1: Yeah. And we can get creative with that. So, while you were implementing, I thought we could use agents to fetch the pricing page, or whatever pages, for a given software, then place that back in Directus, in the database, and then always have FireCrawl scrape that given URL. So we would just be using the agent to populate the URLs that will be scraped, not needing to use the agent every single time. So we can get really creative with this, to ensure that we're only scraping at the times we really need to.
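Leo's pattern, using the more expensive agent once per software to discover its pricing URL and then scraping that stored URL cheaply on every later run, could be sketched like this. The `v2/agent` path comes from the session itself; the request field names and the `pricing_url` output are assumptions for illustration.

```javascript
// One-time discovery request: no URL needed, just a prompt plus the
// shape of the answer we want back. Field names are assumptions.
function buildAgentRequest(softwareName) {
  return {
    prompt: `Find the official pricing page URL for ${softwareName} (sign industry software)`,
    schema: {
      type: "object",
      properties: {
        pricing_url: { type: "string" },
      },
    },
  };
}

const req = buildAgentRequest("CoreBridge");
// POST req to the agent endpoint ("v2/agent" per the session), save the
// returned pricing_url on the software item in Directus, and let the
// scheduled flow scrape that stored URL afterwards: one credit per run
// instead of a full agent invocation each time.
```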
Speaker 0: Yeah. Okay. Sick. Let's see how we do that. Right?
I'm in the FireCrawl dashboard here. Is this something that's better done here, or, like, in
Speaker 1: code? It's a nice place for us to test it.
Speaker 0: Okay.
Speaker 1: Yeah. Alright. In there, you can actually just type, and it could be very high level. So, for example, you don't even need to place the URL, to be honest. You can just place
in the name of the software and say: fetch the pricing for this software.
Speaker 0: Let's see what we got. Let's do a different one. Let's say CoreBridge here. Fetch the pricing for CoreBridge software.
Speaker 1: Yeah.
Speaker 0: We'll just say sign industry in case they've got there's multiple. Right? So what's the the difference between Spark Mini and Spark Pro?
Speaker 1: Spark Mini is, so, it's lightweight. It's like the comparison between using a Haiku model and a Claude Opus model. So one will be faster and the other will be just more efficient.
Speaker 0: Alright. Yeah. So here we are, Leo. We've got this thing, it's fetching the pricing, and now it's asking for the details here. Is that, like, a failure on my part, that it's asking for this?
Speaker 1: No, it's a default, just to fetch for more information. Usually, we don't give enough context to LLMs. So it's just asking for that context, but we can skip that.
There's a button to just skip and immediately so down there Okay. Yeah.
Speaker 0: I see it.
Speaker 1: We could just skip through it. It'll generate a specific schema that it should use. So it's already defining that schema; we always try to output structured data. Yeah.
Sick. And then, yes, exactly, then you can just run the agent, and it should fetch everything that we need. Since it runs asynchronously, uh-huh,
you can ask it to alert us back whenever it's done.
Speaker 0: So let's see what it's come up with. Right? It's got CoreBridge Starter, SMB, CoreBridge Plus. It's come back with, like, eight plans.
Speaker 1: Oh, yeah. Let's see if it's even pulling previous pricing for them.
Speaker 0: No. It looks like there really are that many plans. Annual. Okay. So it's divided into monthly and annual.
Okay. Six. So, you know, we have 199, 399, or 129. I'm sorry, 549 and 899.
Yeah. So the data checks out, man. I think, you know, that's a concern with everybody these days, especially with agents. What's the secret sauce?
Do you care to share how you guys get accurate results on this stuff?
Speaker 1: Oh, it's a mixture of system prompting, using the best LLMs, understanding the flow of scraping. So, just having a fallback for when you need to interact with data; understanding, yeah, caching is a part of it as well. So, yeah, I feel like it's a bunch of little things that, when summed up, get a pretty nice result like this.
Speaker 0: Yeah. Nice, man. Nice. Well, I know we're probably coming up on time. So I won't go through the trouble of integrating the agent, but, you know, it would be super simple here.
Speaker 1: Yeah.
Speaker 0: Again, it's just a v2/agent endpoint, and then we'll... It would be
Speaker 1: the same process as before.
Speaker 0: Yeah. Yeah. Actually, it may not be that bad. Alright. So we go back to our flow.
You know, I would probably change this as well, right, where instead of a manual trigger, we could go to a cron, and I could trigger this once a week or once a month or something like that. We'll just keep it manual for now. But let's try this out and see. So we're gonna change the
Speaker 1: Yeah. Change it over to agent.
Speaker 0: And then it's just gonna
Speaker 1: be... You'd need to add a prompt and a schema.
Speaker 0: Do we have the... we've got the prompt already. Right? And then the rest of this, we just
Speaker 1: remove? Yep. I think
Speaker 0: we can actually
Speaker 1: remove everything, since the prompt was specific to the URL it was given.
Speaker 0: Yeah. We just need the schema. Right? Alright. So we have the prompt.
Yep. Fetch the pricing for company name. Let's do the... oh, I don't know how this is gonna work out. We'll see.
Speaker 1: And we can actually explore even more. I mean, we don't have enough time to do so, but every single schema from your website, not only pricing, could be handled by the agent. So we could have many different arrays and objects to explore all the sections of that.
Speaker 0: Got you. Yeah. So it doesn't have to be just pricing. We could go in and basically do the entirety of this and have the agent just continue.
Yeah. Like attributes, or any of that. Everything. Yep. Oh, man.
That would be amazing. Exactly. Yeah. Sick. Okay.
Well, hey. This is probably a good stopping point then. You know, on FireCrawl, man, this is obviously just one simple workflow that we've touched on. What's next for you guys?
Or is there anything you wanted to highlight out of this?
Speaker 1: Let's see. I believe, I think this simple demo shows that by implementing something as simple as this, you get a lot. Right?
So from just one endpoint or one tool from FireCrawl, you can explore the entire web. And there are so many things we can build with this. Because before AI, before this whole wave of AI, scraping was something really difficult. Scraping was something where you'd have to handle everything. You'd have to try to treat and filter out specific words, specific sections.
And now, with FireCrawl, you have an LLM handling everything for you. It just allows us to build a lot more. And not only to build with FireCrawl, because we showed an example where we're using the FireCrawl API inside of our app. Right? Yeah.
But even coding, and I'm seeing Directus implementing more and more AI tools. We are, yeah. Eventually, you could integrate FireCrawl into those agents as well and have FireCrawl provide information, much more context, because whoever uses AI enough understands that context is king. Right?
And if you pair it with the best web scraper out there, then you'll pretty much have the best results.
Speaker 0: Yeah, man.
Speaker 1: But yeah. I mean...
Speaker 0: And you guys have got the CLI now as well, I see. Right? That could be the next step for me: like, oh, let's mess with this.
Right? Because my day-to-day right now is a lot of Claude Code, certainly.
Speaker 1: Yeah. And if the agent uses a CLI, it can do a lot more. Right?
Speaker 0: Yeah. Yeah. Sick. Okay. Yeah.
So, yeah. Another follow-up session then, for sure, where we'll take the Directus MCP, add the FireCrawl CLI, and then just turn Claude Code loose. Because everything that we did here today, you could do via the Directus MCP. So I could have Claude build a flow that just calls the FireCrawl agent, and we could simplify all this work. But, yeah.
Speaker 1: It would be a matter of five minutes for it to build the entire page.
Speaker 0: Yeah, man. Sick. Cool. Well, Leo, man.
Thanks for the session. I really appreciate you joining.
Speaker 1: Thanks for having me. Just one more thing, an additional tip: follow our YouTube channel.
Speaker 0: Yeah. Your
Speaker 1: YouTube channel, we're posting everything there. We launch a lot. So to keep up with the scraping industry, feel free to subscribe there.
Speaker 0: FireCrawl YouTube. At firecrawl_dev. There you go. Yeah. Yeah.
Sick. I see your face. There you are, man. Awesome, dude. Well, again, thanks for joining.
This has been a fun session, and we can definitely follow up again and run this back, man.
Speaker 1: Absolutely. Alright. Excited to do it again.
Speaker 0: Alright. See you. Thanks, everybody.