Join us for The Changelog, a new way of taking you through the month's Directus updates, including product updates, new content, and community contribution highlights. This month's show includes the Mux Uploader extension contribution, Hannes talking through solving hard problems with the Directus 11 access policies, and Kevin with a tutorial on using the Directus UI library in extensions.
Speaker 0: Welcome to The Changelog, everyone. This is the first of a regular, recurring monthly event that we're hoping to do, showing you what is new with Directus. So hi, I'm Beth. I am in the developer experience team here at Directus.
We would really like to hear your feedback on this event, what we've got in it, and what you'd like to see from us in the future. So with that being said, the first set of updates is with Kevin.
Speaker 1: We are going to kick off the changelog with what is probably the most significant announcement for today, and that is the full release of Directus 11, following on from the release candidate which we announced in June at our last Leap Week. The headline feature of Directus 11 is our brand new permission system, which is now based on policies. This is a big shift in how Directus handles access control, giving you more power and flexibility by creating reusable sets of permissions. Now, of course, I could just sit here and talk to you about it, but I've prepared a little demo because I think that's gonna be a little bit better.
So let's go over to my screen and I'll show you around. In this demo of Directus 11, we are managing a restaurant, and there are four collections: staff, shifts, bookings, and sales. And here in the settings, we have this new Access Policies section, which allows you to manage the new access policies feature. Now, I've created a number of policies, each with their own set of permissions.
If we look at Bookings Manage, we see that this policy allows full permissions in the bookings collection. And I've also created others to allow users just to add new sales, or perhaps to manage sales, providing full CRUD access. If we head over to this Shifts View policy, we see that there are a number of permissions across multiple collections. Here, there are full read permissions over the shifts collection and partial read access over the staff collection. Specifically, we're only allowing users with this policy to see the names of staff and no other information.
Now, policies can be attached to either users or roles, or both. Let's add the ability to view shifts to the Shift Staff role. This is a role I have already created, and now this policy has been added to the role and to all users which have it. Each role or user can have any number of policies attached, and their permissions are an aggregate of all policies. Now, this is just a small demo, but I'm sure you can see the power behind the new policies feature for modular, reusable sets of permissions.
So that isn't all for Directus 11. Access policies is the primary new feature, of course, but there is more. One of our highest-upvoted feature requests of all time has been the ability to assign multiple roles to a user, and you can now achieve this, as roles can contain other roles, and all of the characteristics of all of the child roles will be applied to the user. We also now have a new dynamic variable, $CURRENT_ROLES, which is an array of all of the roles held by a user, both the top-level role and all of the nested child roles.
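As a quick illustration of how that variable can be used, here is a hedged sketch of a permission filter rule; the assigned_role field below is hypothetical, not something from the demo project:

```js
// A hedged sketch of a permission filter using the new dynamic variable.
// "assigned_role" is a hypothetical field on the collection; $CURRENT_ROLES
// resolves to the array of all roles (top-level and nested) held by the current user.
const rule = {
	assigned_role: {
		_in: '$CURRENT_ROLES',
	},
};
```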
And finally, a couple of breaking changes to be aware of. Of course, there has been a change to the user, roles, permissions, and now policies APIs to account for this new data model, so take a look at our API reference for that. But this is also quite important: if you request non-existent fields, you will now be thrown an error. So if you're using the fields query parameter, or a query property with the Directus SDK, and you ask for, say, an age field that does not exist for this user, you will receive a 403 error.
Another similar change: if you are using a many-to-any field via the API, you now have to specify the target collection when selecting its fields. You can find details about this on the breaking changes page in the documentation. So that's all about Directus 11. Thank you so much, Beth, for giving me the time, and I'll hand back off to you.
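For reference, a hedged sketch of what that many-to-any change looks like in a request; the collection and field names below are placeholders, and the important part is the item:&lt;collection&gt; scope:

```js
// Hedged sketch: selecting fields through a many-to-any (M2A) relation in Directus 11.
// "posts", "blocks", "headings", and "paragraphs" are placeholder names; the key change
// is that the "item:<collection>" scope is now required when picking fields through M2A.
const response = await fetch(
	'https://example.directus.app/items/posts' +
		'?fields=title,blocks.item:headings.text,blocks.item:paragraphs.body',
	{ headers: { Authorization: 'Bearer <your-token>' } },
);
const { data } = await response.json();
```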
Speaker 0: Excellent. So next, we're gonna be discussing the Directus Labs experimental extensions. And for those of you who might not know what Directus Labs is, it's where we publish the extensions I'm about to show you; it's the Directus Labs org on GitHub. So the first set of extensions we have coined the media demos, and they allow rich rendering of file types.
So we have the video interface, which displays a video either from a local file in Directus, which is what you're seeing now, or from Vimeo, or from YouTube. So if I get a YouTube ID to add in here, it shows you the video immediately within Directus. And if I show you the raw value, you can see that the service and ID are both saved in a JSON object here as well. Very similarly, we have an audio player, which allows you to select an audio source and display an audio player from either a URL or a local file from Directus. Again, like the video interface, you can see the field holds the service and the source.
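For a rough idea of what that looks like, the stored raw value is something along these lines; the exact keys may differ, this is just to illustrate the shape:

```js
// A hedged sketch of the raw value behind the video interface: the chosen
// service plus the video ID or file reference, stored in a single JSON field.
const rawValue = {
	service: 'youtube', // or 'vimeo', or a local Directus file
	id: 'aBcD1234xyz',  // the external video ID (placeholder value)
};
```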
Lastly in the media trio, we've got the PDF viewer, which allows me to see, beautiful, a PDF directly within the editor. It's very simple, but if you work with PDFs regularly, it's really lovely to see. You can also fit the page to your screen really comfortably. So if you do work a lot with PDFs, this is a nice one for you.
Next up, we've got the whiteboard demo. The whiteboard interface supports free drawing, shapes, text, and uploading images from your files. And I'll show you here. Beautiful.
We love it. We've got a lot of love for the whiteboard. The extension implements Fabric.js and stores the data in a structured format, as you can see. You can also scroll around the whiteboard,
so there's a lot of space for your drawing and text needs. Next up, we have the multilevel demo. The multilevel API autocomplete interface is probably one of the most powerful, flexible interfaces that we've released. It allows for multi-step completion, where each step can reference the values chosen before it. And we've got two demos for you.
In the first, we've got countries. This is where we can select the region, and here we've got Europe and Americas. This is populated from a hard-coded list containing just these two items. I'm gonna select Europe, and then you can hopefully see, yep.
Perfect. Once that's completed, the selected value is passed into an external web request to an API that can filter based on the selected region. So with Europe selected, we can now see that only European entries are being shown. I'll pick this one. Pick whatever.
There we go. Although this demo only has two steps, you can continue to add more. As you can see within the raw data, each step is stored along with a payload object in a format we defined when setting up the interface. So this is what you're seeing.
For the second part of the multilevel demo, we're gonna filter based on data in the Directus project. So we've got an authors collection, which shows that we have the authors Ben and Rijk, and we have a posts collection with three blog posts that are attached to either Ben or Rijk. If we go back into posts, we can see that the first autocomplete this time uses the auto-generated Directus API for this project to return the authors. So we've got Ben and Rijk.
And then from there, that result feeds into a second API call to filter and show us the posts that are by the selected author. So we've got the posts being returned, the one introducing Directus and "What we considered when building a marketplace". Cool. The penultimate one is the many-to-any interface demo. Very cool.
So the cool thing about these extensions is that they allow us to experiment, and this one is very much in that spirit. You can try new UI and UX ideas without needing to roll the changes out to everyone. This is a fairly small experiment: we introduced a UI change to the many-to-any builder by adding these big buttons for creating new items. So you can see here, we've got Authors, Posts, and Directus Files under Create New.
It's a small idea, and one we rolled out, but we're not necessarily seeing a future in it. We'd welcome your thoughts, though: if you do think this is gonna be useful for something else, or you're inspired by it and think you're gonna use it, we'd love to hear about that. Moving on to the last of the demos, going into Flows, we have the RSS to JSON operation. It's a small extension to parse RSS feeds into JSON that can be added to the data chain within Directus Flows. What it does is take a URL, in this case a podcast RSS feed, and return JSON.
Okay. So what you're viewing here is the output of the operation, and from it you can infer further steps for your automation. For example, you could extract the MP3 URL and use our AI transcription extension. So that is the RSS to JSON operation, which we think is pretty cool.
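To make that follow-up step concrete, here is a hedged sketch of a "Run Script" operation later in the same flow; it assumes the RSS to JSON operation runs immediately before it, and that the parsed feed exposes episodes under items with an enclosure.url, so adjust to the JSON shape you actually get back:

```js
// Hedged sketch of a follow-up "Run Script" operation in the same flow.
// It reads the previous operation's output from the data chain ($last) and
// pulls out the first episode's MP3 URL for a later step, e.g. AI transcription.
module.exports = async function (data) {
	const feed = data.$last;
	const firstEpisode = (feed.items && feed.items[0]) || {};

	// Field names here are assumptions about the parsed feed's shape.
	return { mp3Url: firstEpisode.enclosure ? firstEpisode.enclosure.url : null };
};
```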
So, we've also got some cloud updates for you. Very excitingly, the key update is a new Starter tier at $15 a month when paid annually. We've reduced the entry point from $99 a month down to our break-even point for our managed hosted product. This includes provisioning and hosting a database, asset storage, and caching, all things you'd need to configure if you were self-hosting. So we're hoping you're as excited about that as we are. We've also got a new self-service tier that includes single sign-on, and the introduction of seats for billing.
Seats are a new billing concept. A seat is any user who has access to the Data Studio web application, that's admins and any users in roles that have app access enabled. The Starter tier comes with one seat, and you can add more. There's also a blog post if you want to read more about it. And if you've got any questions, we're here to help you as well.
Got some content updates for you as well. We'll start with docs. Over the last few months, we've put a lot of effort into inviting our friends from other language ecosystems to Directus. We already had a lot of JavaScript framework guides, so we've recently expanded out and published getting started guides for the PHP framework Laravel and the Python framework Flask, to accompany our existing posts on Flutter with Dart and building an Android app with Directus and Kotlin.
So those getting started guides are all ready for you within the docs. We've also just today released the third of three tutorials on integrating search indexing services with Directus, so you can keep your Elasticsearch, Meilisearch, and Algolia indexes up to date with your Directus project by building custom extensions. And we also have some very cool project tutorials from our community; I think we've got three to showcase.
This post from Andreas features reverse engineering a baby monitor and hooking it up to the alerting service Opsgenie via Directus. It's very, very funny. It's great to see, and it's very helpful. It sounds very funny, I should say;
it's not just funny, it's actually really useful. We also have Jay showing you how to integrate Directus with a Chrome extension, which includes authentication, grabbing page data, and storing and retrieving it from a Directus backend. And lastly, we've got Triste, who wrote a tutorial on building an e-commerce store with Next.js as its frontend and Directus as its backend,
with Stripe for payment processing. Lastly within the content updates, we've got Directus TV. If you're not familiar, that is our developer-focused streaming platform that has dozens of shows focused on education, entertainment, and stories from across the Directus ecosystem. Bryant has just released a brand new season of Short Hops, with eight new quick tips to get the most out of Directus. Season two includes episodes on asset transformation presets, focal points, dynamic variables, and complex API queries.
And that wraps up the announcements. For the rest of the changelog, we've got three short segments we think you'll find interesting. First up, we've got Hannes, one of our engineers, talking through some of the hard technical problems we solved for the new Directus 11 access control system.
Speaker 2: I just wanna talk about what hard problems we had to solve as part of the policies rewrite, or permissions rewrite, of the whole Directus system. First of all, I wanna start off by saying that I joined the company mid-April, while the rewrite was already well on the road and happening, since Rijk took the lead on it and started implementing it. I looked back in our Notion docs and saw that we started planning it officially, I think, November last year or something. And then the whole process of implementing it, from the start of development all the way to the release, probably took a couple of months, like four or five months, since it has been quite a big rewrite of the whole underlying permission system which we're using in the core of Directus.
And touching that, rewriting that, and rethinking that presents a challenge, since everything relies on it, basically. So whenever we touch all of this, we really need to be sure that we don't break user setups, that we don't introduce any security vulnerabilities, and all of that has to work across seven different database vendors. Under the hood, we are using CASE WHEN statements, so the SQL CASE WHEN statement. Since we already talked about it a lot: we allow different policies for one collection now, and different sets of permissions for each. So what happens if you have a policy that allows access to one field in some cases, and a different policy that allows access to some other fields in other cases?
There's a conflict, essentially. Your item structure might vary: for some items you might be allowed to see one set of fields, and for another item you're allowed to see a different set of fields. So to manage that, and to have the database do the hard work of deciding which fields should be visible, we had to resort to SQL's own mechanisms. We're using CASE WHEN, which is essentially an if statement within your query that allows you to apply a set of permissions,
so basically similar to what you would do in a WHERE clause. Now, this is a new problem with policies. Kevin just asked whether that wasn't always the case, that there might be a difference in the fields we're allowed to see. Previously, with roles, there was one permission per collection per action, so in the end that alone decided whether an item was visible.
And if that policy or that permission didn't match, you didn't see the item at all. Now, within your policies, you might see some items, and some fields within those items, but not all of the fields. And those sets of what you see can be different now, since there are multiple policies applying to the same collection and the same action. So we use CASE WHEN. CASE WHEN is essentially an if statement in your SQL query that allows us to write things like: if the price of this item is larger than $5, allow access to the name.
And "allow access" basically means return the name, otherwise return null. So in your new API output with policies, you might see some fields showing up as null even though they might be non-nullable, for example. But that just comes back to the fact that for some items you might be able to see the name of the item, and for some you might not.
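In rough terms, that translates into SQL along these lines; this is a simplified sketch of the idea rather than the actual query Directus generates:

```sql
-- Simplified sketch: the column is only returned for rows where the policy's
-- rule matches; for every other row, NULL comes back in its place.
SELECT
  id,
  CASE WHEN price > 5 THEN name ELSE NULL END AS name
FROM items;
```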
Using this construct within our core querying engine resulted in a lot of edge cases, a lot of tinkering, and a lot of R&D around cross-vendor support. Directus supports seven different database vendors, last time I counted, with a lot of different dialects, since MySQL isn't always MySQL. So in the end, we had to test it across vendors and figure out which constructs we can use, since there is a common group of SQL statements you can rely on, but they might behave differently in Postgres or SQLite, they might return different things, or they might expect different inputs. All of that had to be solved with cross-vendor support, which, because it's so deeply integrated within the core itself, presented a hard challenge to figure out: how do we rewrite the actual core permission engine to take into account the new requirements for allowing some fields to be present and some not?
As I said, it's very deeply integrated, so there were a lot of corner cases we had to account for, and a lot of internal APIs we relied on that weren't up to date anymore and had to be updated, and that now expect completely different inputs, completely new paradigms of looking at permissions, because previously it was fairly straightforward and now there are more edge cases to take care of, which has been very interesting to look at. In the end, it resulted in a lot of automated and manual testing to ensure that we don't introduce any regressions, that the API output stays the same, and that we provide a migration that allows users to migrate directly from v10 to v11 without any hiccups, hopefully. So we migrate all the existing permissions from roles and stuff them into policies, and figuring that out was also fun. Let's say fun.
And something we're noticing right now is that we have a very, very large API surface, so a lot of users of our product expect some undocumented behavior to work the same between v10 and v11. Now, in the aftermath of releasing to general availability, that's showing a lot of edge cases we didn't account for, which we now have to look at after the fact and solve in a timely manner. And I think that's my five minutes on why it was very hard to solve all the new policies and permissions across different vendors.
Speaker 0: Thank you so much. Moving along to our community showcase section, we have a video very kindly sent to us by Matthew Ruffino, who is talking about the Mux uploader.
Speaker 3: Hi, everybody. My name is Matthew, and I am the current developer for the Mux upload extension. We're gonna watch a quick little demo. So here we have a brand new page where you can upload directly to Mux, via your Directus backend, that is. This is using the built-in Mux uploader.
Future plans will have our own custom uploader, where we can have it as a custom interface on pre-existing collections. Currently, we'll go to our new videos collection, and we'll see here that we have a new upload ID, asset ID, and playback ID. When we go into the page, we can see a new interface type that uses the custom Mux player for our playback. This right here allows us to quickly upload video, have it transcoded, and have it available for view and playback on our backend or in our frontend applications. So the motivation for this is client-facing, like always.
I had a client that reached out to me who wanted to build a custom video vault. They had years and years of video content that they wanted to upload, and they wanted a way to have it nicely digitally organized. And so the first thing that I thought was: let's use Directus. And so I did. Directus is now powering this video vault.
But there was just a little bit more that was needed, right? The video vault is good for backend stuff; we can quickly upload to S3 and have playback there via MPEG-4. But what if we now want to share this via the frontend, right, via something like a streaming service?
This is where Mux comes into play. You can easily share video via an Amazon S3 bucket, but when you start serving that content to thousands and thousands of people, you'll start to see a lot of slowdown and variation unless you do a whole bunch of configuration. Using Mux allows you to easily transcode and immediately get a playback ID to use in your frontend application, for use by thousands of people, and to securely know that your video content is safe using DRM. The approach I took when building this was to consider some other options that were available, but I've always wanted to use Mux in a project, and so I thought that this would be the perfect use case.
The biggest challenge was really just sitting down and being able to start, right? We look at documentation all the time and we have ideas, but the hardest part is literally sitting down and just getting started. Once I got started, read through all the documentation, and spun up something on Docker, I was off to the races. And it was one of the easiest things that I've ever started to do,
to be honest, when it comes to development for different services, backends, and things like that, right? Directus just made it very easy out of the gate to start developing this extension. As I'm developing this, I have a lot of ideas planned for version 1. The extra things planned would be something like live streaming: being able to quickly create a live stream in our backend and receive the keys needed for programs like OBS or vMix, or to tie that into a frontend and allow users or other people in the frontend application that we're building to receive their own live streaming keys and then live stream.
Other planned features would be things like watermarking, closed captioning, and secure stream options. So with each video that you upload, you'll be able to include custom watermarks and closed captioning files like SRTs to have them built in, and to secure each video the way that you want, tied to user roles. I'm actually also planning to build the same thing for Bunny Stream by BunnyCDN. BunnyCDN is a great choice for storage and CDN services, and they have been expanding their technology over the years. BunnyCDN offers something very similar to Mux, where you can upload video and audio content and receive streaming links right in return.
Same thing with closed captioning, DRM, and more. So once the Mux upload extension is completed, I'm gonna set my sights on BunnyCDN to be able to use Bunny Stream built into Directus. I'm always looking for help. Always looking for help. I have so many clients, I have three little girls, and my hands are always full.
If you like this idea and you think that you could benefit from it, the source code is available on GitHub right now. I'm always looking for contributors to help me build the best things that I can, and if you wanna be a part of that, that would be amazing. Please find me as Mateo on the Directus Discord and we can talk more, or go ahead and make a PR and we can start there too.
Now, the best for last: Mux is actually planning on featuring this extension on their website once it's completed. I had preliminary talks with them this year saying, hey, you guys support all these other platforms, but why not Directus? And so I showed them what I was working on, and I shared videos and the code base with them, and they're very excited. While they can't just take the code and run with it, they are gonna do a highlight series, kind of similar to this, where they'll be able to talk about me, Directus, and how everything works together with Mux. And it's not only gonna be beneficial for Mux, it's gonna be amazing for Directus to have that kind of, you know, happiness and feedback from a completely different team.
I'm very excited, and I hope that you guys are just as excited as I am. Again, my name is Matthew, and I'm currently the sole developer here for the Mux uploader, looking for contributors. You guys have a great rest of your day. Bye.
Speaker 0: Thank you so much, Matthew, very much a legend in the Directus Discord. We now have a developer tutorial, and for this month, Kevin is going through using the Directus UI library in extensions.
Speaker 1: For the very first changelog, we've decided, as we get towards the end, to pop a little developer tutorial in here for one of the questions we're seeing come up an increasing amount. And that is from new extension authors who are wondering how to make their extensions feel consistent with the rest of the Directus Data Studio by using some of the premade components that exist in the rest of the application. So here I've spun up a new Directus project locally. Take note that the extensions auto-reload option is set to true, and we are mounting in an extensions volume, which is this directory here in the sidebar.
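If you want to follow along, the relevant bits of a docker-compose.yml look roughly like this; it's trimmed down to just the two settings mentioned here, and your database credentials and other settings will differ:

```yaml
services:
  directus:
    image: directus/directus
    ports:
      - 8055:8055
    environment:
      # Reload extensions automatically when their built output changes.
      EXTENSIONS_AUTO_RELOAD: "true"
      # ...database connection, SECRET, admin account, etc. omitted here.
    volumes:
      # Mount the local extensions directory into the container.
      - ./extensions:/directus/extensions
```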
And then we're going to go into the extensions directory and run npx create-directus-extension@latest. You can pick an extension type; we're going to pick a panel. I'll just call it test, very inspired, and leave all of the other options as default. So now we have this test directory here, which contains our entry point file, this index.js, which contains all the configuration for the panel, and the Vue component itself, and this is where we'll be doing our work.
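For orientation, the entry point looks roughly like this; it's paraphrased from the scaffold rather than copied verbatim, and it registers the panel, points at the Vue component, and defines the text option you'll see in a moment, which is passed into the component as a prop:

```js
// Roughly what the scaffolded entry point for a panel extension defines:
// an id, name, and icon, the Vue component to render, and the configurable options.
import PanelComponent from './panel.vue';

export default {
	id: 'test',
	name: 'Test',
	icon: 'box',
	description: 'A demo panel',
	component: PanelComponent,
	options: [
		{
			field: 'text',
			name: 'Text',
			type: 'string',
			meta: { interface: 'input', width: 'full' },
		},
	],
	minWidth: 12,
	minHeight: 8,
};
```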
So we're gonna cd into this directory and run npm run dev. What that's going to do is watch for changes inside of this directory's src folder, and whenever there is a change made, it will rebuild into this dist folder, and that is what actually gets loaded in. I'm just gonna restart the Docker container for this first time, and then it should auto-reload when we make changes.
And if I refresh this empty dashboard here, we should now see, there is our custom panel right there. There is the test, sorry, text option, which is this one. So I might say "hello" and hit save, and we see that that's rendered into the panel here. It's passed in as a prop, and we can use it straight within here.
So that's fantastic, we've got this panel. But now we wanna actually start to include these UI components that exist. Now, we actually ship what we call a component playground; this is at components.directus.io.
We're still adding a few last components here, so this isn't an exhaustive list, but it is many of them. And it contains everything from a Directus-feeling checkbox, which we can see here looks like what exists in the Data Studio, to an input, complete with, you know, the focus and active states, stuff like that, to pagination, and so on and so forth.
And you can use any of these by taking the code that is generated by this component playground. So we have this fancy select, for example, here. Right, ultimately just a select, kinda cool. And what's nice is we get this little controls area on the right-hand side where we can make changes.
So let's find one. This is probably not a fantastic example of the customizability. Maybe we look at the input for a quick example here. We see that there are quite a few options. Is it disabled?
And you see immediately it gets grayed out, with all of the behavior and the styling that you would expect. Are we gonna slugify it? So whenever you type, you know, you put a space, it replaces it with a dash, only lowercase, and so on. The slug separator, you know, you could make it an underscore. And there are just all of these options that we have here.
And you'll notice that as I'm checking them here or changing the values, the prefix could be, I don't know, a smiley face, or maybe "https", something like that. We see that it's changing in the UI here in this little example, but it's also changing the code down here. And the best thing about this is that all of these components are made globally available inside of the Data Studio, including for extensions. So I can literally take this v-input here, copy it, and paste it directly in.
Now, this one has a v-model, so just because it's Vue, I will just quickly make sure that it has a value it can write into. Great. And that's it, that is us using the v-input.
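As a rough sketch, the panel component could end up looking something like this; it's a simplified version rather than the exact scaffold, with v-model wired up to a local ref:

```vue
<template>
	<div class="panel" :class="{ 'has-header': showHeader }">
		<p>{{ text }}</p>
		<!-- v-input is globally registered by the Data Studio, so no import
		     is needed; v-model writes into the local "value" ref below. -->
		<v-input v-model="value" placeholder="Type something" />
	</div>
</template>

<script setup>
import { ref } from 'vue';

defineProps({
	showHeader: { type: Boolean, default: false },
	text: { type: String, default: '' },
});

// Local state for the pasted v-input to write into.
const value = ref('');
</script>
```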
The extension auto-reloaded, so if I go back here and refresh, there we are, there's our UI component. And this is a panel, but you can use any of the components listed in that component playground across any of the app extension types. So that's a little example of how you can use the work that we have already done in order to build consistent, and in my opinion quite nice-looking, extensions that feel like they belong inside the Data Studio. So, again, thank you for giving me the time to show you this, and I'll hand back over to Beth.
Speaker 0: So, wrapping this up, we want to take this moment towards the end of the changelog to thank our amazing community contributors, who do a lot and give their time to improve the Directus project, and for whom we are very, very grateful. Since the last time we thanked contributors, we've had multiple releases and multiple contributors, so there is quite a bit of a list, and I am going to thank them individually. So, thank you to the following people. Parker, for adding support for an admin token environment variable, which will be great for testing and initial project bootstrapping.
Josh, for fixing an issue causing TUS uploads not to respect the relative path of the app. Joel, for prioritising the access token in a query over cookies for WebSockets authentication. Dominic, for optimizing the type signature of the item service collection parameter. Gerald, for enabling caching of field information and foreign keys as part of schema caching.
Florian, for increasing visibility of the data model expand/collapse buttons. Junhong, for ensuring the dropdown interface works correctly when there are no options, adding auto-reset of a dropdown interface value after a conditional update of options, fixing the versioning dropdown with long version names, fixing detail groups collapsing on save-and-stay, fixing the calendar layout crashing with invalid dates, removing the update delay in the block editor interface, and lastly, fixing the list structure in the draggable list. Thanks also to Max, for fixing the filename_disk extension not getting updated when replacing a file with a different file extension. Andre, for ensuring the assets transform image max dimension value is also respected when extracting metadata during image upload.
Johan, for adding a retry mechanism for SQLite if an SQLITE_BUSY error occurs. Florian, for adding support for listening on UNIX sockets via a new UNIX socket path variable. Danilo Burger, for fixing an issue that would cause the translation display not to use the correct language if the user relied on the system language. And finally, clicker, for fixing a warning when using Docker with MySQL/MariaDB. So that was a long list, and we are so grateful for every single one of you. Thank you again. And if you're listening to this and you want to see the specific pull requests, you can find them inside the full release notes on GitHub.
That concludes... I'm just going to pause and see if there are any questions. There's not. So that concludes the first changelog. As we said at the beginning, just to reiterate, we are so welcoming of feedback on anything you might like to see, what you'd like to see more of, and maybe what you'd like to see less of, if it's constructive. We are hoping to be really flexible with the different segments we show you, so some things will stay consistent,
i.e. what's new and the updates, but we'll be bringing different things. We've got a lot of good ideas, and we definitely want to hear yours as well. But that's everything, so thank you so much for joining us and taking the time. We really appreciate it.
We hope that you found this interesting, and we'll be back again, so keep an eye on the events section and hopefully we'll see you next time. If nothing else, we'll be around here for a couple of minutes in case there are any last questions. But if not, thanks everyone, and have a great rest of your day.