Bring your data to your code with Azure’s Data API Builder
Thursday, October 2, 2025, 11:00, by InfoWorld
Azure is now so big that it’s hard to keep on top of all its features, let alone drill down into its ever-growing line of developer tools. That’s not surprising; over the past two decades or so, it has become the place where Microsoft builds both its own tools and its customer-facing products.
That internal developer focus eventually brings tools to the rest of us, as internal APIs and services mature, become products in their own right, and open up to the wider world. One area where this process is obvious is Azure’s many different service APIs, which often give language- and platform-independent ways to build Azure services into your code. These cover everything from storage to artificial intelligence and provide tools for interacting with the underlying Azure platform.

So many APIs, so little time

Azure’s many different platform services can be a challenge for developers, especially when each service has its own APIs and SDKs. The days of needing to know only a handful of APIs are long gone, buried in complexity and a Cambrian explosion of new tools. What’s needed is a move to consolidate APIs across the entire service, bringing similar tools together so we can use a common grammar with different services, making code easier to understand and easier to move between Azure services.

This last requirement is perhaps the most important. Code needs to be able to scale, and the services we use may change over time. To ensure our code is portable, it needs to be cross-platform, so we can bring existing skills from Windows to cloud-native development and to mobile. That consolidation seems to have begun, driven by tools like .NET Aspire and Dapr, which aim to abstract development away from complex cloud-native concepts. Alongside those developments, the Azure platform team is bringing out a new generation of APIs as part of merging wider platform elements into higher-level services like Fabric.

Introducing DAB

Fabric has brought us a rather useful tool: the Data API Builder (DAB). It can add an API to most common Azure data sources, using REST or GraphQL, and then run it on-premises or in the cloud. And it’s free. You can use it to implement common database operations at scale, one benefit of its Fabric heritage. As it’s containerized, it can run anywhere, and that includes other hyperscale clouds as well as your own systems.

The list of supported databases is impressive. Alongside Microsoft’s own Azure data sources (including two different Cosmos DB personalities: MongoDB and PostgreSQL), there’s support for two common open source databases: MySQL and PostgreSQL. It also provides OAuth2 authentication, along with built-in OpenAPI documentation so you can automatically generate client libraries.

Getting started with DAB

Building Data APIs starts in the tool’s .NET CLI. It’s powerful, providing an interactive environment for building the necessary JSON configuration files. There are tools for initializing and configuring a database connection, adding and updating entities in existing configurations, exporting configuration files so you can add them to your project’s Git or similar repository, validating configuration files, and launching and running the service container.

As the CLI is a .NET tool, it’s installed from NuGet using the dotnet command-line tool that ships with a recent .NET release; .NET 8 is the minimum version required to download and run Microsoft.DataApiBuilder. You will also need Docker or Podman to load the rest of the Data API Builder tool, as it runs in a Docker container. You can pull it from Microsoft’s own container registry, where it sits alongside Microsoft’s other Azure database images.
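Getting those pieces in place is a short job. As a minimal sketch, assuming a recent .NET SDK (8 or later) and Docker are already installed; the registry path shown is the one currently documented for DAB, so confirm the exact path and tag in Microsoft’s container registry before pulling:

# Install the DAB CLI as a global .NET tool, then pull the container image.
dotnet tool install --global Microsoft.DataApiBuilder
dab --version
docker pull mcr.microsoft.com/azure-databases/data-api-builder:latest

The first command installs the dab CLI from NuGet; the last pulls the container image that will host the generated API.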
It’s a good idea to use DAB alongside your choice of data-management tools so you can check your queries against the API you’re building, for example, ensuring you have the right table names. Once you’ve got the DAB CLI installed and running, you can build the API configuration files.

Building portable data APIs

The first step is to use the init command to set up the necessary connection string, with the address of the server, the user and password you’re using, and, of course, the name of the database that’s being accessed. With that in place, you can add the entities for the tables you want to expose using the add command. You now have a basic DAB configuration that provides a connection to defined tables, ready to be run and tested. The start command will run the DAB service on your development PC, opening a port for the API and displaying its endpoint URI once the DAB server is running. (A minimal end-to-end sketch of these steps appears at the end of this section.)

To get data from a table, navigate to the server endpoint and append /api/ followed by the entity name to use the REST API; GraphQL queries go to a separate /graphql endpoint. Usefully, you can simply append /swagger to the URI to get the OpenAPI documentation for the API, which can then be used in your code to build the necessary interfaces to the data source.

With a working DAB solution, you can now deploy to Azure or any other service. Deploying to Azure is simplified by using the Azure Developer CLI, logging in to your account, and defining an environment. The CLI will generate the necessary Bicep code and, as DAB is a Docker container, will upload your configuration and deploy it, along with an Azure database and a basic web application, as a set of Azure Container Apps. Again, you’ll be presented with an endpoint URI that you can test using Azure’s built-in OpenAPI tool.

Using DAB from your code

With an API built and running, you can start to use it in your code. The API is RESTful, so you can build requests using familiar HTTP API grammars, with the ability to manage read, write, replace, update, and delete operations. These are the familiar CRUD operations used by most database applications, applied by choosing the appropriate HTTP method when calling the API. Results are returned as JSON, with a default page size of 100 records. As well as large-scale retrievals, you’re able to use primary keys to fetch specific items, either with a single key or, where available, compound keys. More complex queries can be constructed using parameters; standard SQL-style select and filter queries are common options that can keep traffic to a minimum, as the operations are carried out in the DAB container and only the results are returned to your client application.

There are some limitations to DAB, dependent on the database and version being accessed; Microsoft keeps an updated list showing which features are supported by which database, along with the minimum supported release. This means if you’re using a version of PostgreSQL prior to version 11, you won’t be able to use DAB with it, nor with SQL Server releases prior to 2016 or MySQL releases prior to version 8. The Azure platform services that work with DAB are automatically updated, so you will always be using a supported version.

Out of the box, Data API Builder uses HTTP connections. If you want to secure them, you need to provide your own certificates and open the HTTPS port in the container. In many cases, it’s easier to use a reverse proxy like YARP to mask public access to the API server, having the proxy manage and provide HTTPS connections without needing to reconfigure the DAB container; it also lets you update the container automatically as needed.
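As an illustration of the proxy approach, here is a minimal sketch of a YARP route in an ASP.NET Core host’s appsettings.json that forwards API traffic to a locally running DAB container. The route name, cluster name, and destination address are assumptions for this example, not DAB defaults:

{
  // Hypothetical route and cluster names; the destination assumes DAB's local development address.
  "ReverseProxy": {
    "Routes": {
      "dab-api": {
        "ClusterId": "dab",
        "Match": { "Path": "/api/{**catch-all}" }
      }
    },
    "Clusters": {
      "dab": {
        "Destinations": {
          "local": { "Address": "http://localhost:5000/" }
        }
      }
    }
  }
}

In the host itself, the proxy is wired up with builder.Services.AddReverseProxy().LoadFromConfig(builder.Configuration.GetSection("ReverseProxy")) and app.MapReverseProxy(), leaving HTTPS termination to Kestrel or your ingress while the DAB container stays on plain HTTP behind it.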
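To pull the CLI steps described under “Building portable data APIs” together, here is a minimal end-to-end sketch. The database, connection string, entity name, and permissions are illustrative assumptions; adjust them to your own environment:

# Hypothetical Library database with a dbo.Books table; replace the connection details with your own.
dab init --database-type mssql --connection-string "Server=localhost;Database=Library;User Id=dab;Password=<password>;" --host-mode "Development"
dab add Book --source dbo.Books --permissions "anonymous:*"
dab start

Once the server is up (the CLI prints the endpoint URI, typically http://localhost:5000 in development), GET /api/Book returns up to 100 rows as JSON, GET /api/Book/id/1 fetches a single record by its primary key, and query parameters such as $filter, $select, and $orderby trim the result set inside the container before it is returned, with the OpenAPI description at /swagger and GraphQL at /graphql.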
It’s good to see Microsoft releasing tools like DAB and keeping them updated, most recently with the 1.6 release. By providing one simple way to access and manage data in different databases with a common API design, it becomes possible to use the resulting abstraction to switch back-end storage without changing client code, and even to migrate between on-premises storage and your choice of cloud provider, or from traditional n-tier applications to cloud-native ones as your application grows and scales. Perhaps the best way to think about DAB is that it takes a logical, data-first approach, rather than a platform-first one, to providing access to your data. Thinking about it that way makes it easier to understand Microsoft’s design decisions and how you might use DAB in your own code.
https://www.infoworld.com/article/4066410/bring-your-data-to-your-code-with-azures-data-api-builder....