On one of our recent projects, we needed to implement an application with CRUD operations and some relatively simple integration: pushing content to two different systems and monitoring whether that content had been processed. Our brief also included a list of technical requirements.
We opted for the AWS Lambda route with Amazon DocumentDB. The goal of this blog post is to summarize the developer experience we had during development and compare it with the matching Azure technologies. As .NET developers, we realized that the developer experience is significantly better with Azure Functions, and running an AWS Lambda function locally can also present a steep learning curve for people who are not familiar with Docker and containerization.
The first thing we had to realize was that DocumentDB is MongoDB-compatible, which is fine, but it does not have an emulator that can be used for development purposes. So, as developers, we had to choose between two options: either use MongoDB for local development, or use a shared instance of DocumentDB dedicated to this purpose. Using a different service during development always carries risks when the two services differ from each other (which is the case based on this article).
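If you go with the local MongoDB option, the quickest setup is a throwaway container. A minimal sketch, assuming Docker is installed; the image tag and container name are illustrative, so pick a MongoDB version close to your cluster's DocumentDB compatibility level:

```shell
# Start a disposable MongoDB instance for local development,
# exposed on the default MongoDB port.
docker run -d --name local-mongo -p 27017:27017 mongo:5.0

# The application can then use a plain local connection string:
#   mongodb://localhost:27017
```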
DocumentDB is an expensive Amazon database service, so its running cost needs to be factored into the initial estimate when going this way. Another drawback of DocumentDB is that it is only accessible from within an Amazon VPC, so you either develop on an Amazon EC2 virtual machine, or you spin up an EC2 instance and use SSH tunneling.
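The SSH tunneling variant boils down to port forwarding through a small EC2 instance that lives in the same VPC as the cluster. A hedged sketch; the key file, cluster endpoint, and bastion address below are placeholders, not values from the project:

```shell
# Forward local port 27018 to the DocumentDB cluster endpoint
# through an EC2 bastion host in the same VPC.
ssh -i bastion-key.pem \
    -N \
    -L 27018:my-docdb-cluster.cluster-xxxxxxxx.eu-west-1.docdb.amazonaws.com:27017 \
    ec2-user@my-bastion.example.com

# Because the TLS certificate is issued for the cluster hostname,
# a tunneled connection typically needs relaxed hostname checking:
#   mongodb://user:password@localhost:27018/?tls=true&tlsAllowInvalidHostnames=true
```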
The second thing is that running an AWS Lambda function on a development machine also has its limitations. If you want to debug your functions in Visual Studio, you need to use the Mock Lambda Test Tool, which is essentially a static web page that uses WebSockets to execute the functions; it does not use real HTTP requests. You also need to mimic the behaviour of the Amazon API Gateway by forging the data model that is passed to your Lambda function. This is great when something really does not work and you need the debugger, but it cannot be called from the frontend, so it is not a viable option for checking the code together with the frontend.
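To illustrate the forging involved: the tool expects the API Gateway proxy event that the gateway would normally build for you. A minimal hand-written request might look like the fragment below (the field names follow the API Gateway proxy integration format; the path and body are made-up examples):

```json
{
  "resource": "/items",
  "path": "/items",
  "httpMethod": "POST",
  "headers": { "Content-Type": "application/json" },
  "queryStringParameters": null,
  "body": "{\"name\": \"sample item\"}",
  "isBase64Encoded": false
}
```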
This is where the AWS Serverless Application Model (SAM) comes to the rescue! SAM is a tool that lets you build your Lambda functions for the local environment and run them while simulating the Amazon API Gateway, which is required to invoke Lambdas through HTTP triggers. With some configuration in the template file, and using the CLI, you can run your application in a Docker container that is created on demand. The advantage is that you can keep SAM running continuously, and when you rebuild, it picks up the latest version. The disadvantage is that it is slower than running on AWS Lambda itself (at least your frontend developers can implement the loading state of your API calls without adding any artificial delay 😊). Keep in mind that your code is running in a container, so you need to use some Docker DNS features to access the host machine or other containers.
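The workflow above comes down to two SAM CLI commands, sketched here with their defaults (the template file name and port can be changed in your project):

```shell
# Build the Lambda functions defined in template.yaml
sam build

# Simulate API Gateway + Lambda in an on-demand Docker container
# (listens on http://localhost:3000 by default)
sam local start-api

# Inside the Lambda container, "localhost" is the container itself.
# On Docker Desktop, the host machine is reachable via Docker's
# special DNS name instead, e.g.:
#   http://host.docker.internal:27017
```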
Because this was a simple set of Lambda functions and there was no parallel development, we chose to simulate the real use case by having our own environment of DocumentDB and Lambda functions deployed to AWS. As this is a really simple CRUD application, opting for a local MongoDB environment would have been a good choice as well, because the main difference between MongoDB and DocumentDB lies in the supported query-language features, and we do not have complex queries in this app. The attached sample code, however, shows the case of running everything on your machine, because I wanted to show you a more common scenario where every developer has their own environment.
Because of the drawbacks we faced with AWS on this project, I wanted to see how a similar setup works with Azure. The first piece of good news is that there is an emulator for Azure Cosmos DB created specifically for development purposes. Setting up Cosmos DB in a local environment is clear and straightforward: you just go through the next-next-finish wizard, and it works.
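Once the emulator is running, wiring it up is plain configuration rather than networking. A sketch of a `local.settings.json` for a Functions project — the setting name `CosmosDbConnection` is an assumption for illustration, and the account key placeholder stands for the emulator's well-known development key:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "CosmosDbConnection": "AccountEndpoint=https://localhost:8081/;AccountKey=<emulator key>"
  }
}
```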
The next step is to run the Functions locally, which is also straightforward using the Azure Functions Core Tools. You might also want to install the Azure development workload for Visual Studio, so you do not have to use the CLI to create the project. This workload prepares Visual Studio to run and debug your functions just as you would any .NET application. Azure Functions also has a nice feature for HTTP-triggered functions: the ability to create OpenAPI documentation for your backend. This is something we really missed during development with AWS, because we have a way to generate client-side TypeScript code from OpenAPI documentation for our frontend applications. (Sidenote: obviously, we could have created and maintained this manually, but as there were only a handful of API methods, we decided it would just add overhead.)
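For those who prefer the CLI anyway, the Core Tools workflow mirrors the SAM one, but without a container. A sketch — the project and function names are made up:

```shell
# Create a .NET Functions project and an HTTP-triggered function
func init CrudApi --worker-runtime dotnet
cd CrudApi
func new --template "HTTP trigger" --name ItemsFunction

# Run the functions locally
# (listens on http://localhost:7071 by default)
func start
```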
Because, by default, you do not need to run the functions in a container, you also do not need to deal with container networking; the setup just works out of the box. You can copy the connection string from the Cosmos DB emulator, and the functions you are running can access it without any additional research or configuration. I would also like to stress that running AWS Lambdas in containers is not necessarily a drawback, but it is worth mentioning, because it can cause some overhead depending on your project team and their experience.
To sum up, Azure’s tools just work out of the box when developing for its services, so the developer experience (strictly speaking, as a .NET developer who has used Visual Studio for a couple of years) is seamless and easy to deal with. Setting up an environment for AWS requires a bit of research and learning, but at the end of the day, it also works.
While Azure Functions can create OpenAPI documentation, please also note that AWS Lambda has a nice template that lets you wrap a complete ASP.NET Core application into a Lambda function, so you can create a serverless application using the familiar methodology. You can learn more about this here and here.
The complete working solution is available in our GitHub repository.