content/1.posts/1.testing-your-api-with-rest-client.md (2 additions & 2 deletions)
@@ -34,7 +34,7 @@ Let's see that with a simple GET request to [The Star Wars API](https://swapi.c
- Nothing new or complicated here, just the request you would have written intuitively. This way, you can write any kind of request you want, simply by following the RFC 2616 standard. Even if you don't know the standard, it's pretty straightforward, and you will often find samples in this format in the documentation of the API you are querying, like in the [Microsoft Graph API documentation](https://docs.microsoft.com/en-us/graph/api/user-list-memberof?view=graph-rest-1.0#example) for instance.
+ Nothing new or complicated here, just the request you would have written intuitively. This way, you can write any kind of request you want, simply by following the RFC 2616 standard. Even if you don't know the standard, it's pretty straightforward, and you will often find samples in this format in the documentation of the API you are querying, like in the [Microsoft Graph API documentation](https://learn.microsoft.com/en-us/graph/api/user-list-memberof?view=graph-rest-1.0&wt.mc_id=MVP_430820#example) for instance.
REST Client works on text files in vscode by selecting _HTTP_ as the Language Mode (by default this language mode is associated with files having the _.rest_ or _.http_ extension). It provides you with some autocompletion and a few snippets to help you write your queries. You can write multiple requests in the same file in vscode just by separating them with ###. Above each request, an actionable _Send Request_ link allows you to run the request and see the response in a response panel.
@@ -103,4 +103,4 @@ GET https://swapi.co/api/planets/?search={{planetName}}&format=wookiee HTTP/1.1
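For reference, a minimal `.http` file with two requests separated by `###` could look like the sketch below. The variable value and the second request are only examples, not the post's exact file:

```http
@planetName = Tatooine

GET https://swapi.co/api/planets/?search={{planetName}} HTTP/1.1

###

GET https://swapi.co/api/people/1/ HTTP/1.1
Accept: application/json
```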
content/1.posts/10.azure-functions-custom-configuration.md (5 additions & 5 deletions)
@@ -22,7 +22,7 @@ Configuration used by functions in a Function App is stored in settings that can
{.rounded-lg .mx-auto}
- Speaking of secrets, they should never be stored directly in the application settings of a Function App (the same goes for App Services, by the way). Why not? Because secrets would be available to anyone who has access to the Function App in the Azure Portal. The right way is to use an Azure Key Vault, which is the Azure component for securely storing and accessing secrets 🔒. Once your secrets are in the key vault, you have to grant your Function App's identity access to the Key Vault, and you can then reference the secrets you need directly in your application settings. These are called [Key Vault references](https://docs.microsoft.com/en-us/azure/app-service/app-service-key-vault-references) because an application setting does not directly contain the value of a secret but a reference to the secret stored in Key Vault. When running, your function will automatically have access to the secret and its value as an environment variable, as if it were a normal application setting.
+ Speaking of secrets, they should never be stored directly in the application settings of a Function App (the same goes for App Services, by the way). Why not? Because secrets would be available to anyone who has access to the Function App in the Azure Portal. The right way is to use an Azure Key Vault, which is the Azure component for securely storing and accessing secrets 🔒. Once your secrets are in the key vault, you have to grant your Function App's identity access to the Key Vault, and you can then reference the secrets you need directly in your application settings. These are called [Key Vault references](https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references?wt.mc_id=MVP_430820) because an application setting does not directly contain the value of a secret but a reference to the secret stored in Key Vault. When running, your function will automatically have access to the secret and its value as an environment variable, as if it were a normal application setting.
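For illustration, this is roughly what a Key Vault reference looks like in an application setting. The vault and secret names below are made up:

```json
{
  "MySecret": "@Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/MySecret/)"
}
```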
{.rounded-lg .mx-auto}
@@ -49,11 +49,11 @@ This is an example of a generated `local.settings.json` file:
However, as you can see, the settings corresponding to secrets contain the Key Vault reference values that are used by Azure to link the settings to the secrets. But this is an Azure mechanism: locally, the true secret values won't be loaded into configuration. So you will have to retrieve the values of the secrets from your key vault and set them manually in your local settings file. That may be okay for one secret, but it quickly gets annoying when you have many secrets. You don't want your team members to constantly lose time copying secret values from the key vault to their local environment. Not to mention the time lost figuring out what is wrong when a secret value has changed and you did not realize it, or the bad habits it could lead to, like sending secrets by email or chat messages.
- This is a terrible local debugging experience and honestly, you don't want that. What you want is for your function code to just work when you or one of your colleagues clones or pulls a new version of the function app code. When locally debugging the code of an ASP.NET Core application deployed in an App Service, you don't have this kind of problem because your code usually loads the secrets directly from the Key Vault thanks to the [Key Vault configuration provider](https://docs.microsoft.com/en-us/aspnet/core/security/key-vault-configuration?view=aspnetcore-3.1).
+ This is a terrible local debugging experience and honestly, you don't want that. What you want is for your function code to just work when you or one of your colleagues clones or pulls a new version of the function app code. When locally debugging the code of an ASP.NET Core application deployed in an App Service, you don't have this kind of problem because your code usually loads the secrets directly from the Key Vault thanks to the [Key Vault configuration provider](https://learn.microsoft.com/en-us/aspnet/core/security/key-vault-configuration?view=aspnetcore-3.1&wt.mc_id=MVP_430820).
## Here comes `IFunctionsConfigurationBuilder`
- If you are already familiar with dependency injection in Azure Functions, you already know the `Microsoft.Azure.Functions.Extensions` NuGet package that allows you to inherit from the `FunctionsStartup` abstract class and register the different services you want to inject into your functions (you can find more about that in the [documentation](https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection)). In the latest version of this NuGet package, a new virtual method has been added to `FunctionsStartup`: `ConfigureAppConfiguration`. It allows you to specify additional configuration sources you need in your functions.
+ If you are already familiar with dependency injection in Azure Functions, you already know the `Microsoft.Azure.Functions.Extensions` NuGet package that allows you to inherit from the `FunctionsStartup` abstract class and register the different services you want to inject into your functions (you can find more about that in the [documentation](https://learn.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection?wt.mc_id=MVP_430820)). In the latest version of this NuGet package, a new virtual method has been added to `FunctionsStartup`: `ConfigureAppConfiguration`. It allows you to specify additional configuration sources you need in your functions.
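To make this concrete, here is a minimal sketch of overriding `ConfigureAppConfiguration` to load secrets straight from Key Vault. It assumes the `Azure.Extensions.AspNetCore.Configuration.Secrets` and `Azure.Identity` packages; the namespace and vault URL are placeholders, not the post's actual code:

```csharp
using System;
using Azure.Identity;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.Configuration;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void ConfigureAppConfiguration(IFunctionsConfigurationBuilder builder)
        {
            // Add Key Vault as a configuration source so secrets load the same way
            // locally and in Azure, without relying on Key Vault references.
            builder.ConfigurationBuilder.AddAzureKeyVault(
                new Uri("https://my-vault.vault.azure.net/"),
                new DefaultAzureCredential());
        }

        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Register your services here as usual.
        }
    }
}
```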
@@ -65,7 +65,7 @@ This way, no more copying secret, no more storing secrets values locally, no mor
## The triggers case
- Well, in my title I said *"you **almost** no longer need key vault references"* and the **almost** is important. As the [Azure Functions documentation](https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection#customizing-configuration-sources) about customizing configuration sources mentions:
+ Well, in my title I said *"you **almost** no longer need key vault references"* and the **almost** is important. As the [Azure Functions documentation](https://learn.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection?wt.mc_id=MVP_430820#customizing-configuration-sources) about customizing configuration sources mentions:
::callout{icon="i-heroicons-light-bulb"}
For function apps running in the Consumption or Premium plans, modifications to configuration values used in triggers can cause scaling errors. Any changes to these properties by the FunctionsStartup class result in a function app startup error.
@@ -75,4 +75,4 @@ Therefore, if you use a trigger that needs a secret (the connection string of an
## To conclude
- To summarize, after a quick recap of how Azure Functions configuration works, we have seen how Key Vault references can help to avoid having secret values in settings. We talked about the downside of this approach for the local development experience and how using the Azure Key Vault configuration provider solves that, except when a secret is needed in a trigger.
+ To summarize, after a quick recap of how Azure Functions configuration works, we have seen how Key Vault references can help to avoid having secret values in settings. We talked about the downside of this approach for the local development experience and how using the Azure Key Vault configuration provider solves that, except when a secret is needed in a trigger.
content/1.posts/12.devops-future.md (1 addition & 3 deletions)
@@ -133,9 +133,7 @@ If you have watched Game of Thrones, the practice "divide and rule among your co
Of course, playing "divide and rule among your company" with good old horizontal teams does not look like the most trendy thing to the outside world, yet it is not that complicated to be among the cool kids anyway. There is probably in your company an Application Lifecycle Management team, or maybe a team in charge of automating build and release pipelines for all the other teams, or even a team dedicated to writing scripts to build your software infrastructure. You can take that team (or merge these three teams into one if you have all three) and then just rename it to "DevOps team". Of course, you are not applying DevOps practices by doing that, you just have a bunch of people in a team called "DevOps" that automate stuff for the rest of the company, but who cares? Most people do not know what DevOps is about, so they will just think "What an innovative company, they are doing DevOps, they even have a team dedicated to doing DevOps". In addition to that, people in this new team will feel great because they will have a new and nice job title to put on their CV: "DevOps Engineer" (which is nonsense as DevOps is not a job title, but whatever... recruiters are looking for this). If you prefer, you can also use "Site Reliability Engineering" as the name of your new team; what is important is not what the name means but how it sounds and how it can make your company shine to the outside world.
- ::div{.flex.justify-center}
- :Tweet{id=852879869998501889}
- ::
+ {.rounded-lg .mx-auto}
content/1.posts/14.w12-2021-tips-learned-this-week.md (3 additions & 3 deletions)
@@ -21,7 +21,7 @@ Creating a git tag for your repository stored in Azure DevOps can be done quite
Doing things manually is error-prone and takes time, so for repetitive tasks, it is a good idea to automate them. And Azure Pipelines are great at automating things, especially when it is related to building or deploying code. In my team, what we wanted was to have our CI/CD pipeline compute the version of the code being built and automatically tag the built commit with that version.
- Computing the version in an Azure pipeline is not the topic here, so let's just say there are multiple ways to do that, like using [variables and the counter expression](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#counter) or using the [gitversion task](https://marketplace.visualstudio.com/items?itemName=gittools.gittools).
+ Computing the version in an Azure pipeline is not the topic here, so let's just say there are multiple ways to do that, like using [variables and the counter expression](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops&wt.mc_id=MVP_430820#counter) or using the [gitversion task](https://marketplace.visualstudio.com/items?itemName=gittools.gittools).
Once you know the version, you can use the git command line in a script task to create the tag and push it.
@@ -33,7 +33,7 @@ Once you know the version you can use the git command line in a script task to c
workingDirectory: $(Build.SourcesDirectory)
```
- For this script to work, you have to ensure that the identity that executes your pipeline has the right to push a tag to your repository. Concretely, you have to give the `Contribute` permission to the **user** named `Project Collection Build Service ({your organization})`, as described [here](https://docs.microsoft.com/en-us/azure/devops/pipelines/scripts/git-commands?view=azure-devops&tabs=yaml#grant-version-control-permissions-to-the-build-service).
+ For this script to work, you have to ensure that the identity that executes your pipeline has the right to push a tag to your repository. Concretely, you have to give the `Contribute` permission to the **user** named `Project Collection Build Service ({your organization})`, as described [here](https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/git-commands?view=azure-devops&tabs=yaml&wt.mc_id=MVP_430820#grant-version-control-permissions-to-the-build-service).
Moreover, you need to add an extra checkout task at the beginning of your pipeline. By default, you don't have to add this task because pipelines automatically do a checkout. But in this case, you want to set the `persistCredentials` parameter to true so that the credentials used for the initial checkout are reused by the subsequent git operations in your pipeline.
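As an illustration only (the tag name and the use of `Build.BuildNumber` as the version are assumptions, not the post's actual pipeline), the checkout and tagging steps could look like this:

```yaml
steps:
  - checkout: self
    # Keep the credentials from the initial checkout so later git commands can push.
    persistCredentials: true

  # ... build and test steps ...

  - script: |
      git tag v$(Build.BuildNumber)
      git push origin v$(Build.BuildNumber)
    displayName: Tag the built commit
    workingDirectory: $(Build.SourcesDirectory)
```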
@@ -63,4 +63,4 @@ One thing to note though is that by default Application Insights sets the `Assem
content/1.posts/16.once-upon-a-time-in-dotnet.md (3 additions & 3 deletions)
@@ -29,9 +29,9 @@ I wrote a very basic ASP.NET Core API [`MyLotrApi`](https://github.com/TechWatch
## About using records
- Instead of using basic C# classes for the models in this API, I used [records](https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-9#record-types). Many people are talking about records nowadays because they are one of the trendy new features of C# 9. Unfortunately, that makes other people think records are just more syntactic sugar added to C# that they do not need to use in their code. Yet, there are a lot of benefits to using records.
+ Instead of using basic C# classes for the models in this API, I used [records](https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-9?wt.mc_id=MVP_430820#record-types). Many people are talking about records nowadays because they are one of the trendy new features of C# 9. Unfortunately, that makes other people think records are just more syntactic sugar added to C# that they do not need to use in their code. Yet, there are a lot of benefits to using records.
- In my sample, I declared my models with the [positional syntax for property definition](https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/builtin-types/record#positional-syntax-for-property-definition), which is very concise. Conciseness might not be important to you, but for me it means fewer lines of code to write and maintain, and more clarity.
+ In my sample, I declared my models with the [positional syntax for property definition](https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/builtin-types/record?wt.mc_id=MVP_430820#positional-syntax-for-property-definition), which is very concise. Conciseness might not be important to you, but for me it means fewer lines of code to write and maintain, and more clarity.
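As a reminder of what the positional syntax buys you, here is a small sketch. The `Character` model is illustrative, not the actual model from `MyLotrApi`:

```csharp
using System;

// Value-based equality and non-destructive mutation come for free.
var frodo = new Character("Frodo Baggins", "Hobbit", 51);
var aged = frodo with { Age = 52 };
Console.WriteLine(frodo == new Character("Frodo Baggins", "Hobbit", 51)); // True

// Positional syntax: this single line gives you a constructor, init-only
// properties, equality members, ToString() and a Deconstruct method.
public record Character(string Name, string Race, int Age);
```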
content/1.posts/17.winget-import.md (2 additions & 2 deletions)
@@ -16,7 +16,7 @@ tags:
## About Windows Package Manager
- You have probably already heard of the new [Windows Package Manager](https://docs.microsoft.com/en-us/windows/package-manager/) and its command-line tool `winget`, which allows you to automate installing and upgrading software on your Windows 10 computer.
+ You have probably already heard of the new [Windows Package Manager](https://learn.microsoft.com/en-us/windows/package-manager/?wt.mc_id=MVP_430820) and its command-line tool `winget`, which allows you to automate installing and upgrading software on your Windows 10 computer.
With winget, you can install an application very easily by executing in your terminal a command like this one, which installs PowerToys:
```powershell
@@ -96,4 +96,4 @@ Import is great but there are still things missing like the ability to silently
## Final thoughts
- Chocolatey will continue to be my main package manager for now: on the one hand for the number of available packages, and on the other hand for the ability to specify parameters for a package installation (like the workloads and components to install for Visual Studio 2019). Yet, `winget` will be part of my toolbox as well to install some packages (including Microsoft Store applications), and I expect it to continue to get better and better.
+ Chocolatey will continue to be my main package manager for now: on the one hand for the number of available packages, and on the other hand for the ability to specify parameters for a package installation (like the workloads and components to install for Visual Studio 2019). Yet, `winget` will be part of my toolbox as well to install some packages (including Microsoft Store applications), and I expect it to continue to get better and better.
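For context, the export/import workflow this post revolves around boils down to two commands. This is a sketch; the file name is only an example:

```powershell
# On the source machine: capture the list of installed packages.
winget export --output .\my-packages.json

# On the target machine: install everything listed in the file.
winget import --import-file .\my-packages.json --accept-package-agreements --accept-source-agreements
```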