Recording from HAVIT Educational window from 20th June 2019.
- Demo: https://github.com/hakenr/AskMe
- Live demo: https://askme.blazor.cz
- Slides: 2019-06-20 – HAVIT – Blazor [PDF]
Recording from HAVIT Educational window from 16th May 2019. Lukáš Rada presented an introduction to TypeScript.
Recording from HAVIT Educational window from 28th March 2019. Jiří Činčura presented ref returns in C#.
We are proud to announce Castle Windsor Support for ASP.NET WebForms.
Sources, documentation, and a usage example can be found on GitHub; the NuGet package Havit.CastleWindsor.WebForms is available in the public NuGet feed.
There’s no doubt Redux is pretty verbose. Don’t get me wrong, I know there is a good reason for that, but the price is still quite high. Fortunately, there is a way to reduce boilerplate without losing any fundamental parts of Redux.
I’m going to focus on one of these parts – reducers.
Let’s examine a very common implementation of a reducer, one you can find in every article on the internet.
function todoApp(state = initialState, action) {
  switch (action.type) {
    case SET_VISIBILITY_FILTER:
      return {
        ...state,
        visibilityFilter: action.visibilityFilter
      }
    case ADD_TODO:
      return {
        ...state,
        todos: [
          ...state.todos,
          {
            text: action.text,
            completed: false
          }
        ]
      }
    default:
      return state
  }
}
Cool, but where is the single responsibility principle? A reducer is just a function, but this function knows too much. In this case, our reducer implements both the logic for adding a new todo item and the logic for handling filtering.
The official Redux documentation points this out as well, so keep general good practices in mind in order to write clean code.
Writing functions with a single level of abstraction and a single responsibility is a much better practice. A function should do only one thing, and every statement of a function should be on the same level of abstraction. By obeying these two rules (which, by the way, are valid in both the object-oriented and the functional paradigm) we can achieve cleaner and more readable functions.
A reducer is responsible for creating a new state based on the current state and the data coming in as the action payload.
As I mentioned before, you can find a way to solve this problem in the Redux documentation: extracting the low-level logic from every case statement into separate handler functions and creating a createReducer factory function that maps action types to handlers is way better.
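A minimal sketch of that documented approach might look like this (patterned after the "Reducing Boilerplate" recipe in the Redux docs; treat it as an illustration rather than the article's own code):

function createReducer(initialState, handlers) {
  return function reducer(state = initialState, action) {
    // Delegate to the handler registered for this action type, if any.
    if (Object.prototype.hasOwnProperty.call(handlers, action.type)) {
      return handlers[action.type](state, action)
    }
    return state
  }
}

// Each handler now does exactly one thing:
const todoApp = createReducer(initialState, {
  [SET_VISIBILITY_FILTER]: (state, action) => ({
    ...state,
    visibilityFilter: action.visibilityFilter
  }),
  [ADD_TODO]: (state, action) => ({
    ...state,
    todos: [...state.todos, { text: action.text, completed: false }]
  })
})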
But don’t forget that reducers are pure functions, so we can’t deal with side effects there. The right place for side effects is middleware, and therefore we have to split our logic between the middleware and the handler functions. Sometimes it is not so easy to decide which logic should go into the middleware and which belongs to the handler functions.
I’m going to show you a slightly different approach.
First of all, let’s get rid of the low-level logic in our reducer as well!
In my opinion, there is an even more straightforward way: we can simply put the whole logic into the middleware (a rough sketch follows below). Reducers then only receive actions carrying the final data we are about to store in the state, and they can simply merge the current state with the incoming data to create a new state.
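For illustration only, here is what such a middleware could look like; ADD_TODO_REQUESTED and addTodoMiddleware are hypothetical names, not part of the original example, and the sketch assumes todoApp is the root reducer so that state.todos exists:

// Hypothetical trigger action handled only by the middleware.
const ADD_TODO_REQUESTED = 'ADD_TODO_REQUESTED'

const addTodoMiddleware = store => next => action => {
  if (action.type === ADD_TODO_REQUESTED) {
    // The "business" logic lives here: build the final todos array...
    const todos = [
      ...store.getState().todos,
      { text: action.text, completed: false }
    ]
    // ...and pass on an action that already carries the data to be stored.
    return next({ type: ADD_TODO, todos })
  }
  return next(action)
}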
Let’s make our reducer look like this.
function todoApp(state = initialState, action) {
  switch (action.type) {
    case SET_VISIBILITY_FILTER:
      return {
        ...state,
        visibilityFilter: action.visibilityFilter
      }
    case ADD_TODO:
      return {
        ...state,
        todos: action.todos
      }
    default:
      return state
  }
}
The todo reducer is now much cleaner, but sooner or later we come across another issue: in every case of the switch statement we are forced to do the same thing. We just take the payload of the incoming action and merge it with the current state, over and over again. Actually, I don’t like the switch statement at all. I believe we have (in OOP as well as in the functional paradigm) more advanced ways to solve this kind of situation.
We can create a higher-order function that serves as a factory for creating reducers, similar to the createReducer from the Redux documentation.
We no longer have to take care of reducers ourselves. The only thing we have to do is call the factory function to get a particular reducer that will handle the given actions for us.
const createReducer = (actionTypes, initialState) => (state = initialState, action) => {
  if (actionTypes.some(actionType => actionType === action.type)) {
    const { type, ...actionData } = action;
    return { ...state, ...actionData };
  }
  return state;
}
Now we can create the todoApp reducer by calling the createReducer function.
const todoApp = createReducer([SET_VISIBILITY_FILTER, ADD_TODO], initialState);
The factory function accepts two parameters, an array of action types and an initial state, and returns a new function: a reducer. Based on the given actionTypes array, we can easily find out whether an incoming action belongs to this reducer; if so, we take all fields of the action (excluding the type field) and merge them with the current state. Otherwise, we just return the current state (or the initial state).
The important thing is that the data fields of the action have to be named the same as the fields of the state we are about to change, as the example below shows.
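To illustrate, here are hypothetical action creators (not from the original article) whose payload field names mirror the state fields they overwrite:

// state shape: { visibilityFilter, todos }
const setVisibilityFilter = visibilityFilter => ({
  type: SET_VISIBILITY_FILTER,
  visibilityFilter // merged into state.visibilityFilter
})

const setTodos = todos => ({
  type: ADD_TODO,
  todos // merged into state.todos
})

// createReducer spreads every field except `type` into the new state,
// e.g. { ...state, todos }.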
We don’t need to take care of reducers anymore; a single line of code creates one. No big deal. Less code means less work and less room for mistakes. The logic is centralized in the middleware, and actions carry the final data for the reducers. Reducers do one thing and know nothing about the implementation in the middleware.
I would love to know what you think about it. Don’t hesitate to give me some feedback.
A simple query can help you get basic insight into when the index statistics were last updated:
SELECT
    o.name AS TableName,
    i.name AS IndexName,
    STATS_DATE(i.object_id, i.index_id) AS StatisticsUpdate
FROM sys.objects o
INNER JOIN sys.indexes i ON (o.object_id = i.object_id)
WHERE (i.type > 0)
    AND (o.type_desc NOT IN ('INTERNAL_TABLE', 'SYSTEM_TABLE'))
ORDER BY TableName, IndexName
-- ORDER BY StatisticsUpdate
See also:
For me it was quite confusing to find the 2017 version of LocalDB, and upgrading your local default instance is not a streamlined process. The link for “SQL Server 2017 Express LocalDB” on the official website (https://www.microsoft.com/en-us/sql-server/sql-server-editions-express) leads to “SQLServer2016-SSEI-Expr.exe”, which runs a SQL Server 2016 with SP2 installer. Now what?
The easiest way to upgrade your LocalDB instance to 2017 is:
sqllocaldb stop MSSQLLocalDB
sqllocaldb delete MSSQLLocalDB
sqllocaldb create MSSQLLocalDB
Credits:
If you want to restart your App Service on a scheduled basis, you can do that with a simple PowerShell script:
Stop-AzureRmWebApp -Name '_App Service Name_' -ResourceGroupName '_Resource Group Name_'
Start-AzureRmWebApp -Name '_App Service Name_' -ResourceGroupName '_Resource Group Name_'
Basically, you have to solve two issues: where to run such a script on a schedule, and how to authenticate it without any user interaction.
Let’s start from the second one.
Credits: This is an updated and fixed version of a procedure originally published by Karan Singh, a Microsoft employee, on his MSDN blog.
It is not a good idea to use a real user account to authenticate such a job. If you have an Organizational Account with 2-factor authentication, forget it.
The right way to authenticate your jobs is to use a Service Principal, which allows you to proceed with silent authentication.
To create one, save and run the following PowerShell script from your PC (a one-off task):
param (
    [Parameter(Mandatory=$true, HelpMessage="Enter Azure Subscription name. You need to be Subscription Admin to execute the script")]
    [string] $subscriptionName,

    [Parameter(Mandatory=$true, HelpMessage="Provide a password for SPN application that you would create")]
    [string] $password,

    [Parameter(Mandatory=$false, HelpMessage="Provide a SPN role assignment")]
    [string] $spnRole = "owner"
)

#Initialize
$ErrorActionPreference = "Stop"
$VerbosePreference = "SilentlyContinue"
$userName = $env:USERNAME
$newguid = [guid]::NewGuid()
$displayName = [String]::Format("VSO.{0}.{1}", $userName, $newguid)
$homePage = "http://" + $displayName
$identifierUri = $homePage

#Initialize subscription
$isAzureModulePresent = Get-Module -Name AzureRM* -ListAvailable
if ([String]::IsNullOrEmpty($isAzureModulePresent) -eq $true)
{
    Write-Output "Script requires AzureRM modules to be present. Obtain AzureRM from https://github.com/Azure/azure-powershell/releases. Please refer https://github.com/Microsoft/vsts-tasks/blob/master/Tasks/DeployAzureResourceGroup/README.md for recommended AzureRM versions." -Verbose
    return
}

Import-Module -Name AzureRM.Profile
Write-Output "Provide your credentials to access Azure subscription $subscriptionName" -Verbose
Login-AzureRmAccount -SubscriptionName $subscriptionName
$azureSubscription = Get-AzureRmSubscription -SubscriptionName $subscriptionName
$connectionName = $azureSubscription.SubscriptionName
$tenantId = $azureSubscription.TenantId
$id = $azureSubscription.SubscriptionId

#Create a new AD Application
Write-Output "Creating a new Application in AAD (App URI - $identifierUri)" -Verbose
$secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
$azureAdApplication = New-AzureRmADApplication -DisplayName $displayName -HomePage $homePage -IdentifierUris $identifierUri -Password $secpasswd -Verbose
$appId = $azureAdApplication.ApplicationId
Write-Output "Azure AAD Application creation completed successfully (Application Id: $appId)" -Verbose

#Create new SPN
Write-Output "Creating a new SPN" -Verbose
$spn = New-AzureRmADServicePrincipal -ApplicationId $appId
$spnName = $spn.ServicePrincipalName
Write-Output "SPN creation completed successfully (SPN Name: $spnName)" -Verbose

#Assign role to SPN
Write-Output "Waiting for SPN creation to reflect in Directory before Role assignment"
Start-Sleep 20
Write-Output "Assigning role ($spnRole) to SPN App ($appId)" -Verbose
New-AzureRmRoleAssignment -RoleDefinitionName $spnRole -ServicePrincipalName $appId
Write-Output "SPN role assignment completed successfully" -Verbose

#Print the values
Write-Output "`nCopy and Paste below values for Service Connection" -Verbose
Write-Output "***************************************************************************"
Write-Output "Connection Name: $connectionName(SPN)"
Write-Output "Subscription Id: $id"
Write-Output "Subscription Name: $connectionName"
Write-Output "Service Principal Id: $appId"
Write-Output "Service Principal key: <Password that you typed in>"
Write-Output "Tenant Id: $tenantId"
Write-Output "***************************************************************************"
You will be asked for the Subscription Name and a password for the Service Principal. You will also need to be an admin of your Azure Active Directory to be able to proceed.
Save the results securely. You can use the created Service Principal, which gets the Owner role (or any other role you specify), for many other administrative tasks, although it is a good idea to create a separate Service Principal for every single task.
Use any WebJob deployment procedure of your choice to create a scheduled PowerShell WebJob executing the following script:
$ProgressPreference = "SilentlyContinue"

$password = '_Service Principal Key/Password_'
$secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("_Service Principal Id_", $secpasswd)

Add-AzureRmAccount -ServicePrincipal -Tenant '_Tenant Id_' -Credential $mycreds
Select-AzureRmSubscription -SubscriptionId '_Subscription Id_'

Stop-AzureRmWebApp -Name '_App Service Name_' -ResourceGroupName '_Resource Group Name_'
Start-AzureRmWebApp -Name '_App Service Name_' -ResourceGroupName '_Resource Group Name_'
For manual deployment, you can use the Azure Portal directly:
And it’s done. Just be sure to enable Always On for your App Service to execute the WebJobs on schedule.
You can start the job manually from here (the Start button) if you want to test it, and you can verify the execution results in the Kudu dashboard (the Logs button).
There might be a situation where your Azure subscription was created under a personal Microsoft Account, but you want to manage it with your Organizational (Azure AD) Account.
The trick here is to change the directory of the subscription to your Azure AD directory. Changing the subscription directory is a service-level operation. It doesn’t affect your subscription billing ownership, and the Account Admin still remains the original Microsoft Account.
There are only a few simple steps to follow:
To be able to change the directory, your Microsoft Account owning the subscription must exist in the target Azure AD. To associate the MSA with the AAD:
Now you have to accept the invitation…
Now you can change the directory of the subscription:
Now you have to wait up to 10 minutes for the change to take effect.
To be able to manage the subscription with your Organizational Account, you have to grant it permissions (while still signed in with the original Microsoft Account).
Recording from the fourth lesson (13 Mar 2018) of the Cloud Applications Development course (NSWI152) for MFF UK (summer semester 2017/2018). It is published on our HAVIT YouTube channel.
Topics covered:
You can find the lab instructions on GitHub (LAB4 + LAB5).