

Showing posts from 2017

Part 1 - Overengineering of a cloud application

N-tier architecture is seen nowadays as outdated. People tend to migrate to event-based or microservice architectures. It is very common to see people decide on an architecture based on market trends, ignoring the requirements, business needs and budget.

When you combine this with Azure (or the cloud in general), you can easily end up with a microservice architecture that combines messaging systems and event-driven patterns. Of course, an N-tier application has a lot of disadvantages, but when you have a simple web application, there is no sense in building a complex unicorn designed to survive for 100 years.

I was shocked to review a solution deployed in Azure two months ago that was beautiful from an architectural point of view, but a nightmare in terms of running and development costs.
I will not go into the details of the business requirements, but imagine a system that needs to display some static information and allow users to upload small CSV files that are consolidated in a reportin…

[Post Event] ITCamp Conference 2017 - Cluj-Napoca

What a week! ITCamp Conference took place in Cluj-Napoca. Speakers from all around the globe joined forces with the ITCamp Community team and delivered high-quality sessions. As in previous years, the topics covered by ITCamp Conference spanned all technologies - from JavaScript to containers, Azure to Raspberry Pi, OOP to machine learning.
The speaker list is pretty long and I invite you to check it out. ITCamp Conference had Google employees, Principal Program Managers from Microsoft and, of course, a lot of architects and deeply technical people from the field.
Being part of such an event is a delight. Having live, high-quality sessions in Cluj-Napoca is a unique opportunity, offered by the ITCamp Community each year.
In figures, the conference looks very interesting - more than 40 speakers delivering 40+ sessions over the two days of the conference to more than 500 attendees.

What a great conference! What a week! Great sessions, great speakers, wonderful people - all of them in one…

[Past Event] DevTalks Cluj-Napoca 2017

This week I was invited to DevTalks to talk about cloud infrastructure and how we can isolate a cloud network from the public internet.
DevTalks, as a conference, is at its 3rd edition. This year there were 6 tracks in parallel covering the megatrends of 2017. It was a good conference, with great speakers and interesting sessions.
Below you can find content related to my session.

Network isolated inside a cloud environment
Is it possible to create a private network inside a cloud environment that is fully isolated from the external world? If you want to find out the answer to this question, then you should join the session.
In addition, we will talk about how we can migrate existing infrastructure to the cloud (partially or fully) while preserving the same security level as before.

Network isolated inside a cloud environment Radu Vunvulea DevTalks 2017 Cluj Romania from Radu Vunvulea

Azure Cosmos DB | The perfect place for device topology for world wide solutions

In the world of IoT, devices are distributed all around the globe. The systems that are now on the market offer scalable and distributed communication between devices and our backends.

It is nothing out of the ordinary to have an IoT solution distributed across 3 or 5 locations around the globe. But behind these performant systems we need a storage solution flexible enough for different schemas and, at the same time, powerful enough to scale to whatever size we want.

Relational databases are often used to store data where we have a stable schema. Different devices across the globe, however, require different schemas and data formats. Storing such data in non-relational databases is more natural and simple. A key-value, graph or document database is often more suitable for our needs in IoT scenarios than a relational database.
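The schema-flexibility argument above can be sketched with a toy in-memory document store - this is an illustration of the document model only, not the Cosmos DB SDK, and all device fields are invented for the example:

```javascript
// Toy document store: heterogeneous IoT devices coexist in one
// collection without any schema migration, which is the property
// that makes document databases attractive here.
class DocumentStore {
  constructor() { this.docs = []; }
  insert(doc) { this.docs.push(doc); return doc.id; }
  query(predicate) { return this.docs.filter(predicate); }
}

const store = new DocumentStore();

// Two devices with completely different shapes in the same collection.
store.insert({ id: 'sensor-eu-01', region: 'EU', temperature: 21.4 });
store.insert({ id: 'cam-us-07', region: 'US', resolution: '1080p', fps: 30 });

// Query by a shared attribute; documents keep their own extra fields.
const euDevices = store.query(d => d.region === 'EU');
console.log(euDevices.length); // 1
```

A relational design would force either a sparse wide table or one table per device type; the document model sidesteps both.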

Current solutions
There are plenty of solutions on the market that are fast, powerful and easy to use. I expect that you have heard…

Azure Key Vault | How Secrets and Keys are stored

I'm pretty sure that most of you have heard about Azure Key Vault. If not, I recommend taking a look at this page, which describes in detail how Azure Key Vault acts as a safeguard for our application secrets and cryptographic keys (like certificates).

The main scope of this post is to take a look at how our secrets are stored. This is important because some keys cannot be recovered once generated or stored, and we might end up without keys if we lose them.

What is HSM?
HSM is an acronym for Hardware Security Module. It is a physical device that can manage digital keys by providing cryptographic capabilities. The HSM plays the role of a safeguard by offering cryptographic capabilities directly in hardware.

Is the tuple <keys, secrets> stored inside HSM?

No, there is no need to store this information in the HSM. Secrets are stored outside the HSM, but they are encrypted using a key chain that terminates inside the HSM.
An analogy related to key chains an…

New Azure role - Azure Billing Reader

There is a small new feature on Azure Billing that made my day. Small things matter, and in this case it is 100% applicable.

Until now there was no mechanism to share ONLY billing information with users. You could forward billing information to specific users, which was useful for last month's consumption, but there was nothing you could do directly for the current month.

Current solution
Some shortcuts can be taken to make this information available.
The simplest way is to give the user Co-Administrator rights. This works, but that user will also have access to other resources, not only to the billing data. Having access to all resources can be a downside, especially if you need to give access to a non-technical person who might click the wrong button (smile). You can reduce the access by using RBAC (Role-Based Access Control), but you would need some extra configuration steps.
Another approach is exporting billing and cost information using the Azure REST …

Control Azure Users Access using Role-Based Access Control

Problem
As a customer I want to be able to restrict users' access and rights to Azure Resources that are under the same Azure Subscription.
The requirement can be extended a little by specifying that a user needs to be able to view and access only the Azure Resources he is allowed to, and he shall be able to create or modify only specific resources. If possible, all resources that the user accesses or creates should be under a predefined subnet.
Options
There are multiple approaches to a request like this. Even though Role-Based Access Control is a powerful mechanism for controlling access, the ways we can restrict access are limited and do not allow us to restrict the user as fully as we need. A classical approach is to allow the user to run ARM (Azure Resource Manager) scripts only through a custom component that is hosted by us. Having full control over this component, we can implement any kind of logic and business restriction. The downside of a solution like this is complexity and cost.…
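The core idea behind role-based restriction can be sketched in a few lines - a minimal illustration of the RBAC model, not Azure's actual engine; all role names, users and scopes below are made up for the example:

```javascript
// Minimal RBAC sketch: a role maps to allowed actions, an assignment
// grants a user a role at a scope, and an access check walks the
// user's assignments looking for a matching scope prefix.
const roles = {
  Reader:      ['read'],
  Contributor: ['read', 'write'],
};

const assignments = [
  { user: 'alice', role: 'Reader',      scope: '/subscriptions/s1' },
  { user: 'alice', role: 'Contributor', scope: '/subscriptions/s1/resourceGroups/rg-dev' },
];

function isAllowed(user, action, resourceId) {
  return assignments.some(a =>
    a.user === user &&
    resourceId.startsWith(a.scope) &&
    roles[a.role].includes(action));
}

console.log(isAllowed('alice', 'write', '/subscriptions/s1/resourceGroups/rg-dev/vm1'));  // true
console.log(isAllowed('alice', 'write', '/subscriptions/s1/resourceGroups/rg-prod/vm2')); // false
```

The limitation the post describes follows from this model: roles restrict *actions over scopes*, but cannot express arbitrary business rules such as "only resources in a predefined subnet", which is why a custom component in front of ARM becomes tempting.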

[Post Event] Global Azure Bootcamp event, Cluj-Napoca, April 22th, 2017

For the 5th year in a row, the ITCamp community from Cluj-Napoca joined the Global Azure Bootcamp event. In comparison with other community events organized in Cluj-Napoca, Global Azure Bootcamp is different because it is purely hands-on labs.
There were 3 different workshops of 90 minutes each, where attendees had the opportunity to play with and test Azure Functions, Machine Learning and Azure Resource Manager (ARM). This year we again respected the tradition of organizing the event at the Endava (ISDC) building, a location perfect for hands-on labs. A special thank you to our local sponsor - Endava.

09:00-09:30 - Arrival of participants
09:30-11:00 - Azure Functions (Radu Vunvulea)
11:00-12:30 - Machine learning for mere mortals with Azure ML (Silviu Niculita)
12:30-14:00 - ARM Templates, how to create them, and use them in your CD pipeline (Florin Loghiade)

Workshops descriptions
Azure Functions (Radu Vunvulea) 

What are Azure Functions? Azure's counterpart to AWS Lambda. This is the fastest w…

[Post Event] Microsoft Data Amp—where data gets to work meet-up

Together with the ITCamp community from Cluj-Napoca we joined forces for 2 hours. We wanted to find out more about how we can transform our business with our data - Microsoft Data Amp.
A lot of new stuff was announced during the live streaming, which can be viewed on
If you ask me, the game changers are SQL Server 2017 running on Linux and the full integration between SQL Server and other stacks like R. Nowadays you don't need to move your data outside the database engine to run ML; you have full ML support directly in the database engine.

Running Azure Stream Analytics on Edge Devices

Azure Stream Analytics enables us to analyze data in real time. Connecting multiple streams of data and running queries on top of them without deploying complex infrastructure becomes possible using Azure Stream Analytics.
This Azure service is extremely powerful when your data is in the cloud. But there was no way to analyze the data stream at device or gateway level.
For example, in a medical laboratory you might have 8 or 12 analyzers. To analyze the counters of all these devices and detect a malfunction, you would need to push the counter values to Azure, even if you don't need them for other use cases.

Wait! There is a new service in town that enables us to run the same Azure Stream Analytics queries at gateway or device level - Azure Stream Analytics on Edge Devices. Even if the name is long and contains Azure, the service is a standalone service that runs on-premises.
This service allows us to run the same queries in real time over data streams that w…
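The kind of query involved - averaging device counters over fixed time windows - can be sketched locally the way the edge service would evaluate it. This is an illustration of a tumbling window in plain JavaScript, not the Stream Analytics engine itself; the reading shape is an assumption:

```javascript
// Tumbling-window average: group readings into fixed, non-overlapping
// windows of `windowSeconds` and average the values in each window.
// This mirrors what a GROUP BY TumblingWindow query computes.
function tumblingAverages(readings, windowSeconds) {
  const windows = new Map();
  for (const r of readings) {
    const key = Math.floor(r.ts / windowSeconds) * windowSeconds;
    if (!windows.has(key)) windows.set(key, { sum: 0, count: 0 });
    const w = windows.get(key);
    w.sum += r.value;
    w.count += 1;
  }
  return [...windows.entries()].map(([start, w]) =>
    ({ windowStart: start, avg: w.sum / w.count }));
}

const counters = [
  { ts: 0, value: 10 }, { ts: 5, value: 20 },   // window [0, 30)
  { ts: 31, value: 40 },                        // window [30, 60)
];
console.log(tumblingAverages(counters, 30));
// [ { windowStart: 0, avg: 15 }, { windowStart: 30, avg: 40 } ]
```

Running this aggregation at the gateway means only the per-window averages (or only anomalies) need to leave the laboratory, instead of every raw counter value.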

Global Azure Bootcamp la Cluj-Napoca | April 22, 2017

This is the fifth year in which the ITCamp community organizes Global Azure Boot Camp. This is a global event that takes place in more than 159 locations. Like last year, Cluj is not falling behind and appears on the Azure map. On April 22nd we invite you all to this event in Cluj-Napoca, which will include 3 workshops. Registration link: Attendance is FREE, as it has been until now for every event organized by the ITCamp community. On April 22nd we plan to hold 3 workshops of 90 minutes each, where we can learn together how to use different Azure services. Each workshop includes a theoretical part and a practical one. Because of this, you will need a laptop. What you need: a laptop + Visual Studio + the Microsoft Azure SDK + an Azure account. If you want, you can group together in teams of 2-3 people per laptop. Link for registra…

[Post-Event] Cluj ITCodecamp 2017: Azure Data Lake for super-developers

The best way to end a weekend is with a conference. So this is what I did. At the end of this weekend I joined the Codecamp conference in Cluj-Napoca. As a free event, Codecamp gathers IT speakers and attendees from all technologies and companies, being an opportunity to meet other people.

In figures, there were around 900 people, 60 speakers and around 70 sessions - all of this in one day. At this conference I delivered a high-level session about Azure Data Lake. Being a new service, I decided that a high-level session about it was the best approach. Below you can find more about my session.

Azure Data Lake for super-developers
Nowadays digital information is produced by every device we touch. What we do after we receive this data can change the way we do business, but to store and process it we need a powerful system like Azure Data Lake. In this session we will discover the secrets behind Azure Data Lake and why this service should be…

SQL Server v.Next - Data, analytics and artificial intelligence live event - April 19th, 2017

The IT landscape is changing every day. Nowadays it is more usual than ever to talk about data collection, processing and visualization. If a few years ago we used to talk in TB of data, now PB is a normal data unit.
I remember that 10 years ago having a database bigger than 1GB was acceptable, but not very common for a small or medium-sized company. If we look around us now, a database of 20GB is a commodity. No one will think the database is too big or that you need special hardware for it.
Security has now become the biggest concern; where you persist data, what data you persist and who accesses what data are the questions you need to face.
5 years ago, if you had said that you wanted to run SQL Server on Linux... it would have been a joke, but look where we are now. We have SQL Server on Linux, we have containers and Docker, and we see how Microsoft embraces the open source world.
We are excited to find out more about the new features that SQL Server v.Next is preparing for us. It's not only abou…

[Post-Event] April BucharestJS Meetup #21 | Building Node.JS Together (with Microsoft)

This week I had the opportunity to be invited by the BucharestJS meetup to speak about Node.JS from a Microsoft perspective.
It was a challenge for me - speaking about Microsoft in front of Node.JS people is not an easy task. I was happy to survive an afternoon with around 80 people with no mercy for Microsoft and Azure. I'm sure that some of them will give Visual Studio Code and/or Azure services a try.

For me it was a pleasure to find people who were open to discovering 'exotic' IDEs and products. Thank you to all attendees who accepted to hear about Microsoft technologies for one hour.
More information related to the session, slides and demos can be found below.

Title: Building Node.js Together
Abstract: JavaScript's simplicity, ubiquity, and event-driven paradigm, and its high performance runtimes like Microsoft's Chakra and Google's V8, have led it to spread beyond web pages and browsers to servers and clouds, IoT devices, mobile apps, and more via the Node.js …

Monitor cost consumption cross Azure Subscriptions at Resource Group level

In this post we'll attack the different mechanisms that can be used to monitor Azure consumption across subscriptions. We will take a look at the out-of-the-box solutions that are currently available, their limitations, and a custom solution that could be used for this purpose.

Define the problem
There is a need for a mechanism that can offer information related to Resource Group consumption. We need to be able to provide the total cost of all Resource Groups from all subscriptions with the same characteristic (let's say with a specific tag).
The Azure Subscriptions that need to be monitored are under multiple EAs (Azure Enterprise Agreements). Some of the Azure Subscriptions are not under EA accounts.
Out of the box solution
Microsoft Azure Enterprise content pack for Power BI
At the moment this post was written, Microsoft offers full Power BI integration for all Azure Subscriptions under the same EA - the Microsoft Azure Enterprise content pack for Power BI.
This p…
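The custom aggregation described in the problem definition boils down to a filter-and-sum over usage rows pulled from several subscriptions. A minimal sketch, assuming a hypothetical row shape (the real export format will differ):

```javascript
// Hypothetical usage rows as they might be collected from several
// subscriptions; the shape is invented for this sketch.
const usageRows = [
  { subscription: 'sub-1', resourceGroup: 'rg-a', tags: { project: 'iot' }, cost: 120.5 },
  { subscription: 'sub-1', resourceGroup: 'rg-b', tags: { project: 'web' }, cost: 40.0 },
  { subscription: 'sub-2', resourceGroup: 'rg-c', tags: { project: 'iot' }, cost: 75.25 },
];

// Total the cost of all Resource Groups that carry a given tag,
// regardless of which subscription they live in.
function totalCostByTag(rows, tagName, tagValue) {
  return rows
    .filter(r => r.tags[tagName] === tagValue)
    .reduce((sum, r) => sum + r.cost, 0);
}

console.log(totalCostByTag(usageRows, 'project', 'iot')); // 195.75
```

The hard part in practice is not this aggregation but collecting the rows, since EA and non-EA subscriptions expose billing data through different channels.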

VM Creation: Custom Scripts vs Custom Images

When we need custom applications or configuration on a VM, we can do this on Azure in two ways:
Custom ISO
Custom script extensions (also known as a Formula in the DevTest Labs context)
I noticed that a recurrent question appears in discussions with different people: when should I use a custom ISO vs custom script extensions?
Before jumping into a discussion comparing these two options and the advantages/disadvantages of each, let's see what steps are involved in creating a script or an ISO.

Custom ISO
We can create a custom ISO on our local machine, with all our applications installed on it. Once we have the ISO created, we just need to take our VHD and prepare it for Azure. More about these steps can be found in the Microsoft documentation (Capture a managed image of a generalized VM in Azure and Create custom VM images).
Custom scripts extensions
Custom scripts are executed after the VHD is deployed on the VM. Using these scripts we can push or install any k…

ARM Scripts - Extending T-shirt size concept

Working with Azure Resource Manager (ARM) deployment scripts, as with any other scripts, can be a challenge, especially at the moment when you want to run a deployment script.

How often have you discovered that to be able to run a script you need to specify a lot of parameters? The happy case is when a default value is already specified; even if you don't know what happens behind the scenes, you don't care and just 'click the next button'.
I observed that the number of parameters is directly connected to the size and complexity of the deployment. Beyond a specific threshold, the number of parameters without a default value is so high that it becomes almost impossible to run the scripts.
Because of this, complex pre-deployment steps can require a lot of time, especially when it is the first time you make that deployment or when something changes.

I remember one time I saw a deployment script written in ARM and PowerShell that was state of the art in the way it was designe…
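The T-shirt-size idea from the title can be sketched as a tiny parameter resolver: callers pick one size, and a preset expands it into the full parameter set. All preset values and names below are invented for illustration, not taken from any real ARM template:

```javascript
// T-shirt sizing: collapse many deployment parameters behind a single
// size parameter with sensible presets, while keeping overrides
// possible for edge cases.
const sizePresets = {
  S: { vmSku: 'Standard_B1s',    instanceCount: 1, diskGb: 32 },
  M: { vmSku: 'Standard_D2s_v3', instanceCount: 2, diskGb: 128 },
  L: { vmSku: 'Standard_D8s_v3', instanceCount: 4, diskGb: 512 },
};

function resolveParameters(size, overrides = {}) {
  if (!sizePresets[size]) throw new Error(`unknown size: ${size}`);
  // Preset first, explicit overrides win.
  return { ...sizePresets[size], ...overrides };
}

console.log(resolveParameters('M'));
// { vmSku: 'Standard_D2s_v3', instanceCount: 2, diskGb: 128 }
console.log(resolveParameters('S', { diskGb: 64 }).diskGb); // 64
```

With this pattern, the long list of required parameters the post complains about shrinks to one meaningful choice plus optional overrides.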

[IoT Home Project] Part 9 - Extending Azure Function to support Thieves Alarm

In this post we will discover how to:
Crunch distance (sonar) information produced by a GrovePi sensor and send it, using a Raspberry Pi and Azure IoT Hub, to the backend
Add filters on top of Service Bus Topic subscriptions so that each subscription receives only information related to distance or temperature
Add new functionality to the current portal to display the alarm and notify our user
Story
At this moment we have a system that is able to collect metrics from the Raspberry Pi and process some of the data. We already have a GrovePi sensor that calculates the distance from the sensor to an object. Why not use it to detect if something is moving in front of it and create a simple alarm system?
Yes, this is not the best sensor we could use, but this scenario is a great way to learn something new.
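The alarm logic itself is simple: treat any distance reading closer than a threshold as movement in front of the sensor. A minimal sketch, where the threshold and field names are assumptions for illustration:

```javascript
// Raise an alarm when any sonar reading comes closer than the
// threshold, i.e. something has moved in front of the sensor.
function detectIntrusion(readings, { maxDistanceCm = 50 } = {}) {
  return readings.some(r => r.distanceCm < maxDistanceCm);
}

const quiet = [{ distanceCm: 210 }, { distanceCm: 208 }];     // wall only
const movement = [{ distanceCm: 209 }, { distanceCm: 35 }];   // object passed by

console.log(detectIntrusion(quiet));    // false
console.log(detectIntrusion(movement)); // true
```

In the real pipeline this check would run over the distance stream arriving through IoT Hub, with the portal notified whenever it returns true.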

Previous post: [IoT Home Project] Part 8 - Connecting to Azure Function and to a virtual heat pump
GitHub source code:

What we have until now

[IoT Home Project] Part 8 - Connecting to Azure Function and to a virtual heat pump

In this post we will discover how to:
Push content from Azure Stream Analytics to Azure Service Bus
Write an Azure Function in Node.JS that fetches data from an Azure Service Bus Topic and pushes it to Azure Table
Develop an ASP.NET Core application that plays the role of a heating system that starts/stops the heating in a house
Story:
Use temperature data collected from sensors connected to the Raspberry Pi to start/stop the heating system of a house
Previous post: [IoT Home Project] Part 7 - Read/Write data to device twin
GitHub source code: 

Push content from Azure Stream Analytics to Azure Service Bus
This step is the simplest one. We just need to add a new output to Stream Analytics and specify the Service Bus Topic where we want to push data. After this, we'll need to update the Stream Analytics query by adding: SELECT * INTO outputSensorDataTopic FROM avgdata, where 'outputSensorDataTopic' is the name of the output that we crea…