How to use a Local Storage Resource

The Windows Azure Managed Library provides classes for accessing the local storage resource from within code that is running in a role instance.

You just need to retrieve the full path of a named local storage resource, and then you can store any file you want there. To retrieve the full path, you simply need this line of code:

RoleEnvironment.GetLocalResource("MainLocalStorage").RootPath
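
For example, a minimal sketch that writes a temporary file into that storage could look like this (the resource name "MainLocalStorage" and the file name are just examples; the usual System.IO and Microsoft.WindowsAzure.ServiceRuntime namespaces are assumed):

// Resolve the local storage resource defined in the service definition
LocalResource localStorage = RoleEnvironment.GetLocalResource("MainLocalStorage");

// Build a path inside the reserved local storage folder and write a temporary file there
string filePath = Path.Combine(localStorage.RootPath, "temp.log");
File.WriteAllText(filePath, "Hello from local storage!");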

Create Local Storage on Cloud Services to store temporary files

On a cloud service you can create a small local storage where you can save temporary files. Temporary is the key word: local storage is not guaranteed to be durable, so for durable storage in Azure you should consider Azure SQL Database or Azure Storage (blobs and tables). This reserved space can be useful for custom error log files (NLog, log4net, …), for files that are dynamically created by the application and that you would like to cache somewhere, and also for structured data files such as embedded databases (SQL CE, SQLite, …), but remember: only for volatile data.

You can create a local storage by manually editing the Service Definition configuration file, or through the Visual Studio user interface:

Open the Properties of the Web Role under the Cloud Service project, go to the Local Storage section, click “Add Local Storage”, name the storage, give it a size, and save.

Create Local Storage
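
If you prefer to edit the Service Definition configuration file by hand, the resulting entry should look roughly like this (a sketch: the role name, storage name and size are only examples):

<ServiceDefinition name="MyAzureApplication" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
    <WebRole name="MyMvcWebRole">
        <LocalResources>
            <LocalStorage name="MainLocalStorage" sizeInMB="1024" cleanOnRoleRecycle="false" />
        </LocalResources>
    </WebRole>
</ServiceDefinition>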

For more information about Local Storage Resources visit http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx

How to add a Windows Azure Cloud Service Project to an existing Web Project

You can accomplish this in one easy step: right-click the Web Project and select the menu entry “Add Windows Azure Cloud Service Project”.


How to know if your code is running on the Windows Azure compute emulator

To find out whether the role instance is running in the Windows Azure compute emulator, you simply need to check this static property:

RoleEnvironment.IsEmulated

Namespace: Microsoft.WindowsAzure.ServiceRuntime
Assembly: Microsoft.WindowsAzure.ServiceRuntime (in Microsoft.WindowsAzure.ServiceRuntime.dll)

http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleenvironment.isemulated.aspx

How to know if your code is running in the Windows Azure environment

To find out whether the role instance is running in the Windows Azure environment, you simply need to check this static property:

RoleEnvironment.IsAvailable

Namespace: Microsoft.WindowsAzure.ServiceRuntime
Assembly: Microsoft.WindowsAzure.ServiceRuntime (in Microsoft.WindowsAzure.ServiceRuntime.dll)

http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleenvironment.isavailable.aspx
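
As a minimal sketch, the two properties are typically combined like this (assuming a reference to Microsoft.WindowsAzure.ServiceRuntime):

if (RoleEnvironment.IsAvailable)
{
	if (RoleEnvironment.IsEmulated)
	{
		// Running in the Windows Azure compute emulator (local development)
	}
	else
	{
		// Running in the real Windows Azure environment
	}
}
else
{
	// Not running under the Windows Azure service runtime at all (e.g. plain IIS or a unit test)
}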

Adding a PaaS Cloud Service (Web, Worker Role) to a Virtual Network

A PaaS cloud service, web or worker role, can be added to a Virtual Network only by changing its service configuration file (ServiceConfiguration.Cloud.cscfg).

You need to add the NetworkConfiguration node just after the Role node, as follows:

<?xml version="1.0" encoding="utf-16"?>
<ServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" serviceName="MyAzureApplication" osFamily="3" osVersion="*" schemaVersion="2013-03.2.0" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
    <Role name="MyMvcWebRole">
        <ConfigurationSettings>
            <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx" />
        </ConfigurationSettings>
        <Instances count="1" />
    </Role>
    <NetworkConfiguration>
        <VirtualNetworkSite name="myazure-vnet"/>
        <AddressAssignments>
            <InstanceAddress roleName="MyMvcWebRole">
                <Subnets>
                    <Subnet name="MainSubnet"/>
                </Subnets>
            </InstanceAddress>
        </AddressAssignments>
    </NetworkConfiguration>
</ServiceConfiguration>

Visual Studio: Create self-signed certificate for ClickOnce (.pfx)

When you create a ClickOnce deployment you should sign the automatically generated manifest with an Authenticode certificate, either taken from the local computer certificate store or provided as a .pfx file.

To create a homemade self-signed .pfx file (for testing purposes only!), open the “Visual Studio Command Prompt (2010)” or the “Developer Command Prompt for VS2012” and run the following two commands:

makecert.exe -sv TestCodeSign.pvk -n "CN=Test Code Sign" TestCodeSign.cer

pvk2pfx.exe -pvk TestCodeSign.pvk -spc TestCodeSign.cer -pfx TestCodeSign.pfx -po password

If you want, you can also omit the password.

Now that you have your own homemade certificate you can use it wherever a code-signing certificate is required; it is especially useful with command-line tools like mage (or mageUI, with GUI support).
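
For example, signing a ClickOnce manifest from the command line could look roughly like this (a sketch; the manifest file name is just a placeholder):

mage.exe -Sign MyApp.exe.manifest -CertFile TestCodeSign.pfx -Password password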

Stay Tuned! 😉

C#: Enable Automatic Decompression on System.Net.WebClient

The WebClient class is really easy to use, but it doesn’t seem to provide much control over the underlying request. We can, however, inherit from it to gain control over the WebRequest that the WebClient uses.

With this in mind I could extend WebClient to enable the powerful AutomaticDecompression feature of HttpWebRequest, and so download compressed web resources:

using System;
using System.Net;

public class AutomaticDecompressionWebClient : WebClient
{
	// Intercept the creation of the underlying WebRequest to turn on automatic decompression
	protected override WebRequest GetWebRequest(Uri address)
	{
		var request = base.GetWebRequest(address) as HttpWebRequest;
		if (request == null) throw new InvalidOperationException("You cannot use this WebClient implementation with an address that is not an http uri.");
		request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
		return request;
	}
}
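
A quick usage sketch (the URL is just a placeholder): the client now transparently advertises gzip/deflate support and decompresses the response for you.

var client = new AutomaticDecompressionWebClient();
string html = client.DownloadString("http://example.com/");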

Azure Cloud: SQL Virtual Machine – Access with Management Studio over the Internet

Today I was trying to connect, using SQL Server Management Studio (SSMS), to a SQL Server 2012 instance installed on a Windows Azure virtual machine. Even though I followed every step of the easy MSDN guide, I was not able to connect, as if something were blocking communication on port 1433. Despite having opened the endpoint on the virtual machine from the Windows Azure control panel, I still could not connect. Then I realized that some ports are probably blocked by default by Microsoft, so I changed the public port from 1433 to a high-numbered one, like the others I had seen in the control panel, and everything worked:

Sql Server SSMS To Azure

Honestly, I have no documentation confirming that Microsoft actually blocks the ports of well-known services (perhaps for security reasons), but I have not found a way to use the classic 1433 as the public port, while the random 55890 port worked perfectly.
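
In practice this means connecting from SSMS using the cloud service DNS name together with the non-standard public port, for example (the host name is just a placeholder):

myvirtualmachine.cloudapp.net,55890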

This is the MSDN guide that I followed

I hope this advice will save you valuable minutes, if not hours.

Harlem Shake Your Site!

With Chrome, open a website and activate the “JavaScript Console” from the “Tools” menu (CTRL + SHIFT + J), then copy and paste the following script and press Enter:

javascript:(function(){function c(){var e=document.createElement("link");e.setAttribute("type","text/css");e.setAttribute("rel","stylesheet");e.setAttribute("href",f);e.setAttribute("class",l);document.body.appendChild(e)}function h(){var e=document.getElementsByClassName(l);for(var t=0;t<e.length;t++){document.body.removeChild(e[t])}}function p(){var e=document.createElement("div");e.setAttribute("class",a);document.body.appendChild(e);setTimeout(function(){document.body.removeChild(e)},100)}function d(e){return{height:e.offsetHeight,width:e.offsetWidth}}function v(i){var s=d(i);return s.height>e&&s.height<n&&s.width>t&&s.width<r}function m(e){var t=e;var n=0;while(!!t){n+=t.offsetTop;t=t.offsetParent}return n}function g(){var e=document.documentElement;if(!!window.innerWidth){return window.innerHeight}else if(e&&!isNaN(e.clientHeight)){return e.clientHeight}return 0}function y(){if(window.pageYOffset){return window.pageYOffset}return Math.max(document.documentElement.scrollTop,document.body.scrollTop)}function E(e){var t=m(e);return t>=w&&t<=b+w}function S(){var e=document.createElement("audio");e.setAttribute("class",l);e.src=i;e.loop=false;e.addEventListener("canplay",function(){setTimeout(function(){x(k)},500);setTimeout(function(){N();p();for(var e=0;e<O.length;e++){T(O[e])}},15500)},true);e.addEventListener("ended",function(){N();h()},true);e.innerHTML=" <p>If you are reading this, it is because your browser does not support the audio element. We recommend that you get a new browser.</p> <p>";document.body.appendChild(e);e.play()}function x(e){e.className+=" "+s+" "+o}function T(e){e.className+=" "+s+" "+u[Math.floor(Math.random()*u.length)]}function N(){var e=document.getElementsByClassName(s);var t=new RegExp("\\b"+s+"\\b");for(var n=0;n<e.length;){e[n].className=e[n].className.replace(t,"")}}var e=30;var t=30;var n=350;var r=350;var i="//s3.amazonaws.com/moovweb-marketing/playground/harlem-shake.mp3";var s="mw-harlem_shake_me";var o="im_first";var u=["im_drunk","im_baked","im_trippin","im_blown"];var a="mw-strobe_light";var f="//s3.amazonaws.com/moovweb-marketing/playground/harlem-shake-style.css";var l="mw_added_css";var b=g();var w=y();var C=document.getElementsByTagName("*");var k=null;for(var L=0;L<C.length;L++){var A=C[L];if(v(A)){if(E(A)){k=A;break}}}if(A===null){console.warn("Could not find a node of the right size. Please try a different page.");return}c();S();var O=[];for(var L=0;L<C.length;L++){var A=C[L];if(v(A)){O.push(A)}}})()
 
“Enjoy” your harlem shake site! What a stupid thing 🙂