Create Local Storage on Cloud Services to store temporary files

On a cloud service you can create a small local storage area where you can save temporary files. Yes, I said temporary, because local storage is not guaranteed to be durable: for durable storage in Azure you should consider Azure SQL Database or Azure Storage (blobs and tables). This reserved space can be useful for custom error log files (NLog, log4net, …), for files that are dynamically created by the application and that you want to cache somewhere, and also for structured data files like embedded databases (SQL CE, SQLite, …), but remember, only for volatile data.
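At runtime the role resolves the path of this reserved space through the service runtime API. Here is a minimal sketch, assuming a local storage resource named “TempFiles” has been defined for the role (the resource name and file name are just placeholders; the configuration itself is described below):

using System;
using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

// Resolve the local storage resource configured for this role
// ("TempFiles" is a placeholder: use the name you gave to your resource).
LocalResource tempStorage = RoleEnvironment.GetLocalResource("TempFiles");

// Write a custom error log file into the reserved space (volatile data only!)
string logPath = Path.Combine(tempStorage.RootPath, "custom-errors.log");
File.AppendAllText(logPath, DateTime.UtcNow + " - something went wrong" + Environment.NewLine);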

You can create a local storage either by editing the Service Definition configuration file manually, or through the Visual Studio user interface:

Open the Properties of the Web Role under the Cloud Service project, go to the Local Storage section, click on “Add Local Storage”, name the storage, give it a size, and save.

Create Local Storage
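If you prefer to edit the Service Definition file by hand, the entry added to ServiceDefinition.csdef should look roughly like this (service name, role name, storage name and size are just placeholders matching the UI step above):

<ServiceDefinition name="MyAzureApplication" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
    <WebRole name="MyMvcWebRole">
        <!-- existing role configuration (sites, endpoints, ...) stays as it is -->
        <LocalResources>
            <LocalStorage name="TempFiles" cleanOnRoleRecycle="true" sizeInMB="1024" />
        </LocalResources>
    </WebRole>
</ServiceDefinition>

Setting cleanOnRoleRecycle to true underlines the volatile nature of this space: the content may be wiped whenever the role instance is recycled.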

For more information about Local Storage Resources visit http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx


How to add a Windows Azure Cloud Service Project to an existing Web Project

You can accomplish this easy step by simply right-clicking on the Web Project and then selecting the menu entry “Add Windows Azure Cloud Service Project”.


Adding a PaaS Cloud Service (Web, Worker Role) to a Virtual Network

A PaaS cloud service, web or worker role, can be added to a Virtual Network only by changing its service configuration file (ServiceConfiguration.Cloud.cscfg).

You need to add the NetworkConfiguration node just after the Role node as follows:

<?xml version="1.0" encoding="utf-16"?>
<ServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" serviceName="MyAzureApplication" osFamily="3" osVersion="*" schemaVersion="2013-03.2.0" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
    <Role name="MyMvcWebRole">
        <ConfigurationSettings>
            <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx" />
        </ConfigurationSettings>
        <Instances count="1" />
    </Role>
    <NetworkConfiguration>
        <VirtualNetworkSite name="myazure-vnet"/>
        <AddressAssignments>
            <InstanceAddress roleName="MyMvcWebRole">
                <Subnets>
                    <Subnet name="MainSubnet"/>
                </Subnets>
            </InstanceAddress>
        </AddressAssignments>
    </NetworkConfiguration>
</ServiceConfiguration>

Visual Studio: Create self-signed certificate for ClickOnce (.pfx)

When you want to create a ClickOnce deployment you should sign the automatically generated manifest using an Authenticode certificate, either providing a certificate taken from the local computer certificate store or passing a .pfx file.

To create a homemade self-signed .pfx file (for testing purposes only!!!), open the “Visual Studio Command Prompt (2010)” or the “Developer Command Prompt for VS2012” and run the following two commands:

makecert.exe -sv TestCodeSign.pvk -n "CN=Test Code Sign" TestCodeSign.cer

pvk2pfx.exe -pvk TestCodeSign.pvk -spc TestCodeSign.cer -pfx TestCodeSign.pfx -po password

If you want, you can also omit the password.

Now that you have your own homemade certificate you can use it; it is especially useful with command-line tools like mage (or mageUI, which has GUI support).
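For example, signing a deployment manifest with mage could look something like this (the manifest file name is just a placeholder, and the password is the one chosen in the pvk2pfx step above):

mage.exe -Sign MyApp.application -CertFile TestCodeSign.pfx -Password password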

Stay Tuned! 😉

TFS: Scorch to ensure source control and the local disk are identical with TFS Power Tools

Today, looking at my TFS workspace, I found that it was much larger than the space a clean download of the sources would take. This of course is because of all the binaries built for debug, release and all the other build configurations, but also because of old files, branches and projects that are no longer in source control but are still on disk because TFS never removed them.

To clean up this situation I found the TFS Power Tools really helpful, in particular the command: tfpt scorch

You can download the Team Foundation Server 2010 + SP1 version from here:

http://visualstudiogallery.msdn.microsoft.com/c255a1e4-04ba-4f68-8f4e-cd473d6b971f

the TFS 2012 version from here:

http://www.microsoft.com/en-us/download/details.aspx?id=35775

the TFS 2013 version from here:

http://visualstudiogallery.msdn.microsoft.com/f017b10c-02b4-4d6d-9845-58a06545627f

and finally the TFS 2015 version from here:

https://visualstudiogallery.msdn.microsoft.com/898a828a-af00-42c6-bbb2-530dc7b8f2e1

Here’s the complete help taken from the executable itself (the 2011 version for TFS 2010 + SP1):

tfpt scorch – Ensure source control and the local disk are identical

Your local disk will be scanned for:
(1) items that are not in source control
(2) items which are different on disk from the workspace version
(3) items which are in the workspace but are missing on disk

Items not in source control will be deleted from disk, just as with the
tfpt treeclean command. Items determined to be different on disk from the
workspace version will be redownloaded from the server. Items missing on
disk will also be redownloaded. Items with pending changes are exempted.

By default, items deleted from your local disk (#3 above) will not be
scanned for, and local items are determined to be identical/different from
the workspace version *solely by examining the read-only bit on the file*.

To redownload items deleted from your local disk (#3 above), supply the
/deletes option. To detect items which are different from the workspace
version but still have their read-only bit set (+R), supply the /diff option.
When using either or both of these options, tfpt scorch runs more slowly.

Usage: tfpt scorch [/exclude:filespec1,filespec2,…] [filespec…]
[/recursive] [/batchsize:num] [/noprompt [/preview]]
[/deletes] [/diff]

/noprompt Do not show the list of items to be deleted and
redownloaded in a dialog box for confirmation
/exclude:filespec[,..] Files and directories matching a filespec in this list
are excluded from processing
/preview Do not make changes; only list the potential actions
/recursive Switch from one level of recursion to full recursion
/deletes Detect and replace items missing from the local disk
/diff Use MD5 hashes to compare items with source control
/batchsize:num Set the batch size for server calls (default 500)
filespec… Only files and directories matching these filespecs
are processed (inclusion list)

I put my workspace in a clean state, without any items checked out, to avoid losing any changes, then I ran this command:

tfpt scorch /recursive /deletes

The /recursive argument switches from one level of recursion to full recursion, making a deeper analysis;
the /deletes argument also detects files missing from the local disk and replaces them with the files from source control.

After the command finishes running, you will be prompted to confirm the changes that will be made.

tfpt scorch deletes

Another command that you can use just to remove the files that are not under version control is treeclean:

tfpt treeclean /recursive

tfpt treeclean – Delete files and folders not under version control

Usage: tfpt treeclean [/exclude:filespec1,filespec2,…] [filespec…]
[/recursive] [/batchsize:num] [/noprompt [/preview]]

/noprompt Operate in command-line mode only
/exclude:filespec[,..] Files and directories matching a filespec in this list
are excluded from processing
/preview Do not make changes; only list the potential actions
/recursive Switch from one level of recursion to full recursion
/batchsize:num Set the batch size for server calls (default 500)
filespec… Only files and directories matching these filespecs
are processed

This saved me from spending a boring afternoon on cleanup scripts; I hope it helps you too.