Archive

Posts Tagged ‘Code’

More on Grooming IIS Logs

September 9th, 2011 No comments

I previously blogged about how to delete blob entries older than a given date.

I have extended the script slightly: before the actual removal it now downloads the entries, zips them and uploads the zip to another storage container. Before the deletion is carried out, a simple file size check is performed to ensure the upload succeeded.

The sequence is: download the blob entries, zip them, upload the zip to a backup container, verify the upload by comparing file sizes and finally delete the original entries.

And the script:

function FileIsLocked( [string] $filePath )
{
    $script:locked = $false
    $fileInfo = New-Object System.IO.FileInfo $filePath
    trap
    {
        # If we end up in here, the file is locked
        $script:locked = $true
        continue
    }
    # Try to open the file exclusively; this throws if another process holds it
    $fileStream = $fileInfo.Open(
        [System.IO.FileMode]::OpenOrCreate,
        [System.IO.FileAccess]::ReadWrite,
        [System.IO.FileShare]::None )
    if ($fileStream)
    {
        $fileStream.Close()
    }
    $script:locked
}

# Name of your account

$accountName = "<Account Name>"

# Account key
$accountKey = "<Account Key>"

# Get current date on format YYYYMMDD, e.g. 20110906
$datePart = Get-Date -f "yyyyMMdd"

# Location where blob entries should be downloaded to
$downloadLocation = "C:\Temp\" + $datePart

# Name of source and target storage container
$containerName = "wad-iis-logfiles"
$targetContainerName = "backup"

# Download and remove blob entries older than 90 days
$endTime = (Get-Date).AddDays(-90)

# Download blob entries
Write-Host "Export-BlobContainer"
Export-BlobContainer `
    -Name $containerName `
    -DownloadLocation $downloadLocation `
    -MaximumBlobLastModifiedDateTime $endTime `
    -AccountName $accountName `
    -AccountKey $accountKey

# Zip files
Write-Host "Zip logfiles"
$zipFileName = $downloadLocation + "\IISLog" + $datePart + ".zip"

# Create an empty zip file: "PK" followed by the end-of-central-directory record
Set-Content $zipFileName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipFileName).IsReadOnly = $false

# Use the shell to compress the download folder into the zip file.
# CopyHere works asynchronously, hence the lock checking below.
$zipFile = (New-Object -com shell.application).NameSpace($zipFileName)
$zipFile.CopyHere($downloadLocation)

Write-Host "Check if File is locked"
Start-Sleep -s 30
$fileIsLocked = FileIsLocked $zipFileName
Write-Host $fileIsLocked

while ($fileIsLocked)
{
    Write-Host "File Locked"
    Start-Sleep -s 30
    $fileIsLocked = FileIsLocked $zipFileName
}

# Upload zip-file
Write-Host "Import-File"
Import-File `
    -File $zipFileName `
    -BlobContainerName $targetContainerName `
    -CompressBlob `
    -LoggingLevel "Detailed" `
    -AccountName $accountName `
    -AccountKey $accountKey

# Check if upload went well by comparing file size
$ext = "zip"
$timeTo = (Get-Date -f "yyyy-MM-dd").ToString() + " 23:59:59"
$timeFrom = (Get-Date -f "yyyy-MM-dd").ToString() + " 00:00:00"

$bfc = New-Object Cerebrata.AzureUtilities.ManagementClient.StorageManagementEntities.BlobsFilterCriteria
$bfc.BlobNameEndsWith = $ext
$bfc.LastModifiedDateTimeTo = $timeTo
$bfc.LastModifiedDateTimeFrom = $timeFrom

$blobInfo = Get-Blob `
    -BlobContainerName $targetContainerName `
    -IncludeMetadata `
    -BlobsFilterCriteria $bfc `
    -AccountName $accountName `
    -AccountKey $accountKey

$fileInfo = (New-Object IO.FileInfo "$zipFileName")

if ($blobInfo.Size -eq $fileInfo.Length)
{
    # Delete downloaded files (and zip-file)
    Remove-Item -Recurse -Force $downloadLocation

    # Remove the exported blob entries from the source container.
    # Note: a fresh filter is needed here; $bfc matches today's zip file,
    # not the log entries older than 90 days that were just exported.
    $removeCriteria = New-Object Cerebrata.AzureUtilities.ManagementClient.StorageManagementEntities.BlobsFilterCriteria
    $removeCriteria.LastModifiedDateTimeTo = $endTime
    Remove-Blob `
        -BlobContainerName $containerName `
        -BlobsFilterCriteria $removeCriteria `
        -AccountName $accountName `
        -AccountKey $accountKey
}
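
Note that the script assumes the Cerebrata Azure Management Cmdlets (Export-BlobContainer, Import-File, Get-Blob and Remove-Blob) are already loaded into the PowerShell session. With the snap-in based distribution that looks something like the following; the exact snap-in name depends on the version installed, so treat it as a sketch:

# Load the Cerebrata cmdlets before running the script.
# Assumption: the snap-in name varies by version; use Get-PSSnapin -Registered
# to find the exact name on your machine.
Add-PSSnapin AzureManagementCmdletsSnapIn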

Categories: Azure Tags:

Grooming Windows Azure Diagnostics Storage and IIS Logs

August 17th, 2011 No comments

People working with Windows Azure are aware that the storage used for diagnostics will continue to grow perpetually if nothing is done about it.

With the introduction of the Windows Azure Management Pack – I call it WAMP; I don’t know if it has an official acronym yet – System Center Operations Manager (SCOM) is able to groom the tables.

By default the following three rules are disabled in WAMP:

  • Windows Azure Role Performance Counter Grooming
  • Windows Azure Role .NET Trace Grooming
  • Windows Azure Role .NET Event Log Grooming

Once they have been enabled, each rule runs on a periodic basis and deletes data older than T hours from the relevant table.

An online guide to WAMP is available here.

Unfortunately WAMP/SCOM does not come with similar functionality to groom the IIS logs located in blob storage. By default these logs are written to blob storage once every hour, so after a few months in production there are quite a number of them. And remember that it is one log entry per instance.
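
The blob names in the wad-iis-logfiles container typically follow a pattern along these lines (illustrative only; the exact layout depends on the deployment), which is part of why browsing for a specific log quickly becomes tedious:

<deployment id>/<role name>/<role instance id>/W3SVC1/u_ex11081712.log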

The cost of storage is not very big, so it would be difficult to argue for an automated grooming solution if price were the only parameter considered. However, as the number of entries grows, it takes longer and longer to identify the relevant one.

To overcome this challenge one can write a small PowerShell script using e.g. the Azure Management Cmdlets developed by Cerebrata.

The script could look like the following:

# Name of your account
$accountName = "<Account Name>"

# Account key
$accountKey = "<Account Key>"

# Name of container, e.g. wad-iis-logfiles
$containerName = "<Container Name>"

# Remove blob entries last modified before this UTC date
$lastModified = "<UTC Date>"

$bfc = New-Object Cerebrata.AzureUtilities.ManagementClient.StorageManagementEntities.BlobsFilterCriteria
$bfc.LastModifiedDateTimeTo = $lastModified

Remove-Blob `
    -BlobContainerName $containerName `
    -BlobsFilterCriteria $bfc `
    -AccountName $accountName `
    -AccountKey $accountKey

The script will remove all blob entries older than the date given.
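
As a concrete example, a 90-day retention could be expressed like this (illustrative values; this assumes the filter criteria accept the same date string format used in the script above):

$containerName = "wad-iis-logfiles"
$lastModified = (Get-Date).AddDays(-90).ToUniversalTime().ToString("yyyy-MM-dd HH:mm:ss")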

Categories: Azure Tags:

FTP from Windows Azure Blob Storage

April 11th, 2011 1 comment

In connection with migrating an old CMS system to Windows Azure, I have been playing around with Windows Azure Blob Storage. The CMS system was 10 years old and written in classic ASP, using a lot of local resources, mainly the file system. I’ll try to write a blog post later on the few tweaks you have to make to enable Windows Azure to execute classic ASP.

One feature of the old CMS system was that you could use FTP to move files to be displayed on the site. Getting files into Blob storage is relatively easy; you can even mount a Blob container and use it as a (network) drive.

A colleague asked me if it was possible to FTP files out of Blob storage (something about moving very large movie files, and triggering the process on the Azure side).

It turned out to be relatively easy to do this.

We first set up the CloudStorageAccount and the CloudBlobClient:

var blobStorage = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
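
// Note (not in the original post): when running in a role,
// CloudStorageAccount.SetConfigurationSettingPublisher must have been called,
// typically in the role's OnStart, before FromConfigurationSetting will work.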

var blobClient = blobStorage.CreateCloudBlobClient();

blobClient.RetryPolicy = RetryPolicies.Retry(4, TimeSpan.Zero);

 

Next we get a reference to the blob entry. For the sake of simplicity we hardcode the name, but you could of course loop through all entries like this:

 

IEnumerable<IListBlobItem> blobs = container.ListBlobs();

if (blobs != null)
{
    foreach (var blobItem in blobs)
    {
        // handle each blobItem here
    }
}

and then handling each blobItem.

 

var containerName = "karstens";

CloudBlobContainer container = blobClient.GetContainerReference(containerName);

var fileName = "Windows Azure Platform.pdf";

var uniqueBlobName = string.Format("{0}/{1}", containerName, fileName);

CloudBlockBlob blob = blobClient.GetBlockBlobReference(uniqueBlobName);
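
// Note (my understanding of the v1 StorageClient, not stated in the post):
// the blob name is prefixed with the container name because
// GetBlockBlobReference resolves relative addresses against the client's base URI.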

Next we need to set up the FTP connection. First some housekeeping; again, you would probably not hardcode this:

var ftpServerIP = "<FTPSERVER>";

var ftpUserID = "<USERID>";

var ftpPassword = "<PASSWORD>";

var ftpFilename = "ftp://" + ftpServerIP + "//" + fileName;

 

To do the actual FTP transfer we use the FtpWebRequest class:

FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create(ftpFilename);

ftpReq.Method = WebRequestMethods.Ftp.UploadFile;

ftpReq.Credentials = new NetworkCredential(ftpUserID, ftpPassword);

If the files you wish to transfer are binary, remember to set the UseBinary property to true:

ftpReq.UseBinary = true;

 

You may also have to turn off the proxy for the FTP request. This is done by setting the Proxy property to null:

 

ftpReq.Proxy = null;

 

The next step is to download the blob entry into a byte array, set the content length on the FTP request and write the bytes to the (FTP) request stream:

Byte[] b = blob.DownloadByteArray();

ftpReq.ContentLength = b.Length;

using (Stream s = ftpReq.GetRequestStream())

{

    s.Write(b, 0, b.Length);

}            
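
Strictly speaking the upload is completed when the request stream is closed, but it can be worth reading the response as well to surface any error from the FTP server. A small optional addition (not in the original post):

using (FtpWebResponse ftpResp = (FtpWebResponse)ftpReq.GetResponse())
{
    Console.WriteLine(ftpResp.StatusDescription);
}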

 

And we are done.

If you wish to automate this, you could have the WorkerRole monitor the Blob container and FTP any new items.

Categories: Azure Tags:

Formatting code

August 29th, 2009 No comments

I’m trying to find the best plugin for formatting the code.

Categories: Admin Tags: