Using the legacy built-in MS-DOS command copy con

Using copy con is still possible in the command prompt of modern Windows.

copy con is a legacy command that allows plain MS-DOS to create text files such as autoexec.bat and config.sys.

From my experience, the command has been supported since MS-DOS 3.30 and is still available today. I have yet to test it on Windows 11.

To commit and save the file to disk, press CTRL + Z followed by Enter.
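
As a minimal illustration (the file name and its contents below are only an example), a copy con session in the command prompt looks like this:

C:\>copy con autoexec.bat
@echo off
prompt $p$g
^Z
        1 file(s) copied.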

This command was learned/discovered in the days before the Internet existed, by attentively looking over the shoulders of seniors who were not willing to teach during computer club sessions.

@echo off
REM Dump the list of saved WiFi profiles into tmp.txt
netsh wlan show profiles | findstr All > tmp.txt
REM For each line, take the 2nd colon-delimited token, which is the profile name
for /f "tokens=2 delims=:" %%a in (tmp.txt) do (
  echo %%a
  REM Show the profile again with key=clear to expose the saved WiFi key
  netsh wlan show profiles name=%%a key=clear
)
del tmp.txt

This batch script lists the saved WiFi profiles and saves the profile names into the text file tmp.txt.

From tmp.txt, it prints out the 2nd token of each line (split on the colon), which returns the WiFi profile name, then uses the same netsh command with key=clear to expose the WiFi key in clear text.
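
For reference, in the output of the second netsh call the password appears under the Security settings section as "Key Content" (output trimmed; the values below are illustrative):

    Security settings
    -----------------
        Authentication         : WPA2-Personal
        Cipher                 : CCMP
        Key Content            : mysecretpassword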

How to upgrade an old laptop that is lagging

First, assuming compatible hardware is available, get compatible SODIMM RAM and identify the type of hard disk your laptop can use.

The general idea: check the laptop user manual to identify the maximum total RAM supported, the maximum RAM supported per slot, and, importantly, whether the motherboard chipset of your laptop supports SODIMMs of different capacities. As for clock speed, as long as it is higher than the laptop spec, the motherboard chipset will automatically tune the RAM clock speed down to the maximum speed the motherboard supports.
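
Before buying, it also helps to check what is currently installed. A quick check from PowerShell (a small sketch using the standard Win32_PhysicalMemory WMI class):

Get-CimInstance Win32_PhysicalMemory |
  Select-Object DeviceLocator, Manufacturer, @{n='CapacityGB';e={$_.Capacity/1GB}}, Speed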

For context, the old laptop to upgrade is an Acer Aspire E5-475, which comes with 4GB RAM, a 1TB HDD, and an Intel i5-7200U CPU at 2.5GHz.

Generally, a laptop should be serviceable; however, opening the service area will void the warranty of the Acer Aspire E5-475.

Measuring how much time vagrant up takes to complete orchestration

After a little effort searching the net, the solution turned up: PowerShell offers a simple way to measure the time spent executing a command. In a Linux environment, the time command can be used, but this post is limited to the Windows environment.

Based on the documentation of PowerShell's Measure-Command, the command would be as follows:

Measure-Command {vagrant up}

However, when running the above PowerShell command, the console output of vagrant up is not shown on the screen.

Hence, more searching on the internet (Source: time – Timing a command’s execution in PowerShell – Stack Overflow) leads to the complete command below:

Measure-Command {vagrant up|Out-Default}
Console output is shown when the command being timed is piped into Out-Default.
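
To keep the timing around for later inspection, the result of Measure-Command (a TimeSpan object) can be captured in a variable, for example:

$elapsed = Measure-Command { vagrant up | Out-Default }
"vagrant up took {0:N1} minutes" -f $elapsed.TotalMinutes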

Based on the previous post on the Vagrant orchestration of GitLab, the orchestration takes about 50 minutes on my computer with my timedotcom broadband connection.

Total time for the command to finish: 50-plus minutes.

Conclusion: Measure-Command is an easy way, but it lacks details such as CPU time, idle time, and time spent on network operations. It will not be suitable if more detail is needed.

How to use Hashicorp Vagrant to quickstart GitLab docker compose sample

The code of the project is available at chowkarmeng/vagrant_gitlab (github.com); the Docker Compose file is based on the sample provided in GitLab Docker images | GitLab.

The improvement made was to change the VirtualBox folder sync into Docker volumes.

First, git clone the repository https://github.com/chowkarmeng/vagrant_gitlab.git

Fire up the quickstart by running “vagrant up” in the localdev folder.
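
Putting the two steps together, the quickstart from a command prompt looks roughly like this (assuming the Vagrantfile sits in the cloned folder; adjust the cd if it lives in a localdev subfolder):

git clone https://github.com/chowkarmeng/vagrant_gitlab.git
cd vagrant_gitlab
vagrant up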

The process will take hours depending on the speed of your computer and speed of your internet connection.


To prevent system out of memory when calculating Azure Blob Storage usage using CalculateBlobCost.ps1

A month ago, I was working on migrating a system from Azure to another cloud platform. At the time of this post, the Azure web management portal has no UI to identify the amount of space used by your Blob Storage. One can only find out how much one owes Azure from the Billing portal.

Fortunately, the Microsoft Azure team has come out with a simple PowerShell script that provides better insight into how many blob objects there are and how much space is used by your Azure Blob Storage. The solution was provided at Get Billable size for Azure Blob at MSDN.

The script provided works fine, until it stops due to a System.OutOfMemoryException:

PS C:\dev\powershell> .\CalculateBlobCost.ps1 -storageaccountname "myProdatAmerNorth"
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Automation\Microsoft.Azure.Commands.Automation.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\TrafficManager\Microsoft.WindowsAzure.Commands.TrafficManager.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Profile.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Compute\Microsoft.WindowsAzure.Commands.ServiceManagement.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Storage\Microsoft.WindowsAzure.Commands.Storage.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\ManagedCache\Microsoft.Azure.Commands.ManagedCache.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\HDInsight\Microsoft.WindowsAzure.Commands.HDInsight.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Networking\Microsoft.WindowsAzure.Commands.ServiceManagement.Network.dl
l'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\StorSimple\Microsoft.WindowsAzure.Commands.StorSimple.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RemoteApp\Microsoft.WindowsAzure.Commands.RemoteApp.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RecoveryServices\Microsoft.Azure.Commands.RecoveryServices.dll'.
VERBOSE: 11:35:12 AM - Begin Operation: Get-AzureStorageAccount
VERBOSE: 11:35:14 AM - Completed Operation: Get-AzureStorageAccount
WARNING: GeoReplicationEnabled property will be deprecated in a future release of Azure PowerShell. The value will be
merged into the AccountType property.
VERBOSE: 11:35:14 AM - Begin Operation: Get-AzureStorageKey
VERBOSE: 11:35:16 AM - Completed Operation: Get-AzureStorageKey
VERBOSE: Container 'mySettings' with 5 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:124 char:13
+ $containerSizeInBytes += Get-BlobBytes $_
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

ForEach-Object : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:123 char:9
+ ForEach-Object {
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [ForEach-Object], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ForEachObjectCommand

VERBOSE: Container 'whoamI-version-2012' with 79895 blobs has a size of 33.56MB.
VERBOSE: Container 'whoamI-version-2' with 0 blobs has a size of 0.00MB.
VERBOSE: Container 'whoamI-version-3' with 0 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

VERBOSE: Container 'whoamI-version-2' with 10070 blobs has a size of 7.29MB.
VERBOSE: Container 'whoamI-version-3' with 423186 blobs has a size of 1337.75MB.
VERBOSE: Container 'whoamI-version-4' with 95 blobs has a size of 95.96MB.

A few hours later, I identified that one of my Azure Blob containers, “whoamI-version-3”, which holds more than 1 million objects with a size exceeding 1GB, was the culprit that caused the script to fail. The cmdlet that failed was Get-AzureStorageBlob.

From MSDN, Get-AzureStorageBlob has the parameter -MaxCount, which makes it possible to enumerate the objects in the Azure Blob container in batches, similar to SQL paging.

By using the -MaxCount parameter and adding logic to handle the continuation token, the improved script is now able to run without any fear of an OutOfMemoryException.
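
As a rough sketch of the paging pattern (not the full CalculateBlobCost.ps1; $containerName and $ctx stand in for the script's own variables):

$token = $null
do {
    # Fetch at most 5000 blobs per round trip instead of the whole container
    $blobs = Get-AzureStorageBlob -Container $containerName -Context $ctx -MaxCount 5000 -ContinuationToken $token
    # Accumulate the per-batch sizes here, then let the batch go out of scope
    $token = $null
    if ($blobs.Count -gt 0) { $token = $blobs[$blobs.Count - 1].ContinuationToken }
} while ($null -ne $token)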

To use the improved PowerShell script immediately, kindly download the file attached to this post: CalculateBlobCost PowerShell script.

Adding Amazon Web Services EC2 instance IPs and names into PuTTY sessions automagically

Introduction :

The motivation for creating this script is the non-persistent and ever-changing state of Amazon Web Services (AWS), which causes my infrastructure to change frequently. It would be labor intensive to manually create, update, and remove sessions in PuTTY to reflect the changes in AWS.

The pre-requisite :

The idea :

Use a batch script to wrap and call the PowerShell script. The PowerShell script will then call the installed EC2 API Tools provided by Amazon.

 

PowerShell is also used to parse the text returned by the EC2 API Tools. All of the complexity sits in the PowerShell script, which will generate a new registry file for Windows.

 

Finally, the batch script calls the registry editor, which imports the generated EC2 instance values into the PuTTY session repository.

 

In implementing the script, I have created 4 files: the batch script generate_putty_session.bat, the PowerShell script generate_putty_session.ps1, the registry file header reg_header.txt, and lastly reg_putty.txt, which contains the default PuTTY configuration in Windows registry format.

 

Code of generate_putty_session.bat:


@echo off
powershell -version 2.0 -ExecutionPolicy unrestricted %~dp0generate_putty_sessions.ps1
regedit.exe /s %userprofile%\putty_list.reg

 

Code of generate_putty_session.ps1:


#Preloading scripts
#Removing old reg file
if ( Test-Path $env:userprofile\putty_list.reg){
  del $env:userprofile\putty_list.reg
}

#Check environment for Windows x86 or x86_64
if ([IntPtr]::Size -eq 4){
  if ( Test-Path "C:\Program Files\AWS Tools\PowerShell\AWSPowerShell"){
    import-module "C:\Program Files\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
  }
  else{
    write-host "AWS Tools for PowerShell was not install, exiting. Download at http://aws.amazon.com/powershell/"
    exit
  }
}
else{
  if ( Test-Path "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell"){
    import-module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
  }
  else{
    write-host "AWS Tools for PowerShell was not install, exiting. Download at http://aws.amazon.com/powershell/"
    exit
  }
}

#Check env variable for required EC2 configuration
if (-not (Test-Path Env:\EC2_HOME)){
  write-host "Environment variable EC2_HOME was not found, please ensure your EC2 API Tools were properly installed or configured or setup."
  exit
}

if (-not (Test-Path Env:\EC2_CERT)){
  write-host "Environment variable EC2_CERT was not found, please ensure your EC2 API Tools were properly installed or configured or setup."
  exit
}

if (-not (Test-Path Env:\EC2_PRIVATE_KEY)){
  write-host "Environment variable EC2_PRIVATE_KEY was not found, please ensure your EC2 API Tools were properly installed or configured or setup."
  exit
}

#Get my script path
$myPath = split-path -parent $MyInvocation.MyCommand.Definition

if(-not (Test-Path -path $myPath\reg_header.txt)){
  write-host "Please make sure reg_header.txt is in " $myPath
  exit
}

if(-not (Test-Path -path $myPath\reg_putty.txt)){
  write-host "Please make sure reg_header.txt is in " $myPath
  exit
}

Copy-Item $myPath\reg_header.txt $env:userprofile
Copy-Item $myPath\reg_putty.txt $env:userprofile

#Main body and function of the script.
#Creating file to link instance ID with Public DNS
ec2-describe-instances --filter `"virtualization-type=paravirtual`" --filter `"instance-state-name=running`" --filter `"tag:Name=/*/*`" | Select-String -pattern INSTANCE -caseSensitive | foreach { "$($_.ToString().split()[1,3])" >> $env:userprofile\awsinstanceIP.tmp}

#Creating a file to link instance ID with Name tag
ec2-describe-instances --filter `"virtualization-type=paravirtual`" --filter `"instance-state-name=running`" --filter `"tag:Name=/*/*`" | Select-String -pattern Name -caseSensitive | foreach { "$($_.ToString().split()[2,4])" >> $env:userprofile\awsinstanceName.tmp}

# Clean up results, removing RenderWorkerGroup
Get-Content $env:userprofile\awsinstanceName.tmp | Select-String -pattern RenderWorkerGroup -NotMatch | foreach { "$($_.ToString().split()[0,1])" >> $env:userprofile\awsinstanceNameClean.tmp}

#$awsInstanceIDIP = Get-Content $env:userprofile\awsinstanceIP.tmp
$awsInstanceCleanName =  Get-Content $env:userprofile\awsinstanceNameClean.tmp
$count = 0

# Create HashTable from File.
ForEach ($line in $awsInstanceCleanName) {
  if ($count -le 0 ) {
    $myHash = @{ $line.ToString().Split()[0] = $line.ToString().Split()[1]}
  }
  else{
    $myHash.Set_Item($line.ToString().Split()[0], $line.ToString().Split()[1])
  }
  $count = $count + 1
}
$count = 0

Get-Content $env:userprofile\awsinstanceIP.tmp | ForEach-Object {

  $line = $_
  $myHash.GetEnumerator() | ForEach-Object {
    if ($line -match $_.Key)
    {
      if ($_.value.ToString().Contains("render-worker")){
        $replacement = $_.Key.ToString() + " " + $_.Value.ToString() + "/" + $_.Key.ToString()
      }
      else{
        $replacement = $_.Key.ToString() + " " + $_.Value.ToString()
      }
      $line = $line -replace $_.Key, $replacement
    }
  }
  $line
} | Set-Content -Path $env:userprofile\awsinstanceResult.tmp

del $env:userprofile\awsinstanceIP.tmp
del $env:userprofile\awsinstanceName.tmp
del $env:userprofile\awsinstanceNameClean.tmp

$awsinstanceResult = Get-Content $env:userprofile\awsinstanceResult.tmp

#Adding header into file content.
Add-Content $env:userprofile\awsinstanceReg_List.tmp $(Get-Content $env:userprofile\reg_header.txt)
Add-Content $env:userprofile\awsinstanceReg_List.tmp "`r"

# Populating body of the file before converting into registry file.
foreach ($line in $awsinstanceResult){
  $reg_line = "`[HKEY_CURRENT_USER\Software\Simontatham\PuTTY\Sessions\" + $line.ToString().Split()[1] + "]"
  Add-Content $env:userprofile\awsinstanceReg_List.tmp $reg_line
  $reg_line = "`"HostName`"=`"" + $line.ToString().Split()[2] + "`""
  Add-Content $env:userprofile\awsinstanceReg_List.tmp $reg_line
  # Add fillers to the sessions
  Add-Content $env:userprofile\awsinstanceReg_List.tmp $(Get-Content $env:userprofile\reg_putty.txt)
  Add-Content $env:userprofile\awsinstanceReg_List.tmp "`r"
}

Get-Content $env:userprofile\awsinstanceReg_List.tmp | Add-Content $env:userprofile\putty_list.reg

#Removing all temporary files
del $env:userprofile\awsinstanceReg_List.tmp
del $env:userprofile\awsinstanceResult.tmp
del $env:userprofile\reg_header.txt 
del $env:userprofile\reg_putty.txt

Before running the script, pay special attention to the PowerShell code of generate_putty_sessions.ps1 at lines 61, 64, 67, and 91. Make the needed changes to match the format of your AWS “Name” tag.

 

The filters at lines 61 and 64 create 2 different files using the same EC2 API Tools command. They work under the assumption that you have named your AWS instances using a format such as /[product_name]/[environment]/[sub-system]/[server-number].

 

Line 67 uses a similar pattern to keep unwanted servers in my environment from being added to the PuTTY sessions. In my code, I am removing output that contains RenderWorkerGroup.

 

Line 91 just enforces the name format for instances generated by Auto Scaling: /[product_name]/[environment]/[sub-system]/[aws-instance-id].


Using the script :

Place all the files into a single folder on your Windows machine.

 

Run generate_putty_sessions.bat in an Administrator command prompt. In less than 2 minutes, your PuTTY should contain all the sessions imported from Amazon Web Services EC2.


Code Download :

putty_session_generator

Cloudkick custom plugins for Windows

Cloudkick, which is part of RackSpace, provides agent-based monitoring for external servers and cloud servers. Other than monitoring computer resources such as CPU, memory, and HDD utilization, it can cater for custom checks, known as custom plugins. In other words, Cloudkick offers administrators limitless ways to monitor their IT assets at the host level, limited only by the operating system's shell scripting support and the creativity of the scripter.


Migrating MSDE 2000 db to MSSQL 2005 Express

Scenario :
An ASP web application in which the backend DBMS is running on MSDE 2000 SQL Server.

Objective :
Make the ASP web application use MSSQL 2005 Express on the server. On top of that, the database created on the MSDE 2000 instance needs to be migrated over to MSSQL 2005.

Scope :
1. MSDE 2000 SQL Server instance name is localhost\HELM, MSSQL db name is HelmDb.
2. MSDE 2000 SQL Server instance is running on a non-default MSSQL port (not 1433).
3. MSSQL Server 2005 Express is used.
4. Using the default MSSQL Server 2005 Express instance localhost\SQLEXPRESS.
5. OS used Windows Server 2003.

Workflow :